From 3bf2da862d86fd0e755df731205d722a19e32dfd Mon Sep 17 00:00:00 2001
From: Harheem Kim <49297157+harheem@users.noreply.github.com>
Date: Wed, 22 Jan 2025 00:29:35 +0900
Subject: [PATCH 1/6] [E-4] 17-LangGraph / 03-Use-Cases /
08-Hierarchical-Multi-Agent-Teams
---
.../08-Hierarchical-Multi-Agent-Teams.ipynb | 2354 +++++++++++++++++
1 file changed, 2354 insertions(+)
create mode 100644 17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
diff --git a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
new file mode 100644
index 000000000..0688920a7
--- /dev/null
+++ b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
@@ -0,0 +1,2354 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "px55EK-o1vpS"
+ },
+ "source": [
+ "# Hierarchical Multi-Agent Teams\n",
+ "\n",
+ "- Author: [Harheem Kim](https://github.com/harheem)\n",
+ "- Design:\n",
+ "- Peer Review:\n",
+ "- This is a part of [LangChain Open Tutorial](https://github.com/LangChain-OpenTutorial/LangChain-OpenTutorial)\n",
+ "\n",
+ "[](https://colab.research.google.com/github/LangChain-OpenTutorial/LangChain-OpenTutorial/blob/main/17-LangGraph/08-Hierarchical-Multi-Agent-Teams.ipynb) [](https://github.com/LangChain-OpenTutorial/LangChain-OpenTutorial/blob/main/17-LangGraph/08-Hierarchical-Multi-Agent-Teams.ipynb)\n",
+ "\n",
+ "## Overview\n",
+ "In this tutorial, we'll explore how to build a **Hierarchical Agent Team**.\n",
+ "\n",
+ "We'll implement a hierarchical structure to break down complex tasks that are difficult to handle with a single agent or single-level supervisor. In this structure, each lower-level supervisor manages worker agents specialized in their respective domains.\n",
+ "\n",
+ "This hierarchical approach helps efficiently solve complex tasks that would be overwhelming for a single worker or when there are too many workers to manage directly.\n",
+ "\n",
+ "This example implements ideas from the [AutoGen paper](https://arxiv.org/abs/2308.08155) using `LangGraph`, demonstrating how to organize two distinct teams for web research and document writing, managed through top and mid-level supervisors to oversee the entire process.\n",
+ "\n",
+ "### Why Choose a **Hierarchical Agent Team**?\n",
+ "\n",
+ "In our previous Supervisor example, we looked at how a single supervisor node assigns tasks to multiple worker nodes and consolidates their results. While this approach works well for simple cases, a hierarchical structure might be necessary in the following situations:\n",
+ "\n",
+ "- **Increased Task Complexity**: A single supervisor may not be able to handle specialized knowledge required across various sub-domains simultaneously.\n",
+ "- **Growing Number of Workers**: When managing many workers, having a single supervisor directly command all workers can become overwhelming.\n",
+ "\n",
+ "In such scenarios, we can create a hierarchical structure where higher-level supervisors delegate tasks to lower-level sub-supervisors, and each sub-supervisor then redistributes these tasks to their specialized worker teams.\n",
+ "\n",
+ "### Table of Contents\n",
+ "\n",
+ "- [Overview](#overview)\n",
+ "- [Environment Setup](#environment-setup)\n",
+ "- [Building Tools](#building-tools)\n",
+ "- [Implementing Utility Functions for Multiple Agents](#implementing-utility-functions-for-multiple-agents)\n",
+ "- [Defining Agent Teams](#defining-agent-teams)\n",
+ "- [Structuring a Super-Graph](#structuring-a-super-graph)\n",
+ "\n",
+ "### References\n",
+ "- [AutoGen paper: Enabling Next-Gen LLM Applications via Multi-Agent Conversation (Wu et al.)](https://arxiv.org/abs/2308.08155)\n",
+ "- [LangGraph - Multi-agent Systems](https://langchain-ai.github.io/langgraph/concepts/multi_agent/)\n",
+ "\n",
+ "----"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "_lDPNYcj3B3J"
+ },
+ "source": [
+ "## Environment Setup\n",
+ "\n",
+ "Set up the environment. You may refer to [Environment Setup](https://wikidocs.net/257836) for more details.\n",
+ "\n",
+ "**[Note]**\n",
+ "- `langchain-opentutorial` is a package that provides a set of easy-to-use environment setup, useful functions and utilities for tutorials.\n",
+ "- You can checkout the [`langchain-opentutorial`](https://github.com/LangChain-OpenTutorial/langchain-opentutorial-pypi) for more details."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {
+ "id": "dFiZS1vM3GB_"
+ },
+ "outputs": [],
+ "source": [
+ "%%capture --no-stderr\n",
+ "%pip install langchain-opentutorial"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {
+ "id": "X5qTujC_3Hbs"
+ },
+ "outputs": [],
+ "source": [
+ "# Install required packages\n",
+ "from langchain_opentutorial import package\n",
+ "\n",
+ "package.install(\n",
+ " [\n",
+ " \"langchain_community\",\n",
+ " \"langchain_openai\",\n",
+ " \"langchain_core\",\n",
+ " \"langchain_experimental\",\n",
+ " \"langgraph\",\n",
+ " ],\n",
+ " verbose=False,\n",
+ " upgrade=False,\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "3l40HsEA5FDq"
+ },
+ "source": [
+ "`LangChain` provides built-in tools that make it easy to use the `Tavily` search engine as a tool in your applications.\n",
+ "\n",
+ "To use `Tavily Search`, you'll need to obtain an API key.\n",
+ "\n",
+ "Click [here](https://app.tavily.com/sign-in) to sign up on the `Tavily` website and get your `Tavily Search` API key."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {
+ "id": "cgmm0kl61s0l"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Environment variables have been set successfully.\n"
+ ]
+ }
+ ],
+ "source": [
+ "# Set environment variables\n",
+ "from langchain_opentutorial import set_env\n",
+ "\n",
+ "set_env(\n",
+ " {\n",
+ " \"OPENAI_API_KEY\": \"\",\n",
+ " \"LANGCHAIN_API_KEY\": \"\",\n",
+ " \"TAVILY_API_KEY\": \"\",\n",
+ " \"LANGCHAIN_TRACING_V2\": \"true\",\n",
+ " \"LANGCHAIN_ENDPOINT\": \"https://api.smith.langchain.com\",\n",
+ " \"LANGCHAIN_PROJECT\": \"Hierarchical-Multi-Agent-Teams\",\n",
+ " }\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "J9gU0Tur5Q1b"
+ },
+ "source": [
+ "You can alternatively set API keys in a `.env` file and load it.\n",
+ "\n",
+ "[Note] This is not necessary if you've already set API keys in previous steps."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "ML63b6Tu5QUO"
+ },
+ "outputs": [],
+ "source": [
+ "from dotenv import load_dotenv\n",
+ "\n",
+ "# Load API key information\n",
+ "load_dotenv(override=True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "blKsR-yc5VET"
+ },
+ "source": [
+ "## Building Tools\n",
+ "\n",
+ "Each team consists of one or more agents, and each agent is equipped with one or more tools. Below, we'll define all the tools that will be used by various teams.\n",
+ "Let's first look at the research team.\n",
+ "\n",
+ "### ResearchTeam Tools\n",
+ "\n",
+ "The ResearchTeam can use search engines and URL scrapers to find information on the web. You can freely add additional features below to enhance the ResearchTeam's performance."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {
+ "id": "4vT-Mlb45XxH"
+ },
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "USER_AGENT environment variable not set, consider setting it to identify your requests.\n"
+ ]
+ }
+ ],
+ "source": [
+ "from typing import List\n",
+ "from langchain_community.document_loaders import WebBaseLoader\n",
+ "from langchain_community.tools.tavily_search import TavilySearchResults\n",
+ "from langchain_core.tools import tool\n",
+ "\n",
+ "# Define search tool (TavilySearch)\n",
+ "# Create a search tool instance that returns up to 6 results\n",
+ "tavily_tool = TavilySearchResults(k=6)\n",
+ "\n",
+ "\n",
+ "# Define tool for scraping detailed information from web pages\n",
+ "@tool\n",
+ "def scrape_webpages(urls: List[str]) -> str:\n",
+ " \"\"\"Use requests and bs4 to scrape the provided web pages for detailed information.\"\"\"\n",
+ " # Load web pages using the provided URL list\n",
+ " loader = WebBaseLoader(\n",
+ " web_path=urls,\n",
+ " header_template={\n",
+ " \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.0.0 Safari/537.36\",\n",
+ " },\n",
+ " )\n",
+ " docs = loader.load()\n",
+ "\n",
+ " # Create a string containing titles and content of loaded documents\n",
+ " return \"\\n\\n\".join(\n",
+ " [\n",
+ " f'\\n{doc.page_content}\\n'\n",
+ " for doc in docs\n",
+ " ]\n",
+ " )"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "KGva1WZ95fn9"
+ },
+ "source": [
+ "### Document Writing Team Tools\n",
+ "\n",
+ "Next, we'll define the tools (file access tools) that the document writing team will use.\n",
+ "These tools allow agents to access the file system, which may not be secure. Therefore, caution is needed when using them."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {
+ "id": "vzJ8v99U5gb3"
+ },
+ "outputs": [],
+ "source": [
+ "from pathlib import Path\n",
+ "from typing import Dict, Optional, List\n",
+ "from typing_extensions import Annotated\n",
+ "\n",
+ "# Create temporary directory and set working directory\n",
+ "WORKING_DIRECTORY = Path(\"./tmp\")\n",
+ "\n",
+ "# Create tmp folder if it doesn't exist\n",
+ "WORKING_DIRECTORY.mkdir(exist_ok=True)\n",
+ "\n",
+ "\n",
+ "# Create and save outline\n",
+ "@tool\n",
+ "def create_outline(\n",
+ " points: Annotated[List[str], \"List of main points or sections.\"],\n",
+ " file_name: Annotated[str, \"File path to save the outline.\"],\n",
+ ") -> Annotated[str, \"Path of the saved outline file.\"]:\n",
+ " \"\"\"Create and save an outline.\"\"\"\n",
+ " with (WORKING_DIRECTORY / file_name).open(\"w\") as file:\n",
+ " for i, point in enumerate(points):\n",
+ " file.write(f\"{i + 1}. {point}\\n\")\n",
+ " return f\"Outline saved to {file_name}\"\n",
+ "\n",
+ "\n",
+ "# Read document\n",
+ "@tool\n",
+ "def read_document(\n",
+ " file_name: Annotated[str, \"File path to read the document.\"],\n",
+ " start: Annotated[Optional[int], \"The start line. Default is 0\"] = None,\n",
+ " end: Annotated[Optional[int], \"The end line. Default is None\"] = None,\n",
+ ") -> str:\n",
+ " \"\"\"Read the specified document.\"\"\"\n",
+ " with (WORKING_DIRECTORY / file_name).open(\"r\") as file:\n",
+ " lines = file.readlines()\n",
+ " if start is not None:\n",
+ " start = 0\n",
+ " return \"\\n\".join(lines[start:end])\n",
+ "\n",
+ "\n",
+ "# Write and save document\n",
+ "@tool\n",
+ "def write_document(\n",
+ " content: Annotated[str, \"Text content to be written into the document.\"],\n",
+ " file_name: Annotated[str, \"File path to save the document.\"],\n",
+ ") -> Annotated[str, \"Path of the saved document file.\"]:\n",
+ " \"\"\"Create and save a text document.\"\"\"\n",
+ " with (WORKING_DIRECTORY / file_name).open(\"w\") as file:\n",
+ " file.write(content)\n",
+ " return f\"Document saved to {file_name}\"\n",
+ "\n",
+ "\n",
+ "# Edit document\n",
+ "@tool\n",
+ "def edit_document(\n",
+ " file_name: Annotated[str, \"File path of the document to be edited.\"],\n",
+ " inserts: Annotated[\n",
+ " Dict[int, str],\n",
+ " \"Dictionary where key is the line number (1-indexed) and value is the text to be inserted at that line.\",\n",
+ " ],\n",
+ ") -> Annotated[str, \"File path of the edited document.\"]:\n",
+ " \"\"\"Edit a document by inserting text at specific line numbers.\"\"\"\n",
+ " with (WORKING_DIRECTORY / file_name).open(\"r\") as file:\n",
+ " lines = file.readlines()\n",
+ "\n",
+ " # Process insertions in order\n",
+ " sorted_inserts = sorted(inserts.items())\n",
+ "\n",
+ " # Insert text at specified line numbers\n",
+ " for line_number, text in sorted_inserts:\n",
+ " if 1 <= line_number <= len(lines) + 1:\n",
+ " lines.insert(line_number - 1, text + \"\\n\")\n",
+ " else:\n",
+ " return f\"Error: Line number {line_number} is out of range.\"\n",
+ "\n",
+ " # Save edited document to file\n",
+ " with (WORKING_DIRECTORY / file_name).open(\"w\") as file:\n",
+ " file.writelines(lines)\n",
+ "\n",
+ " return f\"Document edited and saved to {file_name}\""
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "PFEdTyhh5ia9"
+ },
+ "source": [
+ "Finally, let's define the code execution tool, `PythonREPLTool`:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "metadata": {
+ "id": "LJ15Z5Lh5ktg"
+ },
+ "outputs": [],
+ "source": [
+ "from langchain_experimental.tools import PythonREPLTool\n",
+ "\n",
+ "# PythonREPL tool\n",
+ "python_repl_tool = PythonREPLTool()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "7Ke9MtiHAJHE"
+ },
+ "source": [
+ "## Implementing Utility Functions for Multiple Agents\n",
+ "Here's how we create utility functions to streamline our tasks.\n",
+ "\n",
+ "We'll use the `functools.partial` function from our previous tutorial to create agent nodes, specifically for:\n",
+ "\n",
+ "1. Creating `worker agents`\n",
+ "2. Creating `supervisors` for `sub-graphs`"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 7,
+ "metadata": {
+ "id": "m-YR2KhBAR8Y"
+ },
+ "outputs": [],
+ "source": [
+ "from langgraph.graph import START, END\n",
+ "from langchain_core.messages import HumanMessage\n",
+ "from langchain_openai.chat_models import ChatOpenAI\n",
+ "\n",
+ "\n",
+ "# Agent Factory Class\n",
+ "class AgentFactory:\n",
+ " def __init__(self, model_name):\n",
+ " self.llm = ChatOpenAI(model=model_name, temperature=0)\n",
+ "\n",
+ " def create_agent_node(self, agent, name: str):\n",
+ " # Node creation function\n",
+ " def agent_node(state):\n",
+ " result = agent.invoke(state)\n",
+ " return {\n",
+ " \"messages\": [\n",
+ " HumanMessage(content=result[\"messages\"][-1].content, name=name)\n",
+ " ]\n",
+ " }\n",
+ "\n",
+ " return agent_node\n",
+ "\n",
+ "\n",
+ "# Initialize LLM\n",
+ "MODEL_NAME = \"gpt-4o\"\n",
+ "llm = ChatOpenAI(model=MODEL_NAME, temperature=0)\n",
+ "\n",
+ "# Create Agent Factory instance\n",
+ "agent_factory = AgentFactory(MODEL_NAME)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "jKkQjxIWAhH7"
+ },
+ "source": [
+ "Here's an example of creating an agent node using the `AgentFactory`. Let's look at how to create a search agent:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 8,
+ "metadata": {
+ "id": "f1UeOGUMAjPz"
+ },
+ "outputs": [],
+ "source": [
+ "from langgraph.prebuilt import create_react_agent\n",
+ "\n",
+ "# Define agent\n",
+ "search_agent = create_react_agent(llm, tools=[tavily_tool])\n",
+ "# Create agent node\n",
+ "search_node = agent_factory.create_agent_node(search_agent, name=\"Searcher\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "R5euVn9JAlBU"
+ },
+ "source": [
+ "Next is the function for creating a Team Supervisor:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 9,
+ "metadata": {
+ "id": "_XlLBl1HAmSQ"
+ },
+ "outputs": [],
+ "source": [
+ "from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder\n",
+ "from langchain_openai import ChatOpenAI\n",
+ "from pydantic import BaseModel\n",
+ "from typing import Literal\n",
+ "\n",
+ "\n",
+ "def create_team_supervisor(model_name, system_prompt, members) -> str:\n",
+ " # Define list of options for next worker\n",
+ " options_for_next = [\"FINISH\"] + members\n",
+ "\n",
+ " # Define response model for worker selection\n",
+ " class RouteResponse(BaseModel):\n",
+ " next: Literal[*options_for_next]\n",
+ "\n",
+ " # Create ChatPromptTemplate\n",
+ " prompt = ChatPromptTemplate.from_messages(\n",
+ " [\n",
+ " (\"system\", system_prompt),\n",
+ " MessagesPlaceholder(variable_name=\"messages\"),\n",
+ " (\n",
+ " \"system\",\n",
+ " \"Given the conversation above, who should act next? \"\n",
+ " \"Or should we FINISH? Select one of: {options}\",\n",
+ " ),\n",
+ " ]\n",
+ " ).partial(options=str(options_for_next))\n",
+ "\n",
+ " # Initialize LLM\n",
+ " llm = ChatOpenAI(model=model_name, temperature=0)\n",
+ "\n",
+ " # Combine prompt and LLM to create chain\n",
+ " supervisor_chain = prompt | llm.with_structured_output(RouteResponse)\n",
+ "\n",
+ " return supervisor_chain"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "0IxOUbt1ApgQ"
+ },
+ "source": [
+ "## Defining Agent Teams\n",
+ "Let's define the Research Team and Doc Writing Team."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "Pp0WVu8DQ25b"
+ },
+ "source": [
+ "### Research Team\n",
+ "\n",
+ "The research team has two worker nodes: a `search agent` and a `research_agent` responsible for `web scraping`. Let's create these and set up their team supervisor:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 10,
+ "metadata": {
+ "id": "sUHIgF8ZAt4n"
+ },
+ "outputs": [],
+ "source": [
+ "import operator\n",
+ "from typing import List, TypedDict\n",
+ "from typing_extensions import Annotated\n",
+ "\n",
+ "from langchain_core.messages import BaseMessage, HumanMessage\n",
+ "from langchain_openai.chat_models import ChatOpenAI\n",
+ "from langgraph.prebuilt import create_react_agent\n",
+ "\n",
+ "\n",
+ "# Define state\n",
+ "class ResearchState(TypedDict):\n",
+ " messages: Annotated[List[BaseMessage], operator.add] # Messages\n",
+ " team_members: List[str] # List of member agents\n",
+ " next: str # Instructions for Supervisor agent to select next worker\n",
+ "\n",
+ "\n",
+ "# Initialize LLM\n",
+ "llm = ChatOpenAI(model=MODEL_NAME, temperature=0)\n",
+ "\n",
+ "# Create search node\n",
+ "search_agent = create_react_agent(llm, tools=[tavily_tool])\n",
+ "search_node = agent_factory.create_agent_node(search_agent, name=\"Searcher\")\n",
+ "\n",
+ "# Create web scraping node\n",
+ "web_scraping_agent = create_react_agent(llm, tools=[scrape_webpages])\n",
+ "web_scraping_node = agent_factory.create_agent_node(\n",
+ " web_scraping_agent, name=\"WebScraper\"\n",
+ ")\n",
+ "\n",
+ "# Create Supervisor agent\n",
+ "supervisor_agent = create_team_supervisor(\n",
+ " MODEL_NAME,\n",
+ " \"You are a supervisor tasked with managing a conversation between the\"\n",
+ " \" following workers: Search, WebScraper. Given the following user request,\"\n",
+ " \" respond with the worker to act next. Each worker will perform a\"\n",
+ " \" task and respond with their results and status. When finished,\"\n",
+ " \" respond with FINISH.\",\n",
+ " [\"Searcher\", \"WebScraper\"],\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "GdnxPsM1A4Jr"
+ },
+ "source": [
+ "Finally, let's define a function to select the next node for routing:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 11,
+ "metadata": {
+ "id": "e-JnX3jvA5Us"
+ },
+ "outputs": [],
+ "source": [
+ "def get_next_node(x):\n",
+ " return x[\"next\"]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "2pgcindlKv_c"
+ },
+ "source": [
+ "### Creating Research Team Graph\n",
+ "\n",
+ "Creates a workflow where a supervisor coordinates web search and scraping tasks."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from IPython.display import Image, display\n",
+ "from langgraph.graph import StateGraph\n",
+ "from langgraph.graph.state import CompiledStateGraph\n",
+ "from langgraph.checkpoint.memory import MemorySaver\n",
+ "from dataclasses import dataclass\n",
+ "\n",
+ "\n",
+ "# Define node styles\n",
+ "@dataclass\n",
+ "class NodeStyles:\n",
+ " default: str = (\n",
+ " \"fill:#45C4B0, fill-opacity:0.3, color:#23260F, stroke:#45C4B0, stroke-width:1px, font-weight:bold, line-height:1.2\"\n",
+ " )\n",
+ " first: str = (\n",
+ " \"fill:#45C4B0, fill-opacity:0.1, color:#23260F, stroke:#45C4B0, stroke-width:1px, font-weight:normal, font-style:italic, stroke-dasharray:2,2\"\n",
+ " )\n",
+ " last: str = (\n",
+ " \"fill:#45C4B0, fill-opacity:1, color:#000000, stroke:#45C4B0, stroke-width:1px, font-weight:normal, font-style:italic, stroke-dasharray:2,2\"\n",
+ " )\n",
+ "\n",
+ "\n",
+ "def visualize_graph(graph, xray=False):\n",
+ " try:\n",
+ " if isinstance(graph, CompiledStateGraph):\n",
+ " display(\n",
+ " Image(\n",
+ " graph.get_graph(xray=xray).draw_mermaid_png(\n",
+ " background_color=\"white\",\n",
+ " node_colors=NodeStyles(),\n",
+ " )\n",
+ " )\n",
+ " )\n",
+ " except Exception as e:\n",
+ " print(f\"[ERROR] Visualize Graph Error: {e}\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 13,
+ "metadata": {
+ "id": "qcwMe2OoKwc3"
+ },
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAtwAAAIJCAIAAAAZOH2iAAAAAXNSR0IArs4c6QAAIABJREFUeJzs3WdYFNffBuCzvbAsvYMgCFIFQQQbVuxdiMZuMHaNGpNoYokaa4wtxiQaey9oNNhjoqKiIFKk9yK9w/b6fpj8ia8BRGQ5W373lQ/LMjv7SHR49syZMySlUokAAAAAAHAj4w4AAAAAAICglAAAAABAXUApAQAAAIBagFICAAAAALUApQQAAAAAagFKCQAAAADUAhV3AABAu6mWiGKqy7h0eqlIyJNLJ1o76VGoV4tz1PzxJJsubDLlWXVpJxbHmsXB/VMEAGADpQQAzSaQS68U55SJBPM7e+YI6vOFDZ1IXL5cypdJqyQiEYWq/o8rxQJ9Cv1OWUGhsGFPt356FFo6r8ZN3xj3jxYA0NFIsHgaABqqUMizYrDjaityBPU9jMwNaQzcidrNtvQXBjT6OteeuIMAADoUlBIANNK14pwHlUVrXf1xB1GVfEGDk55BGq86wMgSdxYAQAeBia4AaB6JQqFESIsbCULInq0vUyo4FNrq5KcyhQJ3HABAR4CREgA0zA+ZcXPs3XTn3229VCJRyLvqG+EOAgBQORgpAUCTfJX0ZLy1o+40EoQQl0Y3ZbCuleSI5DLcWQAAqgUjJQBoDJlSWScV406Bh0gu250V/4NXX9xBAAAqBKUEAM0QW1shlst0+SyGWC6nkEgWTDbuIAAAVYHTNwBogNdC3tXiLF1uJAghBoXCl0uFcBIHAO0FpQQADSBTKOY5eOJOgZ9ALtuSFoM7BQBAVaCUAKABTBgsPSoNdwr8rJh67lzjbF4d7iAAAJWAUgKAujtTmH6lOLsj35HP58U+f/wheygrLUpLTWy/RP8aZmHvoKevij0DALCDUgKAuntUUeTDNemwt1MqlRNH9Pz7/s027yE97dX4Yf7Fr/PbNde/bpbmqWjPAAC8oJQAoO52ePa2ZXfc2MDrgrzammrPbn7v+0KFQkFczZealKBQKNqwh1ZKrq9Jrq9S0c4BABhBKQFA3ZFIqvp3mvzq5cI5EwYEOI0d6vfrgR0IoUd/3w4Z3QshtGHN4gAvy7MnfyG2LCzIW7Py0yF9XPt0t50woueBPd8Rz3/9+adTJw68cf1iyMjAvr52DfV1e3eu37ZxFUJozBDfAC9LVZzE8eQay2EtAwC0ERV3AABAS+6UFbyqq5zj4N7ue66sKFs6b0oXF9fV63ampyYqFQqEkJuHz+ChY549fbj7p1MIIYfOzgih3JyMeTPHWtvaf7VuB51O/3H35udPHixZsRYhlJudUV5e+vD+7fVbfqytreIaGI4PnfH4wV0zS+v5S75CCHVxbv/kvUysTOjMdt8tAAA7KCUAqDWRQsah0VWx5+RXsXxe/ey5y3v3GzR89CTiSTNzy9q6Gld3Lx/fAOIZhULx7eolxsZmvxy7ymKxEUIH9nzX1d0LISSTyQoLcpxc3LftPkyhUIjtbe06l5UWDx8T2riHdlcg4OXy63oYWaho/wAAXOD0DQBqbZyV42RbZ1Xs2dHJjUwm/7h7U0pS3JvPp6e8cnXr1vhl1JO/0lITP5m/gmgkfD6vMD+H2OB1Ya5UKg2ZPLuxkSCEsrNSJVKJq0c3pDKvhQ0JdZWq2z8AABcoJQCoNb5M2iCVqGLPdvad9/1yXioRz/l4xPbNXyoUCmLuCK+hztXDu3Gzxw/vUSiUQUPHEF9mpCUplUpipCQ3Ox0h5OHV/c3dpiUnIoS6unqpIjPBhqUHwyQAaCUoJQCotZSG6hMFaSraec9eQWd/fzhybOjViydjox8jhNJSiErx7+qxrwtyzS2tabR/lm6LefaITCY7d/VACOVkZVCpVDt7pzf3mZaaaGJqbmZuqaLMCCFHPQNvA1PV7R8AgAuUEgDUmpOegYruDCyRiBFCdBp92MhJCCGpRIoQyslMQQiZvlEpaFQ67X+TWgQC/h9XzxkYGjOZLGKkxNauc2NfIWRnppqZW6kicKM7ZQUFggaVvgUAAAuY6AqAWjOmM7d59q5v7zM49XW100MHjxwdYm1rf/bEL9a2nbz9eiKE9PS5CKEfd2/y9PK1trX37dGrm2/PJ5F/3rx+ycnF9eDeLeVlJUQjQQjl5mQ6dnF9a88cPW5Swt9nT/5Co9GDBg6zsLRp3+QIoZiasvFWndt9twAA7GCkBAB1Vy8V82XS9t1nbW11Z0fni+eO/bT3u85OXX88dElPTx8hNHrclG4+Pf64cnb/DxvraqsQQpOnzR019qMftq9dsWg6g8GeOnO+SCQszM+Vy+UFedmdnd6ehPvJghXmltY/7fnu5JEflYr2X02kViqeZOPEoMAHKgC0EEkJaxABoN7SG6rPFGZ81sUHdxC1QCGRDGkM3CkAACoBnzYAUHdd9Y3t9bjFQr41S6/JDfi8+jHBPZr8lm0n+9cFTdyDpv+gYRu2/NjeSZvw097vwi+c/O/zXC63vr6+yZes27x74JDRTX5LJJcdzU9d5+rf3jEBAGoBRkoA0Az1MolUoWjyWwqForT4ddMvI5NQU+dQWGy2kXFHXMBSV1vD5zUxKZVEavbgY2hswmY3Xb/OFqZ35RgOtbBv75gAALUApQQAzXCrNN+UwXTUM8AdBBuFUskgU7iqWd8WAKAOYKIrAJphhKX9qcI0XnvPeNUgGbxaFsxvBUCrwUgJABpDqVRWS8SIhDsHDr/lJo+0dPAyMMEdBACgQlBKANAwP+e8mmjtRCXr0DBniYhvx+IYwEU3AGg7HTquAaAdpnfquiHlOe4UHaRA0BBRmufJNYFGAoAugJESADSSXKmMq63gUmkmDBbuLKoikEmP5KV82dWXTaG1YnMAgMaDUgKApqoQC/dnJwwws/ExMMOdpT3lC+ojK4sn27pwqDQOFeoIADqE8u233+LOAABoCz0qbZCZLZVENqGzfstNiq2tMKIxDOmMLF5trqBBj0pjkilZ/LocTXgcV1vxoqbMmM4wZjDPv8501zfubmhGJ1Nw/4wBAB0K5pQAoNk6sfXpZPISJ+9BZrZGdKYhjVEhEb6oKRPJ5VwaPZNX+7iyqF0eX02LD//jevvuk3hsQKNXSoR6VLo1k8Oh0L5y8Rth6YD75woAwABO3wAAWuXixYu5ublfffUV7iAAAK0FIyUAAAAAUAtQSgAAAACgFqCUAABahcViGRoa4k4BANBmUEoAAK0iFApra2txpwAAaDMoJQCAVqFSqXQ63KEXAKBCUEoAAK0ik8kkEgnuFAAAbQalBADQKjQaTU9PD3cKAIA2g1ICAGgVqVTK5/NxpwAAaDMoJQCAVmGxWEZGRrhTAAC0GZQSAECrCIXCmpoa3CkAANoMSgkAAAAA1AKUEgBAq1CpVCaTiTsFAECbQSkBALSKTCYTiUS4UwAAtBmUEgBAq9BoNBaLhTsFAECbQSkBALSKVCoVCoW4UwAAtBmUEgAAAACoBSglAIBWYTKZXC4XdwoAgDaDUgIAaBWRSFRfX487BQBAm0EpAQAAAIBagFICAGgVWGYeAKBqUEoAAK0Cy8wDAFQNSgkAAAAA1AKUEgBAq7BYLGNjY9wpAADaDEoJAKBVhEJhdXU17hQAAG0GpQQAAAAAagFKCQCgVVgslqGhIe4UAABtBqUEANAqQqGwtrYWdwoAgDaDUgIAAAAAtQClBADQKlQqlU6n404BANBmUEoAAK0ik8kkEgnuFAAAbQalBADQKnCXYACAqkEpAQC0CtwlGACgalBKAAAAAKAWoJQAAFqFRqOxWCzcKQAA2gxKCQCgVaRSqVAoxJ0CAKDNoJQAAFqFxWIZGRnhTgEA0GZQSgAArSIUCmtqanCnAABoMyglAIBWgZESAICqQSkBALQKjJQAAFQNSgkAoFXodDqHw8GdAgCgzUhKpRJ3BgCA+ho/frxcLkcICQQCuVyur6+PEFIqlREREbijAQC0DRV3AACAWnN2dv7rr79IJBLxJY/HQwj5+vrizgUA0EJw+gYA0JKwsDATE5M3nzEwMPj444/xJQIAaC0oJQCAlri6unp7ezee51UqlQ4ODgMHDsSdCwCghaCUAADe4ZNPPrGwsCAeGxoaTp06FXciAIB2glICAHgHNzc3Pz8/pVKpVCrt7OwGDx6MOxEAQDtBKQEAvNuUKVMsLS05HA4MkwAAVAeuvgFA7RQKG0qEfBnuGP+PlYldcP+ysjJOj25Pq0txp/kXCSFzBsuOxaGTKbizAAA+FKxTAoAaeVRZFF6UXSUVdeUYVUtEuONoAD0KtUjEp5LIwyw6hdh0wR0HAPBBYKQEAHXxsKIovDh7sq0z5X+LgoDWu1GSK5DLZnZyxR0EANB2MKcEALUQU1N2sShzqp0LNJK2GWXVOYNXe7koC3cQAEDbQSkBQC1cKsoaY+mAO4VmG23pcLe8QKKQ4w4CAGgjKCUA4CdRyJPqqwzpTNxBNJ5MoSgQNOBOAQBoIyglAOBXKhI6sQ1xp9AGtmxOiUiAOwUAoI2glACAH4mEamVi3Cm0gUAmV8KcHAA0FpQSAAAAAKgFKCUAAAAAUAtQSgAAAACgFqCUAAAAAEAtQCkBAAAAgFqAUgIAAAAAtQClBAAAAABqAUoJAAAAANQClBIAAAAAqAUoJQAAAABQC1BKAAAAAKAWqLgDAADaQiaVRpz+LfLm1crSYj2ugad/7+nL13ANjXHnQgih2sqKDXM/cvbyWbJ5D+4sAABNAiMlAGiksz/uvHxoX1VZibG5Ja+uNvqv2xQKBXeof1SVl1aVlWS+iscdBACgYWCkBADNo1Ao/r5+CSG0+Vi4nZOLgFef+Oyxnr4B7lz/cHL3+mLPYVNLa9xBAAAaBkoJAJpHIZfLpRIyhWJoYoYQYnO4gUNGEt+6df74mX3bx82aH7pgBUJIyOd/OsSPa2xy8MaT2sqKJWP66RsadurinvHqJYvF8gzoO2Xx58ZmlsRrc9OTL/68OyPxJYlEdunWPXTBis5dPRr36Rc0RMCrz05JZDJZ/ceG/HHycNCoifPWbiVe++vmNZE3r077bLWjm+fmBdMRQp2cXbee/B0h9Gf4uVvnj1WVl5mYWwSNnjRu1nyEkFKpvHH26N+/X6gqKzEwMg0cOirk06U0OgMh9OkQfyG/YdychY9v/F5TVb715O+2js74ftgAgI4Dp28A0DxUGs25m69CLv9+5byyosL3em1Dba1MKg4YNJTOYj+988eWxbMkIiFCKDMpftP8aa+eP7F2cLK0c0h89njzgmn5mamNL4x99GdDTXXg4JEDxoYOC51JoVKj/74tFgoQQgJe/fO/bjHZ7P6jJ3IMjNx7BDa+Kinm6fFdG+uqK316BTHZnKqyYuL50/u2nT/wfX1NjauPv1QmuXH6twPrPn8z5x8nD3Xt3sOte4CZlc0H/8AAAJoBRkoA0Ejzvt6yc+WnOamvVn88akjI1Ilhi1l6+q15oamVzbpfziCEJGLRhrmTC7PSo/++23fEuOM7N0rFosWbfugVPAoh9NfvF47u2HDltwMrdvxEvNDM2nbT0Ut0Jov40n/A0Gd/3ox5cK/viHFPbkdIRKIhk6ayOVw2hztj+ddrpo8lNivMzkAI9Rw4nBhTEQkECKHy4sK7F0/RmczvTlyxsLFrqK1ZPX1M7KM/s5ITunh4Ey+ctXLd4IlTVPPDAwCoKRgpAUAjWdjZbzt9ffLCz6k0+q1zx9eHfSTg81rzQjL5n3/1dAYzYPAIhFB2cmJlaVF+ZiqFSs1NTTq7f8fZ/TsKstIQQtkpiY0v7N5nYGMjQQgFh0xFCEXevIYQenD9EkJoaMj0/76dV0BfCpX6+Pa147s2lhUVMtlshFByTJRSqfTp1d/Cxg4hpG9o5NtvEEIoPS6m8YUBQ0Z82E8IAKB5oJQAoKnoDOaYmZ/uOH/DrkvXkvzc34/89L570DcwQggJ+PW1VZUIIblMdvPcMeK/P8PPIYTodGbjxiw2+83XdvXu0cnZNSU2KvrvO/mZqd0C+1o7OP73LWw7d/ly92FzG7s/w899MXn478d+Jk4hIYSICTEEA0NjhBCvoa7xGSZb733/OAAATQenbwDQbMZmFqHzl+/+YmF6QmzjQIhCqWzNa6tKixBCbD0uS4+DEDI0NTvwR2Tr33rIpI+Pbt9weMtahNCw0JnNbebh32vH2RuRN68e37X58qF93r366RsaIoTqa6sat6murEAIcQyMWv/uAADtAyMlAGikJ7evxz99KJPJlEplbloScRIEIcQ1MkEI5aUlE5tF/Rnx1gvlMplCoUAIVZQUPbr5O0Koa3d/q06dDUxMaysr7l4+Q2xWV11VWpDXcoY+Q8ewOVwhv8Gyk0O3Xv2a26z0dQGFQhkwJsSrZ2+EUNnrAnffAIRQ/NNHVeUlCKHqitIXD+4hhNz9Aj74BwMA0GAwUgKA5lEqleGHfywvLqTSaGQKRSISkSmUEVNmI4S6+vhR6YxX0U+++ngUQqgoL/ut11aVlayYNISjb1CUnyOTiDu7efboP4RMJk9euPLQd1+f/GHz3UunWHqc4rxsT//ejbNcm8RgsYNGT7h9/sTQkGkkEqnJbUpfF3w5ebiTpw/X0CjxWSSVznBy72ZmbTtgbOiD65fWTB/v0NUtLz1VwKsPHDKSuAIZAKCzYKQEAM2jUCiCQ6fZO7tRqFSWHsenz4ANh855+PdCCBmbWS7Z9IO1vWNZ8WsKjTZz5TdvvZbJ1mOx2EV5WRyuwZBJU9fsP0qlUhFCQaMmLtu6r7ObZ1VJcWF2pqWtQ7eAZgc/Gg2ZNI3N4fYdOaG5DeQyqYd/r/yMlKSYpw4u7qt2/WxmbYsQmv3Fholzl7A5nPT4FywWe8zMefPXb2+Pnw0AQIORlK079wwAUJ1CIe/r5KjFjl4qfRdi8TRzG7vdl++p9I0wulyUPda6c5AJLCYLgEaCkRIAAAAAqAUoJQAAAABQCzDRFQBdYWhqdjoqDXeKjsPj8RITE1++fBkfH19ZWfn777/jTgQAeAcoJQAArfL69esLf0ZGRUUVFxfX19dXVVXJ5fLmLg4CAKgVKCUAAK1y/Pjx+qiXUqm0sYiQyWQajYY7FwDg3WBOCQBAq/Tv319fX/+toREymXz79u3c3Fx8uQAA7wYjJQBgFhsbe+pWhGigL+4gWqJfv35jndzXrVtXWFjYWE2YTGZkZORvv/1WXFzs+gYXFxfceQEA/4JSAkCHUigUZDI5Li7u6NGjQUFBoaGhFRUVAQEBNxgwbNluvLy8rl69unz58ufPnxMr8d+/f5/4llgsTktLS0tLi4uLO3fuXFZW1psdpWvXrsRScgAALGDxNABUTigUslis5OTk77//3tfXd9myZTExMVKptGfPnsSvwI5ZPE0XvLV42oEDB8LDw+vq6l6+fNnk9gqFIu0N6enpDg4Orq6ubm5uRE1hMplNvhAAoArwmQAAlaisrDQ1Nc3MzFy9erWPj8+6devodPrnn3/u5eWFEPL398cdUCcsWbLE29t7w4YNzW1AJpPd3d3d3d0bn8nKykpLS0tNTb13715aWpqlpWXXrl0bO4q+vn5HZQdAF8FICQDtJicnx9HRsaSk5JNPPunWrduOHTtKS0tFIpGDg0PLL4SRkvbS7svM5+Xlpaenp6amEkMpXC63a9eujad7TExM2uuNAABQSgD4IHK5PDExsXv37jweb8SIEf7+/rt3725oaBAKhebm5q3fD5SS9nKhIGOCXRfV3fumqKgoPT298XQPmUwm2ombm1vXrl0tLS1V9L4A6AgoJQC8n4aGhtjY2ICAABaL1a9fPy8vr4MHD8pkMolEwmaz27ZPKCXt5UxOStzhk/P7Dp4xY0YHvF1FRQXRTlJTU9PT00UiETFbljjdY2dn1wEZANAmUEoAeLeKiornz5/7+/tbWFhMmzbN0tJy69atDAajvfZfIuL/kBkXYtOlvXaos26X5Qeb2NLzi/39/SMiImJjYz/55JMOKwe1tbXEbFnidI+VlZVcLm+8rsfJyaljYgCguaCUANC06urqe/fuubq6ent7r1mzhk6nr1y50sDAQEVvNzYqYmUXHyYF5p5/kB8y4372GWDKYBEn127cuEGj0UaMGHHjxg1TU9OAgICODMPn89+8rqegoKDxXA9RUzoyDAAaAUoJAP/i8/nnzp2TyWQLFiy4ceNGcnLyzJkzO2aiwO7MOBM6s6u+UQe8l7YqFwueVpVu9+z932/FxcUdPnx45syZgYGBKSkpb15u02GkUmnjuZ6ysrLnz58Tc2YbL+2BJVIAgFICdJ1AIDh06FB5efnWrVvz8/Nv3rw5cOBAV1fXjk8S8vzWJ/ZuxnRYGKMtFErlwZxXu7v1NWc0O7NHJpNRqdTNmzc/fPjwjz/+YDKZeG/U19hRiAfEEim+vr729vawRArQTVBKgC4qKyv77bffGhoatm/fXl5efufOnYCAAOwrjtcLBJ++etSTa8IhU01pdKWSxOFw8EZSf2SEKiWiGqnoeknuCb9gS2ar5hrX1NSw2WwymTxmzJjQ0NCwsDDVJ303YomUsrKyp0+fElNSGsdR3Nzc2jyNGgANAqUE6ASFQqFQKL799tv6+vr9+/enpaWlpKQEBATY2NjgjoYQQhMmTJDJZHK5nEqlSnt6ye2tlGQSMjGkkCmGRoa40zVBIBAwGUwyBf/S+MY0Bp1M9TEwmW3fljMyFRUVT58+HTduXEpKyp9//hkSEmJtrarLid9Xbm5u4zhKamqqqampm5sbMY7i5uYGhRVoJSglQGuJRCImk7l169Z79+5FRETQaLT79+/7+/ubmprijvY2Pz+//55HsLOz+/bbb729vTGFasmGDRv8/f1Hjx6NO0i7kUgk586dq6+vX7p0aWRkpIWFBfaRs7cUFBSkpqYWFRVFR0enpqYaGxu7ubm5u7u7ubnBOArQGlBKgPZQKBQCgYDD4ezatSs8PPzChQudOnWKjo52dXXlcrm4073DW71ET08vLCxs5syZWEM1Kzc3VyaTOTs74w6iEs+ePdu3b9+GDRtcXV0LCgo6deqEO1ETiI6SkpJCDPuZmZkRHcXT09PFxYXFYuEOCEBbQCkBmk0sFsvlcjabvXHjxoiIiIsXL3bu3DkpKcnFxYVOp+NO9x769u0rEomIx0qlMigoaM+ePbhD6TSxWMxgMJYtW1ZSUnL8+HE9PT3ciVqSn59PdBQ+n3/nzh1LS0tiJgoxlAJzZoGmgFICNA+Px+PxeJaWlnv27Ll06VJ4eLiVlVV6erqGLvxw/vz5n376KTQ09Pjx42QymThxc/z4cdWtifLhSktLz549u3LlStxBOkJOTo65uTmHw1m2bFlQUFBISAjuRO+Wl5eXmppK1JTU1FRra2s3NzcPDw+ipsC1x0BtQSkBmqGhoaG8vNzJyenixYs//fTTzp07AwIC8vPz7e3tcUdru6tXrx48eHDo0KGLFy9ms9nDhw+vrKw0NDRcv359UFAQ7nQtEYlEgwcPfvLkCe4gHSorK+vBgwdz585NS0tLSEgYM2aMpszkyMnJSU1NTU5OTk1NlUgkCoXC3d3dw8PD3d0dy9XvADQHSglQX7W1tUVFRR4eHn///ffGjRtXrVo1evToyspKNZyp+r5u3rx58ODBwMDARYsWGRsbNz4fGBg4adKkL774Amu6VomOjvb29m7HtfY1CI/HO3jwoFwuX7NmTW5urpWVlWadH8nIyEhJSUlOTk5JScnMzCQKCtFR3nlHawBUCkoJUC91dXX5+fndunWLiYlZvXr1J598Mm3atLq6OnU+l/FeHj16tG/fPnd390WLFllZWeGOAz5UTEzM8uXLV69ePWbMGNxZ2kIulxMFhegoFRUV7u7uvXv3trW19fDwsLCwwB0Q6BYoJQC/urq67OxsX1/fly9frlq1asaMGXPmzOHz+Wo+tfB9RUVFHThwoFu3bpMnT9aCz6O7d+/u0qXL2LFjcQdRC9nZ2U5OTlu3bpVKpV988YWmnNb5Lz6fn5KSkpeXFx0dnZycLJfLiUEUT09PDw8PWBwFqBqUEoCHQCBITEwMDAwsLi6ePn16SEjIokWLtK+IEOLj4w8cOMBkMpcsWaI1p/BPnz5dUVGxYsUK3EHUCHELQF9fX1tb2z179owZM6ZLF82+83NlZSUxiJKUlJScnGxiYkJcdUzUFLyL9AOtBKUEdBy5XB4TE9O9e3cGg9GvX7+goKAtW7ZIJBLNunb3vWRmZh44cIDH4y1ZsqR79+6447QnYpFcuI6jOefPn3/x4sWuXbtev35NpVI75raOqpafn5+SkkIUFBKJJJPJvLy8PD09vby87OzscKcD2gBKCVC5ly9fOjg4GBsbjx8/3sbGZs+ePVrcQhqVlpbu378/Jydn2bJlvXs3cd9aoCMKCwsXLFgwderUadOmyeVyCoWCO1G7SUlJefXqVVJS0qtXr+rr64l2QtQUONED2gZKCVCJjIwMDodjbW09f/58hUKxc+dOIyMj3KE6CJ/P379//+PHj5ctWzZs2DDccVRFqVTOmzfv8OHDuINoBmJl2F9//TU/P3/58uXm5ua4E7Wzuro6op0QNcXU1NTT09Pf39/Z2VlDFxACWEApAe2mpKREIpHY29tv2rQpNTV127ZtDg4OWvbR8J0OHTqUmJg4YMAAjVhi6wONHj368OHDcA3Re7lz546hoWFAQEBERISnp6cWTHluUl5eXlJS0uvXrx89epSXl+fl5dWtW7du3bp5eXkZGqrjPSaBmoBSAj4Ij8erqKjo3LnzkSNHrl69+t133/n4+GjrfNWWXbt2bffu3VOnTp0/fz7uLB2Ex+MxmUyYVtI29+/fP3jw4M6dO52cnLS7u4vF4levXiUmJiYmJr569YrL5RIdxcvLCwZRwFuglIC2yMvLc3BwuHPnztatW9euXRscHFxbW6uzH4CeP3++e/duDw+PlStXwql08F6EQiGLxZo0aZK7u/uGDRt0oeEU5/JWAAAgAElEQVQVFBQQHeXVq1d5eXkjRowwMTHx8fHp1q0b/PMBUEpAa5WVlVlYWCQlJc2dO3fBggWzZ8+urq5+czVSHVRSUnLixImCgoKVK1dq+sWfbXDq1Ckej7dw4ULcQbTBzZs3Bw4cSKfTjx8/PmnSJB2p+GKxOCUl5eXLl/Hx8YmJiZaWlt7/Y2trizsdwABKCWgJcaPUqqqqOXPmeHp6bt26tbKy0sDAgEaj4Y6G3/79++/evbtu3bqAgADcWfC4e/fu33//vW3bNtxBtMrBgwefPHly5syZ+vp6LpeLO06HysrKSvgfoVDo7e3t4+Pj7e3t6emJOxroIFBKQBOIM9zz58/Py8u7c+dOQ0NDfX29jY0N7lzq4ubNm9u3bw8LC5s1axbuLDjJZDKRSARD7iqSn58/d+7cL7/8Mjg4GHcWDKqqqhISEuLj4xMSEkgkEoPB8Pb29vX19fHx0aw7DYH3AqUE/EOpVJJIpH379t26devSpUv6+vrx8fE+Pj64c6mXjIyM7du329jYrF69Wgcn84IOVl1dHRcXN3jw4IcPH5JIJDW/d7TqKBSKuLi4+Ph44kSPg4ND9+7du3fv7uPjY2JigjsdaE9QSgC6ffv21atXv/76a3t7+zt37vj6+pqZmeEOpY527tyZkZGxdOlSb29v3FnUxZQpU44fPw6fXFWtpKRk586d/fv3Hz9+vEAg0Nx767SLtLS0uLg4oqbo6en5+Pj4+vp2794dpqFoASglOiojI+PKlSvBwcF+fn4XLlzo0qWLn58f7lDq69GjR+vXr1+4cOHkyZNxZ1EvoaGhO3bscHR0xB1EJxCX6nz99dcIofXr10MXJK7liY+Pj4+Pj42NlUqlvr6+PXr08PX17dSpE+5ooC2glOgQsVh869YtfX39wYMHnz59msFgjB07lsFg4M6l1kQi0fr166VS6aZNm/T19XHHUTtCoZBGo+nChaxq5c6dO97e3mZmZrdu3Ro9ejTuOOqirKzs5cuXL168ePnypVAo9PX19fPz8/Pz09YV6rQSlBLtl5WVlZubGxwcfO3atcTExNmzZ8Ots1opPDx89+7dmzZtGjx4MO4sADRhw4YNr1+/PnLkiFQqhWvi3lRRUfHy5cvY2NjY2NiGhgY/P7/evXt7e3vDCIqag1KitZKSkjw9PfPz87/88stZs2aNHDkSdyJNwufzt2zZwuFwiKFy0JwTJ06IRCLdWcRWDYlEIiaTeePGjejo6JUrVxoYGOBOpHaqqqpiY2OzsrLu3bsnkUh69OjRs2fPnj17wuQ5NQSlRAtVV1dPnDgxKCho06ZNMpkMhtbf1+3bt7du3bp//35tvfhIoVCIxeJ22VViYmJWVtbEiRPbZW9vgrNC7ysiIkJPT2/gwIFw3VwLSktLX7x4ER0dHR0draen17NnT39//549e8KV7WoCSomWUCqVa9asSUxMvHnzJp/PVygUMAGibdasWUMmk7ds2YI7iAqJxeK6urr22htxMXl77a0Rl8uFiZxts3nz5qSkpAsXLuAOou7y8vKio6NjYmKio6Pt7Ox69eoVEBDQo0cP3Ll0GpQSzfbs2bPLly+vXbuWy+Xev39/0KBBWnxbL1WLiYlZvnz5hg0bhg4dijuLarVvKVERKCUfIisrq0uXLsXFxU+fPtWFG1Z/uNTU1JiYmCdPnsTHxwcGBvbq1SswMBBmyHY8KCUa6c6dO6ampn5+fgcPHnRzcxs4cCDuRBrvyJEjMTExe/fu1YVfhO1bSmpqaoyMjNprb42glHw4uVy+c+dOgUCwefNm3Fk0hkwme/bsWVRU1LNnz0QiUa9evYiCAosldgwoJZqEmLt6+vTplJSUzz77zMLCAnciLTF//vzg4GDd+UDZvqWkqqrK2Ni43c/gQClpL8TqJr/88guHw5k+fTruOJqktLQ0KiqKKCgBAQHu7u79+vXTwVtvdiQoJZohPT09LCxs9uzZc+fOxZ1Fq+Tm5s6YMWPfvn06tXbcf0vJwYMHIyIi3tps6tSp06dPX7ZsGZ1O37VrF7GZo6Pj8OHDG7fJzs5eunTphg0bAgIChELhpEmTJk+eTNwSKC4u7sSJE3l5eTQazcXFZdWqVUZGRpGRkdu2bfv+++89PDwad7Jq1SqJRLJ///433x1KSfuSSqUHDhwYOnSos7MznU7HHUfzvHr16uHDh5GRkTwer2/fvkFBQX369MEdSgvB5Hb1JRaLjxw5UlZWtnHjRi6Xe+/ePRaLhTuUVrlx48axY8fgB0sgkUhvfYxu8tashw4d8vb2trKyanlv2dnZGzZs6Ny588yZM8vLyxMSEmD0Gy8ajbZixQqFQiGVSgcPHrx+/fr+/fvjDqVJvLy8vLy8lixZUlpa+vjx4wsXLnz22Wf9/geuLm4vUErUjkKhuHfv3rBhw/Lz8xkMxsqVKxFC7/wdAN7XgQMHysvLL1++jDuIuiCRSB9//PE7NxOJRLt27dq5c+ebU6plMtlbm0VGRsrl8s2bN3O5XNVdoQPeF5lMZjAY4eHh4eHh/fv3Ly0ttbS0xB1Kw1haWoaEhISEhCiVysjIyMjIyEePHtXU1AwaNGjQoEGwNOUHIuMOAP7F4/EQQsHBwenp6QghFxeXsLAwWApJFb788ksbG5tNmzbhDqJ5mExmUVHRpUuX3nxSoVC8tZlYLCaRSI1rjEIjUSuGhoZhYWHENbGLFi0ijjzgfRH3bf7mm2/27t37xRdf1NXVLV26dMqUKYcOHcrKysKdTlNBKVELKSkpISEhBQUFCKH79+8vW7YMdyJttnjx4mHDhk2YMAF3ELVT8YbmtmEymYsXLz579uybh93/zlHw9fVVKBR79+5taGj4707q6urefK//DrSAjhEYGDhr1qy4uDjcQTSep6fnsmXLfv/9982bNyuVym+++WbChAk//vhjRkYG7mgaBk7f4JSVlfX8+fNp06bxeLzvv/++c+fOuBNpv08++WTevHmBgYG4g6gdhUJBTFAlXL9+vbkFVfv16xcVFbVr1659+/Y1tzd/f/+ZM2eePXs2NjZ27NixoaGhb07c+e67797aHq5owCUgIIB40L9//927d+vUjG9VcHZ2dnZ2nj9/fkFBwV9//bVnz57KysoRI0aMHDkSzpS1BpQSPBQKRXl5+TfffEMMivTs2RN3Ip0wefLkr7/+2tvbG3cQdUQikdavX9/4Zcur8C1atGjhwoXHjh0LDg4mLjr97zZTpkwJCgo6f/78hQsXHj58+P333xsbGxPfmjNnzpv3RTt27Fh7/klAmzx48ODKlSt+fn4CgYDNZuOOo/E6deo0e/bs2bNn5+Tk3Lp1KywszMrKimgnMLO+BXD6pqM9ffp04sSJCoXC0NDwwoULcFFZh9mwYcO3334LjaQ5JBIp4A0tzwLhcDgrVqz4448/EhISWpgyYm1tvXLlyq1bt5aVlb05DcXd3f3N94LbjqgDEok0adIkhNDRo0ffmjMEPoSjo+PixYtv3LixePHi9PT04ODgHTt2PHnyBHcuNQWlpONkZmYSC2Ps2bOHSqXCGgwdacmSJf3793dzc8MdRHv4+vqOHDnyxIkTxESTFrb09vZ2dnYm/v4D9bdkyZLs7OyKigpYxap9de/e/euvv378+HHfvn0vXLgwcuTII0eO1NTU4M6lXqCUdITs7Ox+/fqJRCKE0LRp0+zt7XEn0i179+4NDAwcNGgQ7iDaJiwsrLnlGeRyuUAgIB7z+fzS0lJo4Rpk9erV+vr6cXFxsbGxuLNooT59+uzfv//YsWNisTg0NHTVqlXPnj3DHUpdQClRrVu3bhGXR965c8fLywt3HF0UERFRU1MDq2urApPJXLVqFZlM/u81pTExMXPmzNm9e/exY8dWrFhRX18/btw4TDFBWzCZTF9f319//bW0tBR3Fu1kYWGxaNGiP//8c9SoUadOnVqyZEl4eDjuUPjBRFdVkcvl48ePHz16NHEGHXccHZWamnr+/PnTp0/jDqK1XF1dQ0ND/zvOz+VyHR0do6KiSCSSo6PjkiVLunXrhikjaLtDhw4VFRXBGmsqNXDgwIEDBxYXFx8/frxfv34zZsyYOXOmzo4swr1v2l9WVhaXyzU0NKysrLS2tsYdR3cplcpp06YdP34c7vTxlva9IR9xNRmZ3M7DrnDvG/XB5/Pnzp177tw53EG0n0AgOHXq1MmTJ0ePHv3pp5+ampriTtTRoJS0swcPHvz888+nTp2CX4TY7dy5097efvLkybiDqJ12LyWqAKVErWRkZOTm5g4bNgx3EF1x+fLlw4cPDx48eOnSpTp1CTHMKWk3xIww4kJfaCTYpaWlJSQkQCPpGPX19XK5HHcKoEIuLi6DBg2C9Uk7TEhIyJ07d+zt7YODg3/66SfccToOlJL2cfbs2Zs3byKEfHx8cGcBCCG0efPmdevW4U6hK5RKJYy5aj0ajebo6Ni4AizoAJMnT378+DGLxRo6dKiOLB4Dp28+VFVVlYmJybNnz2DlcvVx9erV0tLShQsX4g6ipmBOCWgzpVJZUFAA6xp0vO3btyclJW3YsMHZ2Rl3FhWCUvJBwsPDJRJJa274DjrSsGHDzpw5o4NzxFoJ5pSADyESiTIyMuByqo6Xmpq6cePGHj16rFq1CncWVYHTN20nEonS09Ohkaibu3fv+vr6QiPpSHw+XyKR4E4BOgiTySwqKlq7di3uIDrHzc3t/PnzNjY2EyZMyM7Oxh1HJWCkpI1u3rw5dOjQ5m6jCjCaM2fOihUr4GNcCxQKhUwma8cdnjx5snPnzv369WvHfVKp1HY/JQTaUU5ODoPBsLGxwR1EF1VVVS1cuHD8+PFTp07FnaWdQSlpi7CwsO+++87Kygp3EPC25OTkHTt2nDx5EncQ3VJXV0elUvX09HAHAR2qqKjI3NycRqPhDqKjfvjhB5lM9tVXX+EO0p7gg0hbrFmzBhqJerp///7MmTNxp9A5BgYG0Eh0UG1tbVhYGO4Uuuvzzz8fOHCgli18ACMl7+fhw4fe3t6Ghoa4g4CmDRky5NKlS0ZGRriD6JaTJ08yGAwtOziC1rh7966JiYmfnx/uILorKytr/vz59+7d047TndrwZ+gwa9asEYvF0EjUVlpamoWFBTSSjieVSquqqnCnABgMHToUGgleXbp0CQ8P15ohKxgpaS2BQEAikXRquV+Nc/ToUaFQuHjxYtxBdI5IJJLL5XAGRzfdv3+fRCINGjQIdxCdVlVV9fHHH9+9exd3kA8FIyWtUl1dHRcXB41EzT158qRPnz64U+giJpMJjURnBQQEbNy4EXcKXWdiYvLzzz9rwYqRUEpaJSwszM7ODncK0BKFQlFTUwPL/GNx9epVnbo9B3gTh8P5/fffeTwe7iC6zsnJaciQIVu3bsUd5INAKXm3jIyMLVu2dOrUCXcQ0JKcnBy4NBEXmUzW0NCAOwXAxsjIiMPh4E4B0KRJk0pLS588eYI7SNtBKXk3FxcXd3d33CnAO+Tk5Dg6OuJOoaNGjBjx6aef4k4BcBo1ahQUU3Wwe/fu48eP407RdlBK3uHatWtHjhzBnQK8W3Z2tpOTE+4UOorD4ZiYmOBOAXDq27dvdHQ07hQAUalUf3//X3/9FXeQNoJS8g6nT58ODg7GnQK8G4yUYPT48eNDhw7hTgFwWrNmzeDBg3GnAAghNG/evIiICLFYjDtIW0ApeYdLly7BbBKNUFpaCrfhwKWuru7169e4UwCc5HJ5bW0t7hTgH8HBwefPn8edoi2glLREIBAIhULcKUCrVFZWwrJpuPTq1WvOnDm4UwCcKBTK0KFD5XI57iAAIYTGjh17/fp13CnaAkpJS1atWpWYmIg7BWiVmpoaKCW4GBsbd+7cGXcKgFnv3r1LSkpwpwAIIeTg4ODh4ZGamoo7yHuDUtKSuro6uO5GI/B4PAaDAZcE4xIbGwvzwcHevXttbW1xpwD/MDExiYmJwZ3ivUEpacmZM2f09fVxpwDvVl1dbWxsjDuF7qqpqcnIyMCdAmAmFAolEgnuFOAf/v7+8fHxuFO8NyruAOpLLpeLxWI2m407CHg3Pp8Py5xj1KNHDzh9A3755RczM7Pp06fjDgIQscJWcnIy7hTvDUZKmhUdHf3ll1/iTgFaRSwWMxgM3Cl0l6GhISwSA4yMjGCkRH2YmpoKhUI+n487yPuBkZJmSSQSU1NT3ClAq0gkEjqdjjuF7oqNjY2OjtaCm4GBDzF79mzcEcD/079//9evX3ft2hV3kPcApaRZ/fv379+/P+4UoFWglOBVW1ubm5uLOwXATCwWKxQKuJu6+qipqamursad4v1AKWkWn88Xi8UwfVIjQCnBKyAgwM3NDXcKgNmVK1eKiopWrVqFOwj4h6GhocataAdzSpr1+PHjkydP4k4BWkWpVMIiJRhxOBxra2vcKQBmRkZGVlZWuFOAf5mZmYlEItwp3g+MlDSLQqEoFArcKUCryGQyHo+HO4XuevLkSWRk5OrVq3EHATgNHz4cdwTw/8jlcoFAgDvF+2m6lEilDRKJrt+Eulcv91693Pn8YtxBwLuJRDVyuVBt/2fp6Wn5KIJIJNK4U9dtIBbXymQadojvSDKZTKlUwhqG6kNPjyKTNajngZFMprJY5v99nqRUKv/7bHLyr5mZZ2k0nV6iQy5XyuWITifhDgLeLTFRlJEhCQnh4g7SBIGgIjT0Be4UqiWRSMRisdavNPj8+Telpc+oVJi91LSoKEFNjXzkSC3/a6BB7t/nU6mof3+1W8NJqVSQycyRI6/991vNnr5xdh7p4fGRioOptYiIBy9fJn/77WLcQcC7kcmRfP7LUaM+wx2kCZcuheKOoHJ0Ol1HJhr7+My0tw/CnUJNCYV3iosrRo2CxdPURWHhBQqFPGqU2h2CeLzSR4+2N/ktmFPSLAaDzuHo9FiRBqHTqebmcJ0UNlFRUZGRkbDYoI4LCRmGOwL4fwwMOE2eDFFnUEqaFRzcOzi4N+4UoFUkEll5ufbPaVBbAoGgsrISdwqAGZ8vkMnkBgZw+kZdNDTwNa2TQClpHo/HF4kkpqZwoSkA79C7d29vb2/cKQBm16//XVRUvmrVHNxBgAaDdUqa9eBBzIEDZ1ve5vXr0h49QiMiHjQ+U15etWDBxje3WbBg4+jRLS2/PWDArL17m14QpbCwZPnybQMGzOrTZ9r06V/Fx6e+5x8CgI7AYrHgngyARqMyma2dWsTnC3bs+G348HmBgR+PG7ckPPyuitMhhNDnn++cPv2rDngj0GYwUtIsKpVCp7/j2jZra3MulxMVlTB69ADimUePXsTFpTY08PX19RBCQqEoPj5t/PhBbQggFIoWL/5OLJbMmDFWLJY8e5ZAIsGlQEAdPX78+MGDB2vXrsUdBOD0XnNKNm/+5cGDmI8+GmZkZJCYmN7QoGH3jQMqAqWkWcOH9xs+vF/L25DJ5N69fZ49S1QoFGQyGSH08OELuVz+9GncsGF9EUIxMUkymaxfP782BIiPTysuLt+164sBA3oihBYt+vi9Xq5UKlVaYlS9f6BBxGJxfX097hQAM7lcrlQqqdR3/1qRSCT37z+bPHnEypX/3MOvveZjwnFP00EpaZZYLJFKpRzOO67wDgrqcfv247S0HHf3LgKB8MWLJBKJFBkZS5SSp0/jmExGjx6eCKEXL5IOHDibkZFnbGzg7++5ePHUxgkrmZn5YWHr0tJyLCxMpk8fM3FiMEJIJBIjhJocrRGJxL/9dvnu3afl5dVWVqajRvWfM2fC339Hr169e9euL06d+iM5OWvWrHFhYRMPH758586TsrIqU1OjUaOC5s//iEKhEMOY2dmFrq6dnz1LIJPJffp0X7FiprGxIbH/JqPW1tYPGRL22Wcz0tNzHzyIcXXt/Ntvm1Xwgweap0+fPr6+vrhTAMwuXrzdyjklMplcqVQyGP+e63nzN/3ly3dOn44oL6+ytjYfPrzvjBljGQy6RCJp7mj20UcrnZzsnJzszp+/JRKJb9/+lcPRi49PPXTo0qtXmQghPz/3BQsmu7o6Evs/dOhiePg9uVwxZEjgypWzGq9mb/J94bjXwaCUNOvevacvXrx7nZLevX2oVGpUVIK7e5eoqASlEk2YMOTevacymYx4vmdPLwaDHh2duGzZtpEj+02ePKKuruHcuZsLFmw8fXoHk8lACGVk5M2YMXb48L43bjzcuvWQUCieNm109+5uTCZj//7T1tbmDg42je8ol8uXL98eH582ZcoIFxeHnJzC/Pxi4h8nQmjHjiOLF3+8cOHkTp2sKBTK8+eJQUE9bG0t0tNzjx69wuVypk8fQ2xZXl4VEjJ0xoyxqanZBw+ez84uPHlyG5VKbS4q8aojR8JDQ4f98ssGCgUmJIF/MJlMJpOJOwXATF9fz8ioVQsYstksD48uly7d8fNz7927+5vfOnTo4unTEVOmjHB0tM3LKz558npBQcmmTUtbPppFRcWLROI9e74SCEQcjt6zZwmffbbN2dl++fIZCoXy0aMXMpmc2DItLYfJpC9bNj09Pffs2RumpkZz54a08L7Eq+C412GglDSLyWRwue9eCI/D0fPxcY2Kig8Lm/To0QtfX7eJE4dcuXIvISHd1NSoqKhs9uzxCKHvvz82ceKQL78MI14VGOgdErI8Kip+4MAAhNCoUf1nzhyHEJo4cUhY2Lpff704ceIQQ0Pujh0rN2z46aOPVg4d2nvx4qlWVmYIofv3n714kbRu3cJx45qYqjJ58vDGCS4IoRMntjV+BHn9uuyvv543/jN2dLQjHnt4dOFw2GvX7n/6ND4oqEdzUbt3d0MIeXm5LF48tT1+wEB7REdHP336dPny5biDAJzePPK805Ytn61a9f2yZVt9fFwXL55KHF4qKqqPHr26ZctngwcHEpuZmRlt23Z41ao5XC6nhaMZlUrdunU5i/VPM96165i1tdnRo5uJUZDQ0H8nu9jaWv7667cUCmXUqP65uUX37kXNnRvSwvsSX8Jxr8NAKWnWkCG9hgzp1Zotg4L89u07XV/Pe/Ikbt68UFdXR1tby8jIWEtLU4RQ376+JSUVubmvCwtLr179880XlpVVvbUrCoUSEjL0229/SknJ9vPz6NPH9+rV/efP3zx16o8HD2L27//a19f96dN4BoM+enT/JsP07On15pfV1XWHD1969iyxvp5HfJRp8lXEh5WkpExnZ/uWo761fwAQQg0NDcXF6nh/DdCRlEqlUqkkZte9k62t5ZkzO2/dijx06NKnn65ftmz6zJnjnj9PlMlka9fuW7t23//2iYhhXS6X08LRzNOzS2MjKS4uz8srWrx4apOrDHM47MZxZScnu1evMhBCLbwvcZIdjnsdBkpJs2QymUKhaM3i2UFBPXbvPvHbb5dra+v79++BEBoyJPCvv57b2lq6ujqamRknJWUihObNCx00KODNF5qaGv53b2ZmxgghHu+f+37p6+t9+mno2LEDw8LW7dx55Pz5H6qqas3MjBv/Xb2FzWY1Pq6qqp027Us2m7lw4WRbW8uDB8/l5zf9m4PDYZNIJIFAVFVV21xUYvyTxWK88wcCdE2vXr26deuGOwXA7Pz5m++1TgmFQhk9esDQob1Xrdp18OD50aP7V1bWIIT27l1jYWHy5pa2thYtH80aGwnxSQwh9NYemgtAHNZaeF/iUAzHvQ4DpaRZt28/bs2cEqLyOzjYXLx4x83N0cLClBhlOX7896Ki8jlzJjQ2epFI/ObUkObU1NQhhExM/l9fsbAwHTas74kTv0ulUn19PaI6vFN4+N3q6rrjx7dYWpohhCwtTZsrJRUV1Uql0sLCpIWotbXqe3kFjUaFZe4wYrPZbDbck0HXMRh0Nvu9pxbR6fQpU0Y8fRqXmVnA5XKIJ/97/Gn90Yy4PUgrD5KEFt5Xo+npsTRumXmYs9MsDofd5EhGk4KCeshkMuLaXYSQq6ujnZ2lXC4nLgbu1MnK0tL0+vW/hUIRsYFMJpNKpU3u6s8/n3G5HBcXe2KR4MbnMzLymEwGmUz29/cUCkV37jxu/JZMJmtyV7W1DUZGXOLfMPFlc38/r137CyHUrZvLe0VVH1KpjPisA7CIjIzcuHFjKzYE2mzixODWr1zw1sENIcRmM/39PUkk0oULtxq/1Xggav3RzN7e2tzcJCLiYeOBUalUKhSKFsK08L4ajc8X8vka9geBkZJmDRjQs7FkvFNQkN/Jk9cGDPBvfGbIkF7Xr//t7u5EXO32+eezv/hi1+zZ34SEDJXL5RERD0eO7Dd16mhi44iIhyYmhiwW48mTuMjI2C+/DKPT6bW19WPGLA4I6ObgYJ2UlPXiRdLs2eMpFMrIkUEXL97esOGn5OQsFxeHrKyC588Tz5zZ+d9UPXp4XLx4++efz3t7d/3rr+dPnsQpFIra2npDQy5CKDu74MCBM506WSUkpF+79lefPr7e3q4IoZajAvBfEomEz4fFr3Tde80pWbhwE5NJ9/Jyqaiovnv3qbe3q4dHFzKZPGXKiHPnbq5YsX3AgJ6VlTUXL97et2+Nq6tjy0ezN5FIpGXLpq1du3/27G/GjBlAJpNv3Hj40UfDR45s9vbOdnZWzb3vB/9UwPuBUtKs1s8pIeZme3o6Ozl1anwmOLh3VVVt41zxgQMD9u5d/csvF3/44TiHw+7e3dXX1534FoNBnzFjbETEg/z8Yhsbi8bLaqRSWb9+frGxyc+eJdjZWX7zzfzx4wcT2//yy4Yffzxz82bklSt/WlubDx3au8nBkkGDAufODbl48fbFi7eDgnocP75l/foDFy7cnj//I4SQsbFhUlLWhQu3mUz6pElDly6d+s6oADSpR48ezs7OuFMAzFo/p0ShUPj7e/711/MLF26Zm5vMmDH2k08mEG1m5crZFhamFy7ciopKMDU1HDiwJ3ED8KIyjC4AACAASURBVJaPZm8ZPrwfk8k4fPjynj0nDQ313dwcO3WyajlSc+8LOhipyRNOycm/IlTr4dHE/2ytN336VyQSSS6X19U1SCRSS0szuVwuFIqvXt2PO1p7+vzznWVlVY2rj2i6W7cinzx5+d13n+EO0oRLl0JDQ1/gTgHawfPn31hautjbN/uBWzd9/PEqGo0qkUgbGvgKhcLMzFgikUqlsvDwfbij6bpDhy4qlajJ3oYXj1f66NH2kSOv/fdbMFLyNiMj/adP4xtHOGpq6ol73ODOBYD6iomJiYqKWrZsGe4gAAM2mxkfn9Z4zKyoqEEIOTra4s4FNBJMdH3b7NnjG2diN5owoS131ANAR9TX179+/Rp3CoDHrFlvHzMZDPqUKSPwJQIaDEZK3ubn5+np2eXNwRJra/MpU0bhztXOfvjhS9wRgPaAe9/osqCgHm5ujtHRrxqfsbExnzhxKNZQQFPBSEkTZs0a37hMCIVCHjdu0Jsr8wAA3sJkMo2MYJ0Y3TVt2ujG9VXpdNrkyTBMAtoISkkTevTw9PTsQjy2s7OaNm0M7kQAqLX4+PjDhw/jTgGw6dPHt2tXB+Kxra3lpEkwTALaCEpJ02bOHGdiYkgmk8aOHchktuqqYAB0VlVVVWZmJu4UAKfZs8fr6+vR6bSPPoJGAtpO7eaUCOSKhv/dYxojK7cuzv5eBQWlA8YHl4nxr2dKJSETOg13CgCaFhQUFBgYiDuFjuLJ5Hx5S8uVdozO3T0cfT1qaur7jRmsDsdMGgkZwzFTA6lRKQkvqrhaUqlQIgVSj7X6xw1hI/Rleh7uHAghZMNk5AtEg80MFzpq1a0Z2gudDve+wWDMmDGNNwdWKpXE3HClUvny5Uvc0XTC+cLyP0qraGSSqMU11DvOlFF0hD5LVIsxM0sGvVgkCTYz/LSzNe4s2HA47JbX11dD6lJKfsgs5EllU23Nods2hyeT5wqEs16kHvbtSm/dQs66QyKBe99gMGvWrD179ojFYmJtb6KRBAQEtOKl4EN9l5ZPJaHpduZGcMxsRr1UlsMXLYjPOODtTP3f1ZQ6hccTaNr9+NRjTsmuzEKJQjnM0gQaSQs4VIoXlzPK0mR+XAbuLAAghFBISIi19f/7GMrlcmfOnIkvka7YnJbPopAHmBlBI2kBl0b1MeT0NuIuTVCLwRvQGvhLSXI9v14qG2jW2vvx6rhObKYXV+9KUQXuIAAghNBHH31Epf474Orm5tarVy+sibTfi5p6pVLZx8QAdxDN4KLPdmSzbpVW4Q4CWgV/KcnkCXVyXK3tDGjUhHoe7hQAIIRQaGiojc0/85y4XO60adNwJ9J+6TwhCQ6a74NDpSTUw12sNQP+UlItkVoxGLhTaBJLJl2qcecJgfaaNGkShUJBCDk7O/fp0wd3HO1XJ5Nbs2CdgvdgyWSIFXDM1Az4S0mdTA6/Yt+LQomKhBLcKQD4x5QpU2xtbdls9vTp03Fn0Qm1EqkMfsW+D6VSUSIU404BWkVdrr4BAHSMEhG/Tipx1TeKr6tIqKuqEAu9DUyDze0S6irvlRe27XHA5ImPGypFXR0QQh+yn4S6ymc1ZV5ck97Glhm8WqRUuujDld4A6BAoJQDoBJ5MyqHSLhdl3SzNUyBEIZFqJOIGuZREQnfLC77PfKlEqO2PbTgkEuf7zJcfuh+EEEIRJblGNAaZRFIolR5c4+VdfCrFQhvW2/fuBgBoHyglAGi/n3NeJdZX1UnFlRLRm8+TEGpcqlBdHiMkVshLxQLimdIKQWRVsQOba0JnfuXiy6bAFbAAaDMoJQBosxIR/35F4a3SfJES/90b2kaiUGTwahFC32fETevUtYseXAoLgNaCUgKA1jqRnxpXV5HSoCVr3T6pLnlVV2mvp/+DVz/cWQAAKoH/6hsAPhydTjUzgxmR/8+LmrJbZXla00gI9XJpWn3Nnqx43EEA0AD6+nocDgt3ivcDpQRoA4lEVlGhVb99PxBfJt2XnVAt1cJLx6VIeassP7qmDHcQANRdQwOfxxPiTvF+oJQAoG3ulxUujn9QJtawg9F72ZwavTrpCe4UAIB2BqUEAK1SLhYcK0wt/t/VK9pKrFTkC3h/V7zGHQQA0J6glACgVeqlknKtHiNpVCUVpfHgnB0AWgVKCQDao1TE35edgDtFx/mjJPf86wzcKQAA7QZKCeLV1b94+OhD9lBW+Dr1ZVz7JQKgjQ7lJqfzanGnaFbCNztilnzTjjuUKZXXS3Jr/v+KcEDV4JgJVEfXS0lVWdnSMROvHDnR5j08+/Ovzz+a9uLh43bNBcB7E8ilYoVar5BWn5HN6WzXvvukksgKBHen6zhwzAQqpeulRCaRSqXSD9mDkM9vvzgAtB2bQquSqu+YgbSBLyop53S2b9/dlooFArlaVzEtA8dMoFI6tKKrVCI59+PPz/9+IBIIrDp1mvDJLAdXl88/moYQKsjMmt6rP0Jo37VLJubmz/78O/y3oxUlpTQqrYuX+5TF8+2dnRFCt85fOrPvgF9QPwGvITsllclkhsyfe3THDwihOxcv37l42dzGevflc7j/oEBH/V6ck8OvV9HOFRJp3rmrJbcfiMoqGKbG9h+Pt5swAiGUd+ZKyb1It5XzMn463pCZyzA1dl05zzTQl3hV+ePovFPhDVm5DFNjm1FDEEIcx07tnu2rpCdn/Ye1+25BRUnJqT0/pr5MIJFJjq4uM1Z+xmSzmjxmXvj50JNbd+tqqvW4XO/AgKlLF+kbGiCE9nz1Teyjx8GTJqTEviwrKnL18e47YviR7bvgmAmao0MjJdeOnbx7+QqVRvf092+oq5dKJAwG06d3IEKIzeEEDhkYOGQgg8FECMmkUrlM5uLloW9k8Op5zI7lX0hE/34AjX0U2VBTGzh40ICxo206O3R264oQsuxkGzhkYPc+vbH+EYFOe1hZpKI9K6TSl59vzD1x2axvT481Sw08XdN++LUhOx8hJBeJedl5ydsPWAYHuSz7RNrQkLb3MPGq19fuJKzeSmEx3FYtMOsXkHXoNEJIr71P3yCE5EplgUBVbUyX/bxxy8vIJ5adbFy8PHLTM1h67OaOmfy6en1DA5duXkihiLx5+9CW7W/u5174VSMzU99+fQdPHGdmbQnHTNACHRopKczJRQhN+nRO0MjhMqlUiRCNRpuxfGn802emVpZLNn/buGWf4cF9RwwlHhNNP+VlnE/vXsQzZtZWm47+SmcyiS8HjRtzJDXdOzBwxoqlOP5YAPzDkqmX3FCtij3nngqviUvy3rbGvF8AQohlbVl696G4okrfyV4mEFLZrB4HtjCMDRFCDenZxTfvI4SEJeXp+46YBwV22/IViURCCPGy83k5+XQDbrvHs2FyTOgatpa2RijMykUIfbZ1k6mlpUggZLJZCKEmj5lzvvqc+L8sEgi/mDw9/kmUgM9n6+kR3w0cMvDNjeGYCVqgQ6XEp3dg7KPHZ/YdqK+qHjRhDJuj39yWNZWV10+cfhUdU11eSSIhhFB5cUnjd7v36d3YSICaIJEQlUrBnQIze3azf6U/hFKpLLxyi2lpru/kIK6s5uUWZP16mm5kYNjNDSHEzyvUc+xENBKEkFwoonH1EUJFf9xTyGTOi2YRv6v+r737DIyiatsAfLbX9N4TEkIJhI5IlyIQULDQBFFQwQYoCCrwguUVC0UFFEVBFAURAWkC0kPvCTUF0nsv29t8P5YX+SAJKbt7Zjb39ce4ZeaeaJ599syZM4QQk0qtjLD9uRtCiMZiVAhF9thyM9epd49T+w8ueXvuyBeff2TQgDpemX4zecfPv2YkJVdVVjIWM8MwpQWF8sgW1md71PlesB8ej0e4Ng28GZ2+eWzkE5Nmz2QY8vu337/19PjEU2dqfJm6unrRlFcPbv1LoXTpM/TxgNBQQohe8+/pG5kc38lYh2GIydTcZzvaaXlTXX6RsaLSWFV9Ysy0+FFTLr39gUAm7fzVR0K5jBCiSs9Shv97RkadnScPDSKEVFy9KfX1lgcHWB9nGEadmWOnpqRYrz1Xhlvh2N7L780Z+PTI/Oycbz/47/sTJhfl5dX4spQrVz+c9vrlk6eCoyL6xA1RurkRQvT3nPKWyuUOTA3/YhiG4VhP0pxGSgghjz/7dJ9hj2/9cf2+37es/uiTVbu2WR+3WCx3X3P+aHx5SUnXfn3f+uxjQshfP/2SmZL60P+wzD1bAKDCxNjlf0KLyUQIaTV9ikfHGGOVShrge3dcxKTW6ItKFf9rNRiLRZ2RHThsACHEUF4h8fr3vs3VKWlmjdYes1wJIXzCE/F59thyMyeWSifPmRX33Lh1ny27fuHCr1+tmvXFYutT99bMw3/tNJvMk2bNeHz0M4SQguzcipJS1ExonGY0UmLQ68uKi2UK5cSZb8rkClVllbqqWqpQEEJKC4qsU1mNRqNOoyGE+Abe+YaXevUaIcRS+/IPMoWSEJKflWX9QzWZTA48JoB/PRXQwh6blfp5Ex6vKiVNHhLoFhN9tyOxDpMQQpThwdZ/1eYVWvQGRXgIIUTk5qotKGL+d7Fuxm/bCCEK+zQlXT18O7r52GPLzVxZcbFBp/MLChz3xlRCSH5WNiHkwZqpVWsIId4BAYQQrVqTc+sWIcRS+3XaqJlQh2Y0UnL6wKFfln8dHdveoDdoNeqAsFA3L09CiG9QYFFu3pyxz8tclEPHPNMqNpYQ8s+fWwtzc8uKitOTku/+/dSoRdtWfIHg6rkL702crFWp561c7hcS7NgjAyCEkMd8glekXbH5ZgUSiW//R3N37ucJ+K6to1S3M93atfLr9yghRJ2eTQi5O1Ji7VEU4cGEEN/e3VO+WX998Qrfvo8WHT9bdOw0IeTeEz02FOPqdXfmCtjQH6t/uHrufFRM27zMLEJIm84dCSFunh731czWHTtcjD/xw+LPW3fokJaUXFVRSQjJz8xq1SG2xs2iZkIdmtFIiYu7u39w8I2Ll3LS0rr07T176Z2L1t74aGFYdMvK8rLy4mKlm1tEm1avzH/Xy8/vyumzhEfmfLkkMCws7WZSbesF+QYGvvz+HC8/v/zMTMZiEUkljj0sgDuuVZX62OcilLZzXvMf1Kfw0ImkZd9XJN6QBwdaH1elZwmVCqmPl/VfrT2KtfMIGT0idNzI0vOJ1/77FWOx+PbvKfXzESrsMrfgMO4VbB+BYWFCkfjyydNatXrwM0899+br1sfvq5mDRz89bPwYPp+feOZMeHTUrC8WK1xdkhOu1rZZ1EyoA6/GM3/Xr39PSEVMzBgHJPjyVo6Qx+vmYZcLB5xSsd74Z17xz11a0w7CInv3Hj958tJ//zuTdpAabNkyevToCw7YUaVB/1ri0ZJmdiMYhUA4ObTNk4F2OXV1n7Nn5/v7R4eF9XXAvuqwODnTSyzq4KakG4ND8rT6f4rKv+sUTTuIo61Z8wfDkGnTHPFR3iAqVUF8/GdxcTsefIpjp28Yhvni7Tm1Pcvj1dxjEUKi27d76qUX7RntjkPbd9Z2qyqRSGw0Gmp8SuHi8ubHi+wcDZycm1gyNbzd4pS6GqDkVT/l7T744OMu0S2qU9JqfEu37z6z4WmX86+/r0qr4WRoHQH6bl8rkNV6Ef5w/wjHdCQcZdDrv3y31psguri7V1fUfAdHltdMD2/vqQves3M0oIB7TcnVs+dre9bd27uipKTGp8QSB40Q5mVk1pZQKpPrtJoan7LObgFoov4+QSfL8utY2jXi+WdCnxlewxO1L2cg8bHl/5ztP5rDGGua2Fh7AH7tw/syvmCAd5AN4zkfi9lSR830Dw0pyMqu8SmW10y/YPx3d04ca0r4fP6vp4/RTlGX59+ejmUKgSK9ua5rGcRursQOa6rWn9Tbli2Om0jiI8G6QXWRymWomcAhzWiiK0BzMDe6S7h9lnZlmwCJ/K3IDi4iMe0gAGAzaEoAnIpSKFrTaUB3Dz/aQezLTSRe0r5XZw9f2kEAwJbQlAA4oT5eAa5CZx5CaCF39ZVg8XIAZ8OxOSUAUB9D/MIEPN6m7NRsnYp2FhuTC4SRCrfP2/WiHQQAbA9NCYBzGuQbGqVw/yUr6URZfj1ezg2uQvGXsb1DZM1i0gxAM4TTNwBOK1zh+m50lxCZ0kPI+UUzxTy+r0T2akQ7dCQATgwjJQDOTCIQrO088ExZQb5O82t2UrWp5rslsBmfkB6eAdMiYswMEyzDMqYAzgxNCYDz6+HpTwgJl7tsyb0l5PHVZmOOVqUxmfSMmRBCeLz/LVzGsOFnMY8v5vO9JbJWSo80dWVnd5/nQ1uL+QL7/o4AgAXQlAA0F53cfTq5+5gsFiGff7Gi6Laqsoenf6jcZU9BRpa2eoR/RIhMyYaf9xVmGs3m/r7BLk59AREAPAhNCUDzIuTzCSFd3H27uN9Z5GO4f/jdZ9nw81C/sMYeHABwG/2Jrm4igYjPo52CS/iEhMg4P28RABrHQywU8VAzG4DHI4EyjLpxA/2mxFssytXqaafgkjy9QYw2DqC58hKJcnU13zsXapSnM8gEmJPEDfSbklZKWW13B4UaVRlNXdxxVSRAM9VaKTczFtopuERtMndyw3Vb3MCCpsRFESSX7Csoox2EG65XqtI1uuH+XrSDAAAdse5KN6HoUHE57SDccKm8ukhvHOTrQTsI1Av9poQQ8mpEYAuFdGdecY5Wb2EwbFKzIr3hckV1YpVmRWwU7SwAQNPbLYO9RKL9BWV5OPddu0Kd4Vx5VbZOv6RdC9pZoL7YcvXNpDD/Q0Xl2/JLivUGvZkVfYmFYRgLIxCwom8LlUvVZvMgH/cVHdCRAAB5tUXg3oKynQUllUaz2mSmHYcQQiwWhhCGz2dFzQySSQwWy2BfjxmRwbSzQAOwpSkhhAz09Rjo68EQojaz4g9s//4Tly8nvffey7SDEEKIkPClAkxuBYB/DfP3HObvaSFEw46auXXrP3l5xdOnT6AdhFjvS4ALAriIRU2JFY8QJTumSUssjNBoYkkYAIAa8VlTM8Vmi8hkZkkY4ChWjLMBAAAAoCmplUgkcseVtwAA9SORiF1dFbRTALehKamV0WisqKimnQIAgBv0ekNVlZp2CuA2NCW1kkhE3t7utFMAAHCDTCbx8MDoMjQJmpJa6fXGkpIK2ikAALhBq9WXl2N0GZoETUmtJBKRlxdGSgAA6kUul3p4uNJOAdyGpqRWer2xtBQjJQAA9aLR6MrLq2inAG5DU1IrPp8nFotopwAA4AahUICaCU2EpqRWFgtjMBhppwAA4AaTyYyaCU2EpgQAAABYAU1JrSQSMS4JBgCoJ7lchgUnoYnQlNRKrzfgkmAAgHrSaLRYcBKaCE0JAAAAsAKaklqJRELcxwEAoJ4kErGLi5x2CuA2NCW1MhpNuI8DAEA96fWG6moN7RTAbWhKAAAAgBXQlNSKz+dLJFgICACgXoRCIWomNBGaklpZLBa9HgsBAQDUi8lkQs2EJkJTAgAAtsHj8WhHAG5DUwIAALbBMAztCMBtaEoAAACAFdCU1AoTXQEA6k8gEEgkYtopgNuEtAOwFya6cohYLPLx8aSdAqBZM5vNer2Bdgr4l4uLwmLh2Ak1NCXgDAwGY3FxGe0UAAAsUl2t5twkH5y+qZVIJMIdLwEA6kkiEePWHNBEaEpqZTQaccdLAIB60usNuDUHNBGaEgAAAGAFNCW1kkhE3t7utFMAAHCDTCbx8MApb2gSNCW10uuNJSUVtFMAAHCDVqsvL8cpb2gSNCUAAADACmhKasXn88RiXDINAFAvAgEfNROaCE1JrSwWxmAw0U4BAMANZrMFNROaCE1JrTDRFQCg/mQyqYeHK+0UwG1oSmqFia4AAPWn1erKy6topwBuQ1NSK6FQoFTKaacAAOAGsVgkl0tppwBuQ1NSK5PJrFJpaKcAAOAGg8Go0ehopwBuQ1NSK4yUAADUn1gskssltFMAt6EpqRVGSgAA6s9gMGo0etopgNvQlNRKIhF7eeHqGwCAepHJpO7uuPoGmgRNSa30ekNpKa6+AQCoF61WV1GBq2+gSbD63v2mTJmfmJhMCOHxeAzD7N59lGGYoCC/nTu/oR0NAIB1Jk5898aNWzwez1o2N23aQwgJCvLdufNb2tGAezBScr8XXhjl7u569w+MEMLn8wcNepR2LgAANpowYbhSKefxeHfLJp/Pj4vrSzsXcBKakvv169etRYvgex8JCwscM2YovUQAAOw1bFjfsLDAex8JDw8aOzaOXiLgMDQlNZgwYYSbm8vdf+3Xr6u/vzfVRAAA7PXccyPkcpn1Zz6fN3RoL6w3D42DpqQG/ft3j4i40/iHhweNHTuMdiIAAPYaNqxPaKi/9eeIiOBnnx1COxFwFZqSmk2c+KR1sKRfv66+vl604wAAsNrEiU8qFDKBQDB4cM97R5oBGgRNSc369+8eEREUHOw3ejRmkwAAPMTQob1DQvxDQvyeeeZx2lmAwzh8SfAP6XmXKlUiHi/TPmsIml981mxhpqUXkPQCm2/cVSQQ8fltXeQvhPr5SMQ23z4AwH1Wp+UmVqoFfF6OfWqmadoEhmFeTMqyx8Y9xQI+j9/OVTElzN9NxOFPLqgbJ//TakymsedvPuHvNcDHw1ciYmjnaQQe4VUYjCUG4+uJqYvbRrTETXYAwG4qDcax528+E+gzyNfDh6s1k5QbTKUG44sXk76KjQrD7YidFPeaEjPDPHP2xnutQiV8bp978pOK/aTiGFfFp8lZMyKDOrrjLCwA2J7aZJ50Mek/rcMEPB7tLE3iLxX7S8Uxror/3Ehf0DosGt/lnBH3PteXp2Y/H+rH9Y7kXi+G+a/LtP0ZIgAAQsiXt3ImhvhzvSO51wuhft+n59NOAXbBvY/2YyWVwTKnuju2mM+vNplvqbS0gwCAs2EYJr60MkTuVDVTIRQW6Ay5WtyR2AlxrCnJ1erbuirETjRMYtVSKcvCHxgA2Fq6RtvVTUk7he21cpFlanS0U4DtcezT3UJIvt5AO4Xtqc0WvcVCOwUAOBszQwoNRtopbK/aZNEzXJywCw/BsaYEAAAAnBWaEgAAAGAFNCUAAADACmhKAAAAgBXQlAAAAAAroCkBAAAAVkBTAgAAAKyApgQAAABYAU0JAAAAsAKaEgAAAGAFNCUAAADACmhKAAAAgBXQlNiSqrLqwrF42ikAALgBNRPug6bEZkoLC6c/8fS2tT/TDgIAwAGomfAgNCX/unX9RlPebjIYjUYnvEU4AMCDTEZjenJKk7aAmgkPENIOYHfZt9PWfrokJz3DZDIFR4Q/Mem5RwY8Zn0q5crVP777MSM5RSQWRbWLuXzi1PNvTx8y5llCyJlDR3b9/GteRqZUqejUq+e416e6engQQl4ZNLxl+7Y+gQEX408YdPro2HaTZs/0DQwsLSqaPWYCISQr9dbER/sRQr7escXL15f20QMANEwdNTN+z949GzcX5uR6eHt5+vqmJyW9t2J5dPt2JpNp1y+/Hdv9d0VJiaePb5/hQ5+YNEEoFKJmQkM5f1Mid1EW5uWFRbfUa7UZySmrFnzo82NAi7ats2+nfTpjtlGvj2zbRq2qvnzilEAo6NS7JyFk3+Ytv361SiSRtGjbpry4JH7336lXr360do1MISeEXDlzTu7i0rVfn/ysrIRTZ4rz8j/ZsE4ikXbs2SPh1Bm5UhnboxshRCKR0j50AIAGq61mHt25+8dPl4hEoujYdjm305ITEoMjW4S3jGIYZuX8Dy7GH3fxcI+MiclNT9/6w7q8zMw3Plxo3SBqJtSf8zclXr6+3+75i8fjEUL2/r7lt69XnT18pEXb1mcOHDLq9cPGj5kw4w2z2bxwyrTMlNTi3HyJRLr5m++lcvnH69YEhIUwDLP6w/+e2n/w6K49w8aNtm7z47Xf+YUEE0L+M3lqelLy7es3WnWIff6t6QmnzngH+L/58Qe0DxoAoJFqq5nH9uwlhEz/5MPOfXpVlJS+9fSY/MxMHp9/Mf7ExfjjYdEtF363UiKTadTqhVOmnv7n0PAJ48OjW1q3iZoJ9eT8TYlBpzvw57YT+w+W5BUwxEIIKcrNJ4RYGIYQwuPxCSECgYAv4FvPkiaePWc0Gt19vI/s2GXdglatIYTcvnHz7ja9AvytP4S3jk5PSi7MzWvVIZbS8QEhhAgEfLkcX7MAbKC2mkksDCGEx+cTQkQSMcMwZpPZbDJfOn6SECKVy7f+8JN1C9Yxj7QbSXebEtRMKkQiEY9HO0QDOX9T8vX8hYmnznoH+Hcb0K+qvDzh5Gm9TksI6Td82IE/t/298feMlFStWpV+M9nV0yOqfczh7bsIIcV5+X9v2nzvdsQSyYMbF4slhBAz5mrRZjKZtVo97RQAzqC2mjl49NOp165/s/DDNp07Zd9OM5vM7R/pJpXLKkpLCSHJCYnJCYn3bkckFj+4cdRMRzIYDNYv3hzi5E1JZVl54qmznj4+n/+2XiKTJSdeSTh5mmEYQoh/aMiQsaN3/PRL6tVrfD6vdaeOE2a+oXBxkSsVhJAegwa8+fGiRuzRYrHY4TjgIRiG4XHuGwEA+1SUlNZWMx8dPPDwX7uSLidcv3BRJBb3HDJo4sw3CSHWmjl57uyBTz3ZiD2iZtoPwxA+x3oSZ29KjAY9IcTNy1MikxFCUq9cI4SYzRZCSEFO7s71GyJat1q45huRSHT3La07dyCEXDx+8vaNm5Ft2xBC0pNTAkNDrFuog1ShIISUFhQZdDqxVGo0Gu/dLAAA+xlqr5nH/96XdDnh8WefnjR75r1vad2x45mDR/Zv/rNb/z7WqxRTEq9Gd2j/0H2hZtobF7+tOXlT4u7l5eLhnp6U/MkbM4VC4bXzFwkhhVnZDMNUlJQwDKOqqtq9YWNhTi6P8KLax/QbMSwoPLzPsKHH9+778JXXQ1tGmUymvPSM8dNfvzvLtTZu7xqCzQAAIABJREFUnh6+QYFFuXlzxj4vc1EOHfNM/ydHOOpAAQBswMPHp7aaWVpYRAjJycj8Y/WaovwCuVzeuW+vjj0f7RM35MCf23IzMmc9Oz44IqKqvKIoL+/j9T9EtIque1+omfAgro3sNJBQJHr7808i27a5df1mYU7eS++903PIII1ak3M7LapdTLf+/cpLSrf+sO7E3v3H9+776YtlW39cRwh5ef7c0a++7BMYkHXrdml+YevOHcOiIuuzuzc+WhgW3bKyvKy8uFjp5mb/44M7uPiFAICFRLXXzD5xQ8JbRSddurzzl9/OHDh0eMeupbPfSzxzTiKTzV+94rGRI8RSadrNJJ1O22PQAIWLsj67Q820Ky4WRp71ZOF9rl//npCKmJgxNCLVJVurn3cj/Y2IQJtsLTnxiqevj09AACHkwrHjX723oFXHDv9ZvcImG2+QXQWlvb3chvl5On7XzmH//hMpKRnTp0+kHaQGW7aMHj36Au0UYANnz873948OC+tLO0gDpKo0n6VkvxIeYJOtXYw/0aZLJ7lCYbFYNq78dt/vW0ZNeeHZV6bYZOMNsjWvJM7f8zFvd8fvmkM2bNgpFgvHjo2jHeR+KlVBfPxncXE7HnzKyU/f1OH2jZufzXzHqNd7+vgIxaKS/AJCSGyP7rRzQWPo9cayskraKQCc2T9btv6yfIVAKPDy9TMaDOUlJTwer333rrRzQa2qqlQyGcfWSmi+TUlAaPCIiePPHz5amJMrkkgiY9oMGPVEn7hhtHNBY3BxlBKAW2If6f7YyBFXzpwvLSpUuLp26dt72PgxWG6EzbhYGJtvUyJXujzz8uRnXp5MOwjYABf/9gC4xT805KX35tBOAQ1gsTB8PscKo5NPdIVmAk0JAMADGEI4VhjRlIAzQFMCAHAfhiGcq4toSsAZSCRib28P2ikAAFjE1VWpVMppp2gYNCXgDFQqTWVlNe0UAAAsUlpaodcbaKdoGDQl4AwsFgufc/d4AACwJy4WRo7FBaiR2WwRCPA/MwDAv0wms1AooJ2iYVDHwRmYzWaBgGN/ewAAdsXFwoimBJwBF78QAADYFReHkJvv4mngTORyqUwmoZ0CAIBFPD3d5HIZ7RQNw7EeimEYb5GIdgrbk/P5Aq5dTc4qxcXlRqOJdgoA1rEwxEvshF8+FQI+hkYfKiengHaEBuNYUxIkk9yoVtNOYXvZOr2/REw7BYfp9QYJfoEADwiWSZKrtbRT2F62Vu8vxZ/8Q+h0BinXfksca0oEPF5Xd2Wp3kg7iI1JeLxQnH1oAjQlADVSCAXRSlmV040jSvm8MNTMh+FiYeRYU0IImRjqty2/hHYKW9qdX/Kol5u72AlPSzmMwWAU4xcIUJPxIb7b8pyqZm7LKx7i6ynh2nUljsfFwsi9pqSNi+LNiKA16XnVRjPtLE2lN1u255W0cpGPC/alnYXbuDhKCeAYnd1dJoX6rc3I15hqqJkMw9AI1Uh6s2VLblF3d9cnA71pZ+EALo6UcHICVCcP5VtRwRuzi1LU2hhXeYm+5pFJo9HI4/GEQkcfY92L6FVXq11cFDI+L19vcBEKnwzwGuHv5diATsjLy02h4NgtHgAcpqeXm5jP/yOnKE2ja6WUlf3vG53JbDabzI7/3NLr9RJJzSdfzBazyVhDJIWQn6PRe0tET/p7Pe7n6ZCYnBcS4s+5yxI52ZQQQmLdlLFuygqjKUujr7HPP3r0XH5+8fjxwx2ZqrS0/Isv1kml4t69Ow8e3KvG16xbt7VPny4tW4b5ScQ+EpGAc/dwZKXU1MzHH6/5Fw4AhJCuHi5dPVzKDMYcrZ4hPEKIwWBYsODrL76Y4+Aky5evLygoGTVqQM+enWt8QULCzdNHTr/22vj/9yiP8ReLvVEzGyIhIYlzlwRztSmxchcJ3d1qOITKyurhMS2iRj7m6EBuim9LyjIycrefv3Luj32TJj05YkT/+17y9duTTp263MFN6ehsTk2t1ioUHPvbA3A8T7HIUyyy3sMyLb9o//cfOD6DW3nFtQtXNyenFw+4unDh6w++oEO/rk91aq3RaPz9fRwfz5lwsTByb07JQ+l0+vz84qioMCp7d3NTWiwWvd54+3bWkiU/TZr0/okTl+57Tc+enQghc+cupZLQKXHxbw+AlvPnr6alZcfGtqKyd4uFYRhGpVLv3n1s3LjZ16/fevA1rq5Ko9F07VoqjYBOwmKx6PUGmUxKO0jDOFtTYjKZRo2a0bp1C1oBAgN9eTweIYTH46nVmhs3bn344TfTp//3wVeOHDlg06Y9NDI6IbVagzklAPWhVmvXrt1GqyMhhPD5fGuRtFgsqamZc+cu/fXXXQ++LCQk4Nq1W6tWbaSR0Rlw9Ksat0/fPOjWrexdu1ZRDNCiRfB9j2g0upUrFzz4yl69OrduXVFQUKxUKpRKfKA2iUqlwe8Q4KGSk9Pd3JTffbeIYoZ7b1PF4/EKC0vXrPnjzJnEVavur5Pjxg3T6w1cvK6VDThaFZ1qpOTkyct+fp4iquvQBwf73zuxKCDA++TJ32p7sZeXu5+f9/Dhr6pUTrhMrcNotbrgYH/ckA+gbvv3n8jKyqc+UUP8/5e9ZxhGLpc92JFYSSTiI0fOnT6d4Kh0zkOl0oSGBtBO0WDO05R8/PHq4uIyDw83ujF8fDyszalQKPTwcJ08+em6X8/j8Y4d+yUhIclRAZ1QWVklbnwD8FBXrqQMHtyTdgpiMlmsP1gsFolEfPHin/v2ranj9UOG9Lp8+ebevccdFdBJlJVVEMK9K5WcpCkpKCh+4YVRo0YNpB2EdOzYRijk+/h4njmz6cCBtT///FdOTuFD39W7dxdCyPbtBx2S0dmUl1d5eLjSTgHAdnPmTKEdgRBCLBYzwzBeXu6XLm319/fOyMh96Ftef338sGF9HJLOeZSVVXl6cq8wOkNTotFoTSYze8apgoL89u793vrz0qVzP//8x3q+MTk5/erVFHtGc05lZZWenpRHyABYa9euI198sZZ2in8tX/7epUtb9+//gRAyffqE+n8Z+/33v5OS0u2cznlwtDByvinR6fSDB78cHOxPO8i/vvvu30v/W7YMi4oK+e23GuaWP+i9914RCgWYX9JQ5eWV1E/bAbBTRkaun5/33Lkv0Q5Ss/79ux86dCY/v7g+Lx43Lu6ff07iOuF6QlNCx9mzV+4OS7DTzJmTvv/+D7W6XncPb9MmUiQSvfRSzXO+oEYGgyk8PJB2CgDWMRiMXl7u3bu3px2kLlOmPF3/wZIZMya2a9fSzomchEDACwjg3upznG9K+vXr5urK9tVRZ8+evGzZT/V8sUQinj59IuaX1F9qagYWKQG4z40bt196aYGLi4J2kIcYNqzPxo0NW7Hpo4++TUvLtlsiJ3H1aioXh5A53JRcuHBtxozFtFPUy8iRA5KS0uoz49WqY8fW1ju5FBQ41Q3H7SQ/vyQgALcMBfiXVqtLScnYsOFz2kEeTiaTDh/e7+jRc/V/y8KFr//++96cnAJ75uI8jhZGDjclmzfvXb58Lu0U9TVmzLCfftpW/9dbV+KbNm1RcXGZPXM5g7y8osBAX9opAFhELBax4WrEeoqNjT58+GyD3jJv3lRWTSVkofz8ooAA7hVGDjclS5bMEQo5syLtqFEDCwtLq6pUDXrXjh3fXLmSrNPp7ZbLGRQUlPj7c+8LAYCdTJ26iFtLH/Xr17WsrLIRb5w48d2CgnpNkm1urKt2cXFJSU42JRkZuRs27KSdosECA30OHDjd0HcNHPioXm/4+ecd9gnFeeXlldHRYVKphHYQAFY4duz81Kmju3SJoR2kAZRKRXZ2QSNOx/z66+cnTyZoNPW6jKBZKSwsjY2Npp2iMTjZlLz99mf9+nWjnaLBhg7tc+zY+Ua80c3NpbKyChfC1SgtLQf3xQC4q1+/bl27tqOdosFat45o3BokzzwzuKys0mQy2yEUh926lcX+Oc414l5TotFoN2z4jD1LpdVf585tT59OsFgsjXjvjBnPK5Xyxo1wOrfU1MyoqDDaKQBYYezYWbQjNFJISEBeXlHj3hsc7D9ixGuYfnevlJSM6Ohw2ikag3tNiUqlUSo52QASQtq2jbxx43bj3hseHiSXSydMmGPrUNyWkpIZHY2mBIB88cXaGTOep52ikVxc5BUV1Y1++759a3JyCjH97q5bt7KiokJpp2gMjjUlmzbt+eUX7s0muat37y5paTmNfrtUKvnPf17HjanulZqa0bIlJ78QANjW3Lkv9erViXaKRvLx8RAImvR51KlTm6ys/PJyDCcTjJQ4zsWLN15++RnaKRrPYDCUlJQ3ZQutW0cMHNijoqI6Ozvfdrk4TCqVtmyJkRJo7o4ePcfpcYLSUhvc6zs6Onz06FlGo9FGobiqqKi0Y8fWmFPiCEuXznF3595tD++SyaRara6JGxGLRe7uLtOnL8bSateupRoMBpGIM1eGA9jDzp1Hjh07z+lr0Gx1r++DB9fevJlmMDTrvuTSpRvcXeSaS03JxYvX63nfJtby8HC1VeH466+VuKXwxYs3unRpSzsFAGUCAX/OnCm0UzSJQCCw1WpDsbGtrl5NafS0WSdw4cL1rl25dE34vbjUlMyc+am7uwvtFE1y/fotG962cfDgnoSQDRua7xImFy9e59Z6DAD2MHx4P7lcRjtFk+zff8KGd9rr0iVm2rQPmu14yaVLNzp35uq3Nc40JQUFJfPmvSKTSWkHaZLMzLywMBvfz9ZsZg4davCabM4BTQnA2rVbT5y4SDtFkxQUFJvNlqAgPxtuc9eub7Oy8lUqjQ23yQklJeVqtdbmHzQOw5mmxN/fOy6uH+0UNmDzy7RefHGUn19zXGT9ypWULl1iOH0eHaDptmzZ36pVBO0UTXLp0o24uD4232xUVGhaWnZiYrLNt8xmFy5c7d+fe4uL3sWZpuTixetJSWm0UzTJsWPnXVwUrq5Km2/ZOuw5ffonNt8ymx0+fKZbN+6tXAlgQzqdfuXK+T4+nrSDNMnatduGD7fLd87Y2FZff/1LsxovOXDgzKOPdqSdovE405Ts2nU0NTWTdoom+fvv+Li4vvbb/ty5U+bN+9J+22ebo0fPc/oLAUDTSaUSrl8Sf/p0QkCAT3h4kJ22v27dJ8XFZc1nOexjx87379+ddorG40xT0r17+5iYKNopGq+6Wl1QUDJo0KP220VISMB//vOa9QSt/fbCErdvZ4vFwpAQ7t1tAMCGDhw4tX79dtopmmT37qP2Xn0qIiK4uLhsz55jdt0LGxw7dr5v3660UzQJZ5qSuLi+LVqE0E7ReF9++cvTTw+y916sE4FXr96cnNyYW1txyMmTl0aM6E87BQBl+fnFlZUq2ikab+/e4zwer2PHNvbeUatWEWfPXnH69V7Pnr0ybFhv2imahDNNSXz8Be6evsnKyr98+cbIkQMds7sPP3zT6b8TbN689/HHe9FOAUDZqFEDX3xxFO0UjffJJ9/Pnz/NMfv66KPper0xPb3xN/pgOaPRuG3bwcGDuV0YOdOU3Lx5+8iRc7RTNNK3326aN2+qI/c4a9aLhBBnbU3Onr0SFhZoq6WWALjL1VXp5sbV1ZtWrNgwZ84URy704O/vzTDM8uU/O2yPjrRz55Enn3yMdoqm4kxT0rdv18jIYNopGuOPP/Z5eLh269be8bs2GIzbtx90/H7t7cCBU6NGOWjYCYDNzp27+tVXv9BO0RhHjpzNyioYOXKAg/fbokWIn59nVRWHz3nV5vDhsyNHoilxlDZtIgcOtOMsUTspLi5bt27bu+++TGXvTz01KDDQh8qu7aeqSnXo0BmcuwEghHh6up0+nUA7RYPpdPoFC1YsXTqHyt4nTHhCLBYdOHCKyt7t5Pr1W9XV6pgYm62KSwtnmhJCyDffbOTc7R/ffPO/33//AcUAjzzSgRDy6qsfUsxgW7/9tnvChBG0UwCwQlRU6LJlc2mnaLBp0z746Sea6ypJpZLWrSPefvszihlsy2kKI5eakszMvPh4Lq2mPH/+15MnP8WG5X7feefFzZv30k5hG07ztwdgE8HB/rQjNMy8eV8999zw6OhwujFCQgJefXUs3Qy2UlJSfvHi9SFDuH3djRWXmpJp08YqlZy5HfOff+4PDfUfOtT2ayc3QlRUmPXcbUFBCe0sTbJz5+GxY4dx/RZIADb055/7f/6ZM3fl3Lhxd8uWYSz5+LQuz798+c86nZ52libZuvWfV155lnYK2+BSUxIZGfLII7G0U9TL4cNnz569Mm0ai9pw6z1iZs78lNN39P7661+ff/4J2ikAWKR799i//uLGfPYdOw7fupU1efJTtIP8PzNmTBg1ajrtFI2nUqk3bvz72WeH0A5iG1xqSgghGzbsvHYtlXaKh7hyJXnDhp1LltCZw1W3zZuXxcdfsFgstIM0xubNe4cM6eXu7ko7CACLhIYGbNq0lP1/1EePnouPv7Bw4eu0g9xPKBTu27eGEJKdnU87S2OsXr35tddY9AW4iTjWlLRqFfHNNxtpp6hLXl7hzz//RXcOV93GjYuzWCy//babdpAG27Jl/5tvPkc7BQAbabWsPgGRmJh07Nh5ls/JPXDgNOcuZSorq7x8+ea4cXG0g9gMx5qS7t3bv/POFIOBpdfg5OcXv/zywmXL3qUd5CGEQmFhYcmJE1yaNfzdd5uHDOkll8toBwFgHalUMnjwS3q9gXaQml2+fHPFit8WLXqDdpCHmDLl6ePHuVQVCSHLlq2fNOlJ2ilsiWNNiXVmiVgssv7cr9+kefO+op3ojvT0nI8++vbvv7+nHaReZs16kUPTRVUqzeXLN195ZTTtIAAs9e67L91deKNXLxYNKO7Zc2zv3vi1az+mHaRe5s59yXprLdpB6iU5Od1isbDkcgpb4V5TQgiZPfvz/v1f6NLlWZVKI5GIaMch1suVX3/949WrF9EO0gBdusRYp77efeSxxyaPHTuLaqiaLV68xgG3MwTgrpEjB44Y0b9v3+e7dHlWpzOsW7eNdiJCCDl16vL27QfnzXPQ3W1sRSgUfvvtprv/2qvXc2+++V+qiWr2ySffT5zobBP/hbQDNFifPhM1Gh2Px+PxeBaLxWQy005Ebty4vWDB13v3cmOM5D6TJz8VH3+hb9+uw4ZNra5WCYX8U6cSevbsSDvXvy5fvmGxWFhyDSEAC40a9UZJSYVOZyCE8Hg8hmHMZvqF8fDhs9u3H/zxR26MkdzrkUdi7y7UOXjwS3q9MTMz//bt7MhIFt2pfs+eY507t42JiaIdxMa41JTExb2an18kEAh4PN7dBwUCyoM9585dXbny123bVtCN0WgdO7ZWqzVDh04tKSknhJSWVuzadYRVTcmiRd+sXr2QdgoAloqLm1ZUVHbvIwzDCIWUC+PWrQeuX7+1cuV8ujEarXfvLoSQ3r0nWFu9wsKS/ftPvP76eNq57rBYLB988M3583/QDmJ7XDp98/ff33Xo0OruhBIroVBALxHZv//k1q3/bNjwOcUMTTdx4rvWjoQQwufzr1+/lZ/PljXWNm3aM3bssKAgP9pBAFjq009nRUQE8/n/FnMejyeRSChGWr9+e3Jy2sKFr1HM0HRPPTXd2pFYm4B//jlJO9G/Vq3a+Nlnb9NOYRdcakoIIevXfxoX19fFRWH9Vz6fLxJRm1Pyxx/7jh079/nns2kFsImnnpqenV1w7yOFhaUHD7Lizy8vr2jjxj1YVB6gDh06tNq0aUnv3p3vXfCaYlPy5Zc/q9Vazs0juc/YsbPuK4wVFdVHj56jl+hfZ89eSUpK4+IdauuDY00JIWTBgldfe22sv7+3dZSSVow1a7akp+csXsz5XtVoNMlkEoZh7v4yzWbzvn0naOcihJBFi1Z9+OGbtFMAsJ1QKFy+/N3x4+N8fDythdFioTOnZNGiVT4+nm+8waLLfxpHrzdaC+PdR6qqVDt2HKYa6g7nLozca0oIIWPGDFuyZE5IiL91uqvjA3zxxVqGsbz77suO37XN7d69evHit0ePHhoZGeLv7y0UChiGKSoqO3fuCt1gO3YcDgnx79y5Ld0YAFwxbdrYefNeCQ72I4TweBRq+1tvfdqtWzvnuB7kr79WLlr0xpNPDoiICPb19eLxeHw+PyUlMysrj26wVat+Gzt2mLX7dEq8Ggcbrl//npCKmJgxjozye05RskpTbmxAg38rNdPVVenr52XPXPdTqzRanb5NgE+EXPpssI+Yz4HG7s/copvV2mqjqe7FlfQ6vVqjrapUa7U6V1dFEMW7jzLM7dvZkVGhNtxkgFSk4As6uSt7ebnZcLMPtWXL6NGjLzhyj2AnZ8/O9/ePDgvr67A9luqNuwtK83SGwnqvGGkymm7dymrdpoWdo92vtKRcJhKGenu0cZE/E+Tj4L03gsZs3pZbkqnRFRtNdb9Sq9Gp1drqapVOZ2jTNtJRAWtgNBjzC0pCQwNsuM1AidhNJHjU0zXWTWnDzdZNpSqIj/8sLq6GG0my4uqb2yrta4mp/b3dAqWSaGUDJq4+0r2dPXPVrErAcw3x05os6RrtqDPXl7Vr0cZV4fgY9ZSr1U27nNrby81PImqpkDIPGVhyufNPhiE0hqDu9YhnjG03yCMkX2c4XFxxqLhiYesw224cwObOlFWuvJ3b1lURLJOEKxqw1GEvXw975qqZSiRwUcgrTaZUlWbk6Ws/dI72lYgdH6OerlSq5t9If8TDNVAqaaF82DrRHi4PeYEj+dl+jCRHo9+UU3SpQvViGL0vov9Dvym5WaVZcTvnozbhtIPUm/xOdWjrKh/o6/lNet5LYf6d3Nn0f+3/ZGp0nyRlzo0O4cRwjgOEy6WEkGPFFZ8mZ73fypbDMAC2dbS44q/8kumRwbSD1Jv837aph6fb/OvpH7QJD5LRvAioNufKqjZkF85vhW8md0TIpYSQPQWl6zMLqPcllD+rzAzz5e3sMcG+dGM0xbhg369v5Zgs1Kbc1mFpavYzQdw4weRI/XzcGcLszi+lHQSgZhUG44bswnGcLYxSAX9UoPfS1GzaQWpgtFhWpuVOCMEqA/cb7u91s1p9sbyabgzKH1eJlSoRjy+jvQBaUwh5PA+x6Gx5Fe0g90tRaTRmi5uI/mAYC7VQyA4Ul9NOAVCzE2VVAVL2nvuoDy+xqMxoSldraQe538nSKn+O/27tJ0wuPUS7MFLuBrK1hlA5G8f3GiRULs3Q6GinuF+mRhch58wt9xwsSCrWWSy0UwDULE+rD5ZyvjC2VErT1KwrjNk6XTArTyqxQZBUUv6wab/2RrkpqTKZzGw879FgFbT/Qz5IZbKYiFP8cu1AxOens69cAliVGIy0J5rbgMHCqFhwC577lBtYV6vZQ8jn5ej0dDNw+LwJAAAAOBM0JQAAAMAKaEoAAACAFdCUAAAAACugKQEAAABWQFMCAAAArICmBAAAAFgBTQkAAACwApoSAAAAYAU0JQAAAMAKaEoAAACAFZpvU2LQ6/dt3vLL8hW0g8AdFovl2vkLu3/bRDsIQPNVUlCw9cef/vlzO+0gcIdGrT5z6Mipfw7SDuIgnGxKUhKv5mdmN3Ej1ZWVv3616uq5czYKBU2lVas/mzF7/+9/0g4CwD1Go/Hs4SMGfVPvppZ0+cr2teuzb9+2US5oqhsXLq1a8EHCqdO0gzgI95qSn5Ys/+jVN3MzMmgHAQBgi/nPv7Ry/gdGg4F2EIAm4V5TolWraUcAAGAXFEZwDkLaARrmh8VfnNp/kBDy1XsLCCH9nhj+yry5hJCEU6e3/vhTzq00sUzavnu38dNf8/L1tb4lNyPz92++u3kpwWIxR7ZtM3rqy9Ed2j+45RsXL//+zXc56elypbJdty5T5s4WS6UOPz6uMplMu3757djuvytKSjx9fPsMH/rEpAlCoZAQ8sqg4S3bt/UJDLgYf8Kg00fHtps0e6ZvYKD1jRpV9ebVP5w7ckyn0YZFRdI+DgBOmvXs+PKSEkLItMdHEEJeXTi/97DH6y6MdTx1r4Nb/9r7+x+lRcVevj59R8SNfGGiww+Ow2orjBkpqQteeHnouNH5WVmpV66LpZKu/XqPe/01qVxmfWNuRuamVatvXrosEAh9AgNoH4dDcWykJLJtGy9/f0JIdIf2PQY9Ftm2DSHkwrH4Ze+8n5lyq2VsO1cP9zMHD3/86nSNqpoQUpyX/+HUNy6fOOUXHBTWMurmpYTF09+6fePmfZvVqKqXzXkv7WZSm84dA8NCM5JS0JHUH8MwK+d/sPWHdXqdLjImRqNWbf1h3fcfL777gitnzp0+cDi2xyNBLcITTp1ZNvs9k8lkPQv+6YzZh7btMOoNoZEt8jKzqB4HAFd16tVTJJEQQrr269tj0GM+gf51F8Y6nrrXtfMX1i/9srKsrOOjPaRyeWlhIaXj46SHFsZ9v28pzMl9ZGB/iVR6cOuO31assj5ekJP74dQ3Ek6elsrlAaEhOWlp9A6CAo6NlAwY9URSQsKpgoK48WO79utjffC3Fd8yDPPGBwt6DBpgNpuXvfPelTPnDm3b9cSk57atXa+prh7w1JNT5s4mhOz4ecOW737c+sO6uV8uuXezRXkFeq3ONzBwzrIvCCE6jZbS8XHSxfgTF+OPh0W3XPjdSolMplGrF06ZevqfQ8MnjA+Pbml9zcdrv/MLCSaE/Gfy1PSk5NvXb7TqEHt0x+70m8nBkS3mrVzu6uFRWlg4c9QY2kcDwD3Pvz393OGj5Xr9K/PnKlxcrA/WURjreOrezWbfTieEdH+s/9QF76EwNlQdhdH6Ar+QoE/W/yiRyaoqKmY++ezxv/e9OGeWQCD4Y/UaTXV1r6GDX5n3rlAkOv73/ntbGafHsZGSBxVm5xTn5bu6uz0y8DFCiEAg6BM3lBCSlJhACLl27gIh5PFnn7a+uN/wOEJIUkLifRtuKUvyAAAKN0lEQVQJCg/zDQwsystbMmtucuKVu2NoUB+Xjp8khEjl8q0//LRxxbd/rf1ZIpESQtJuJN19jVeAv/WH8NbRhJDC3DxCSMLp04SQuOfGunp4WLdA7yAAnEodhbHumnmv9o90FQgFJ/b9s37p8sLcPBTGBnloYXT18JDIZIQQV3d370B/k9FUXlzMMEzi6TOEkNHTXhaKRIQQmaJ5FUaOjZQ8qKqykhDi6uXF4/Gsj7i4uxFC1JXVhBBVZSUhxN3L685THu6EEINOf98cdZFY/P7K5T9+ujTx9NnE02e79O39xkcLxRIJjQPinorSUkJIckJi8v/v9kRi8YMvFoslhBCz0UgIqSgpI4T4BQU6MCxAs1BHYay7Zt4rOCJi7vIlPy1dfnDrjsN/7X76pRdHTZ7k8EPhqgYVRtGdwmjSadR6rY4vEHj7+zswLItwtSlhGIv1B1c3N0JIVXn53afKi0sJIUp3N0KI0s2tvKSksrxc6eZKCKkoKSGESOWyB/+38AkMeH/lspuXE77/+NOL8ScObdsxbDxOJdSLXKkghEyeO3vgU0826I3uXp6Z//vvBQBNx1gY6w91FMa6a+Z9Yrp1+XzjL8f/3rd+6Vd/rlnbtV+f4BYRDjkUzqujMGakpNb2LplCKZZKDDp9ZVm5m6eH/WOyDvdO38gUCkJIXma2daakb3CQl69vVVn5xfgT1kcO79hJCInp0pkQ0rZrJ0LIkR27rO/d/8c2QkjbLp3vbs1oMFp/sJ5QaNOp4+OjnyaE5Gc3dXG25qN1x46EkP2b/7xb5lISr9bnjWEtowghezZuUlVWEUL0Op2dkwI4rTuFMSvroYWx7pppZfpfYSzIyRUIBP2fGN6+e1dCSHF+Ab1D5JgmFsY/1/xovSDA0MwKI/dGSlq2a3do246tP6y7cCzeoDd8/tv60a++8t1Hn6xcsCiqXbuSgoKS/AK/4KD+T44ghIx8YdKFY8f3/b4l6XIij8dLT0oWisVPvzyZECKVyQghJfkFOWnpgeFhn82YJRKJgyLCkxISCCFtO3eifaCc0SduyIE/t+VmZM56dnxwRERVeUVRXt7H63+IaBVd9xuHjh9zYOuO9JvJbz09JiAstDA711GRAZxNy9h2eZmZS2e96xcSFBIZOXX+u7UVRh6PV0fNtE4cSUpI1Gm0FWVlc8dOjGzX1tXd/cqZs0KxOCw6ivaBckajC+PTL0/+fOY7R3bsvnj8pJefb/at5rW6LvdGSnoOGfT46GfkSkXOrTSlqyshpPewx9/8+IOg8PBb165pVOqeQwbN//Zr6+SgwPDQBd+uaNetS35WVm5GRtsunRZ8+7X1khCFi0u3/v2Ubq63b9zUa3VtOneqLC+7fPKkwtVl0qwZPQYNoH2gnCGRyeavXvHYyBFiqTTtZpJOp+0xaIDCRfnQN7q6u8/75su2XTubzZayouLOfXo6JC+AExrz6isde/Ywm035mVlunu51F8Y6nmrdqaNfSBAhpCAnx2wyxXTrkpmSeu38hfDo6HeWfurp40P7QDmj0YWxffdub368KKhFhKa6WlOt6vBoD4fkZQsewzAPPnr9+veEVMTE2H1SxYbswmyNfoCPu713ZFeny6rEfN4bLYJoB/l/tueVJFaphvp60g7CUgtvZvzTK9YBO9qyZfTo0RccsCOwt7Nn5/v7R4eF9bX3jhYnZ3qJRR3cHv4BxmZ/F5Z2cXcZGeBNO8j/s+J2DkPIIx6utIOwUanB+Htu0YYubey9I5WqID7+s7i4HQ8+xb3TN2AnX7+/UKfV1PiUT0BAcX5+jU9JZfKZn35k2yQpV69tX7u+xqfMZrPZZKrtwqjZSz61XkQHANB0DMN88fac2p7l8Wr+Vk8IiW7f7qmXXrRtmEPbd144Fl/jUyKR2Gis+bZHMoVixicf2jaJXaEpgTuuX7ioUalqfKpVx9jkhCs1PiVT2P7LXFVZ+dWz52t8imGYu9cx1viszcMAQHNWWy0ihLh7e1uv6HyQPVaUyMvIrC2MVCav7SulS00XVbEZmhK4Y82BPbQj3NG1X59fTx+jnQIAmjsej8eeWvT829Off3s67RR2x72JrgAAAOCU0JQAAAAAK6ApAQAAAFZAUwIAAACsgKYEAAAAWAFNCQAAALACmhIAAABgBTQlAAAAwApoSgAAAIAVKDclfIbwa100nDP41H+PNeHx2JiKPWR8PpalB3YS8Z2gLhIhj8cjrDsOdqZiCT6PJ+bR7gro7t5TLKoymuhmaLoKo8lHIqad4n6eImEl93+3dlJpNEkF/DpuowNAkbtQUGk0007RVGUGk7eYdXcycY4PHTupMJhchAK6GSg3JeFyicZsoZuh6VQmUwuFlHaK+4XLJVru/27tpFhvaOcqp50CoGZRSpnKxPkPTo3JEi5nXWFsoZCqzZxv+OykRG9s50K5MFJuStq4KmQCflJ1zbc35IQ0tdbIkM7uLrSD3C9ULguSSi5XVNMOwkb/FJVPCPGjnQKgZo/5eGRq9cX6mm9GzwnnyqqilNJAme1vlttE3T1cq4zmbI2OdhA22ldUNjHUn24G+rMOPmkbcb6i+nqVmnaQxkiu1pwsrfyiXQvaQWo2v3XYbbXuUjn6kv/np8yCGZFBLRQy2kEAarUiNmpPQWmOVk87SGOcL6su0BvmRofSDlKzpe1bHC6puKXS0g7CIlqzeW1G/ooOUVIB5a6A/gk/Po/3TYeWi25mnC6rdBeJ5LRPaNWT0czk6/UBUvGKDi0FLJ6asLR95KfJWeszC+QCgZuIb27GM7wUQkG6Sivi8yaG+HX1cKUdB6AuriLh0vZRi26mlxhM4XIpJ6ZkC3mkwmDSW5hwuXRxDEu/qhFCZALB17FR82+k7y8qC5FJhc4wq7jxFAJ+qkqrEPDnRIdEsuCrGv2mxOrDNuHZGt1tta6MI1OQPMXCFnJpKPvOmD7o/VahORpdukZfYjByorTZiUzAj/P1bOsqZ3MTCXCXUihY1j4qQ61L0+gquFAYBTziJRZFyCVBMrYXRjGfv6RdZJZGl8adDx07cREKRgZ4t1LKWDLxny1NCSEkRC4N4cJnPBcFy6XB+N0CcFC4QhrOvnn0ziGUI18smxX6c0oAAAAA0JQAAAAAW6ApAQAAAFZAUwIAAACsgKYEAAAAWAFNCQAAALACmhIAAABgBTQlAAAAwApoSgAAAIAV0JQAAAAAK6ApAQAAAFZAUwIAAACsgKYEAAAAWAFNCQAAALACmhIAAABgBTQlAAAAwApoSgAAAIAV0JQAAAAAK6ApAQAAAFZAUwIAAACsgKYEAAAAWAFNCQAAALCCsLYnCgoSjEaNY8MAALBaVtbx8vI02ikAuK2O7qLmpsTfv6dY7GLPSADNRceOrWhHANsICxtRXZ1OOwWAM/Dx6Vvj4zyGYRweBgAAAOB+mFMCAAAArICmBAAAAFgBTQkAAACwApoSAAAAYAU0JQAAAMAKaEoAAACAFf4P9ZPIILnCkSQAAAAASUVORK5CYII=",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "# Create graph\n",
+ "web_research_graph = StateGraph(ResearchState)\n",
+ "\n",
+ "# Add nodes\n",
+ "web_research_graph.add_node(\"Searcher\", search_node)\n",
+ "web_research_graph.add_node(\"WebScraper\", web_scraping_node)\n",
+ "web_research_graph.add_node(\"Supervisor\", supervisor_agent)\n",
+ "\n",
+ "# Add edges\n",
+ "web_research_graph.add_edge(\"Searcher\", \"Supervisor\")\n",
+ "web_research_graph.add_edge(\"WebScraper\", \"Supervisor\")\n",
+ "\n",
+ "# Define conditional edges: move to next node based on Supervisor's decision\n",
+ "web_research_graph.add_conditional_edges(\n",
+ " \"Supervisor\",\n",
+ " get_next_node,\n",
+ " {\"Searcher\": \"Searcher\", \"WebScraper\": \"WebScraper\", \"FINISH\": END},\n",
+ ")\n",
+ "\n",
+ "# Set entry point\n",
+ "web_research_graph.set_entry_point(\"Supervisor\")\n",
+ "\n",
+ "# Compile graph\n",
+ "web_research_app = web_research_graph.compile(checkpointer=MemorySaver())\n",
+ "\n",
+ "# Visualize graph\n",
+ "visualize_graph(web_research_app, xray=True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "vLKMtsgtLMXh"
+ },
+ "source": [
+ "Let's run the `web_research_app`:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 14,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import uuid\n",
+ "from typing import Dict, Any\n",
+ "from langchain_core.runnables import RunnableConfig\n",
+ "from langchain_core.messages import HumanMessage\n",
+ "from langchain_core.messages import BaseMessage\n",
+ "\n",
+ "\n",
+ "def random_uuid() -> str:\n",
+ " \"\"\"Generate a random UUID string.\"\"\"\n",
+ " return str(uuid.uuid4())\n",
+ "\n",
+ "\n",
+ "def pretty_print_message(message: BaseMessage) -> None:\n",
+ " \"\"\"Print message with role-based formatting.\"\"\"\n",
+ " role = \"user\" if isinstance(message, HumanMessage) else \"assistant\"\n",
+ " print(f\"\\n{role}: {message.content}\")\n",
+ "\n",
+ "\n",
+ "def format_namespace(namespace: list) -> str:\n",
+ " \"\"\"Format namespace for display.\"\"\"\n",
+ " return namespace[-1].split(\":\")[0] if namespace else \"root graph\"\n",
+ "\n",
+ "\n",
+ "def display_node_output(node_name: str, namespace: list, chunk: Dict[str, Any]) -> None:\n",
+ " \"\"\"Display formatted node output.\"\"\"\n",
+ " print(\"\\n\" + \"=\" * 50)\n",
+ " formatted_namespace = format_namespace(namespace)\n",
+ "\n",
+ " if formatted_namespace == \"root graph\":\n",
+ " print(f\"🔄 Node: \\033[1;36m{node_name}\\033[0m 🔄\")\n",
+ " else:\n",
+ " print(\n",
+ " f\"🔄 Node: \\033[1;36m{node_name}\\033[0m in [\\033[1;33m{formatted_namespace}\\033[0m] 🔄\"\n",
+ " )\n",
+ " print(\"- \" * 25)\n",
+ "\n",
+ " for k, v in chunk.items():\n",
+ " if isinstance(v, BaseMessage):\n",
+ " pretty_print_message(v)\n",
+ " elif isinstance(v, list):\n",
+ " for item in v:\n",
+ " if isinstance(item, BaseMessage):\n",
+ " pretty_print_message(item)\n",
+ " else:\n",
+ " print(item)\n",
+ " elif isinstance(v, dict):\n",
+ " for key, value in v.items():\n",
+ " print(f\"{key}:\\n{value}\")\n",
+ " print(\"=\" * 50)\n",
+ "\n",
+ "\n",
+ "def run_graph(app, message: str, recursive_limit: int = 50):\n",
+ " # Set configuration\n",
+ " config = RunnableConfig(\n",
+ " recursion_limit=recursive_limit, configurable={\"thread_id\": random_uuid()}\n",
+ " )\n",
+ "\n",
+ " # Prepare input\n",
+ " inputs = {\n",
+ " \"messages\": [HumanMessage(content=message)],\n",
+ " }\n",
+ "\n",
+ " # Execute graph and display output\n",
+ " for namespace, chunk in app.stream(\n",
+ " inputs, config, stream_mode=\"updates\", subgraphs=True\n",
+ " ):\n",
+ " for node_name, node_chunk in chunk.items():\n",
+ " display_node_output(node_name, namespace, node_chunk)\n",
+ "\n",
+ " return app.get_state(config).values"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 15,
+ "metadata": {
+ "id": "pUIWVFjOLQoC"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Yahoo Finance - Stock Market Live, Quotes, Business & Finance News \n",
+ "\n",
+ "\n",
+ "\n",
+ " News Today's news US Politics World Tech Reviews and deals Audio Computing Gaming Health Home Phones Science TVs Climate change Health Science Originals The 360 Newsletters Life Health COVID-19 Fall allergies Health news Mental health Relax Sexual health Studies The Unwind Parenting Family health So mini ways Style and beauty It Figures Unapologetically Horoscopes Shopping Buying guides Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to Watch Interviews Videos Finance My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Biden Economy Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth Invest ETF Report YF Awards 2024 Streaming Now Sports Fantasy News Fantasy football Best Ball Pro Pick 'Em College Pick 'Em Fantasy baseball Fantasy hockey Fantasy basketball Download the app Daily fantasy NFL News Scores and schedules Standings Stats Teams Players Drafts Injuries Odds Super Bowl GameChannel Videos MLB News Scores and schedules Standings Stats Teams Players Odds Videos World Baseball Classic NBA News Draft Scores and schedules Standings Stats Teams Players Injuries Videos Odds Playoffs NHL News Scores and schedules Standings Stats Teams Players Odds Playoffs Soccer News Scores and schedules Premier League MLS NWSL Liga MX CONCACAF League Champions League La Liga Serie A Bundesliga Ligue 1 World Cup College football News Scores and schedules Standings Rankings Stats Teams Show all MMA WNBA Sportsbook NCAAF Tennis Golf NASCAR NCAAB NCAAW Boxing USFL Cycling Motorsports Olympics Horse racing GameChannel Rivals Newsletters Podcasts Videos RSS Jobs Help World Cup More news New on Yahoo Creators Games Tech Terms Privacy Privacy Dashboard Feedback © 2025 All rights reserved. About our ads Advertising Careers Yahoo Finance Yahoo Finance Search query Select edition USEnglish US y LATAMEspañol AustraliaEnglish CanadaEnglish CanadaFrançais DeutschlandDeutsch FranceFrançais 香港繁中 MalaysiaEnglish New ZealandEnglish SingaporeEnglish 台灣繁中 UKEnglish News Finance Sports More News Today's news US Politics World Weather Climate change Health Science Originals Newsletters Life Health Parenting Style and beauty Horoscopes Shopping Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to watch Interviews Videos Finance My portfolio Watchlists Markets News Videos Screeners Personal finance Crypto Industries Sports Fantasy NFL NBA MLB NHL College football College basketball Soccer MMA Yahoo Sports AM New on Yahoo Creators Games Tech Selected edition USEnglish Mail Sign in My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Biden Economy Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth Invest ETF Report YF Awards 2024 Streaming Now … BofA chief: The Fed may have to respond to Trump Bank of America CEO Brian Moynihan says the Fed needs to start asking itself a different question. Yahoo Finance • 2 hours ago PNC +0.59% BAC +0.51% Stocks rise as Trump holds off on tariff bazooka LIVE Yahoo Finance • 41 seconds ago ES=F +0.53% ^GSPC +0.39% Elon Musk stamps place in Trump's orbit during inauguration Associated Press Finance • 13 hours ago Apple's tough January worsens as analysts cut on iPhone weakness Bloomberg • 1 hour ago AAPL -3.44% Trump eyes historic 25% tariffs on Canada and Mexico Yahoo Finance • 12 hours ago Trump, Melania coins are 'gambling tokens': Anthony Scaramucci Yahoo Finance • 1 hour ago HOOD +0.97% DJT -9.34% Bitcoin holds retreat as Trump omits crypto from Day 1 orders Bloomberg • 2 hours ago Apple's iPhone sales in China plunged 18% in holiday quarter Bloomberg • 9 hours ago AAPL -3.44% Trump unleashes a wave of first-day orders Yahoo Finance • 12 hours ago DJT -9.34% DJTWW -12.70% Schwab rises after earnings top estimates on record inflows Bloomberg • 1 hour ago SCHW +6.60% View More Latest Exclusive-International Paper to win EU approval for $7.1 billion DS Smith deal Reuters • 22 minutes ago IP +1.10% DSSMY 0.00% US energy mergers may slow in 2025 as deal sizes shrink, says Enverus Reuters • 25 minutes ago France Gets Record Demand for Bond Sale After Long Absence Bloomberg • 35 minutes ago Goldman Sachs Promotes Its Next Generation of Leaders The Wall Street Journal • 37 minutes ago GSC +0.68% Goldman Sachs shuffles management to strengthen Wall Street businesses Reuters • 37 minutes ago GSC +0.68% The Risks of Quantum Computing to Cryptocurrency, Bitcoin, and Blockchain TheStreet • 38 minutes ago Climate shift to help boost StanChart income by almost $1 billion, CEO says Reuters • 38 minutes ago SCBFF 0.00% Milei ‘The Madman’ Is Back in Davos After Racking Up Victories in Argentina Bloomberg • 40 minutes ago Popular Goldman Picks Leaders to Run Wall Street Engines in Major Revamp Bloomberg • 1 hour ago GSC +0.68% Kohl’s Shoppers Want Its New CEO to Undo What the Last One Did The Wall Street Journal • yesterday ETSY -0.23% KSS +3.73% Guggenheim expects Fed to cut rates about every quarter in 2025 Reuters • 21 hours ago ^GSPC +0.39% Who Was Better For The Stock Market? Biden Or Trump? Investor's Business Daily • 16 hours ago Stock market today: Dow, S&P 500, Nasdaq rise with investors on Trump tariff watch LIVE Yahoo Finance • 41 seconds ago ES=F +0.53% ^GSPC +0.39% Weekly Stock List Argus Research • 2 hours ago ILMN +0.76% MS -1.17% Bitcoin Hits Record, Dollar Falls on Trump’s Inauguration Day The Wall Street Journal • 20 hours ago ^HSI +0.91% Stocks rise on as Trump starts his first full day in office: Yahoo Finance It's a big day for both Washington and Wall Street. President Donald Trump is kicking off his first full day in office. Trump announced a number of executive actions last night, including the creation of the Elon Musk-led Department of Government Efficiency. One thing Trump did not do was enact new tariffs, though he did threaten to impose tariffs of 25% on goods from Mexico and Canada on February 1. The delay has some investors breathing a sigh of relief, with stocks moving higher in early trading. Some of the trending tickers on Yahoo Finance include Tesla (TSLA), Nvidia (NVDA), and Apple (AAPL). 3d ago 02:52 Trump signs executive actions Yahoo Finance Video • 13h ago 07:52 Trump 2.0 is 'great news' for small & midsize businesses: BofA CEO Yahoo Finance Video • 2h ago BAC +0.51% 01:35 Trump executive actions, tariffs, bitcoin slips: 3 Things Yahoo Finance Video • 28m ago ES=F +0.53% YM=F +0.47% 01:49 Fed 'has to think about' inflation and Trump's proposed tariffs Yahoo Finance Video • 1h ago BAC +0.51% View More Investment Ideas Build Your Wealth View More More News Copyright © 2025 Yahoo. All rights reserved. POPULAR QUOTES Dow Jones S&P 500 DAX Index Nvidia Tesla DJT EXPLORE MORE Mortgages Credit Cards Sectors Crypto Heatmap Biden Economy Financial News ABOUT Data Disclaimer Help Feedback Sitemap Licensing What's New About Our Ads Terms and Privacy Policy Privacy Dashboard U.S. markets close in 6h 20m US Europe Asia Rates Commodities Currencies Cryptocurrencies S&P 500 6,019.81 +23.15 (+0.39%) Dow 30 43,684.00 +196.17 (+0.45%) Nasdaq 19,635.90 +5.70 (+0.03%) Russell 2000 2,275.88 +9.09 (+0.40%) Crude Oil 76.08 -1.80 (-2.31%) Gold 2,742.20 -6.50 (-0.24%) My Portfolios My Portfolios Sign in to access your portfolio Sign in Top Gainers TEM Tempus AI, Inc 43.07 +7.92 (+22.53%) RGTI Rigetti Computing, Inc. 12.02 +2.19 (+22.26%) RKLB Rocket Lab USA, Inc. 27.76 +3.76 (+15.67%) GH Guardant Health, Inc. 41.56 +4.90 (+13.37%) OKLO Oklo Inc. 29.50 +3.45 (+13.24%) Top Losers FTAI FTAI Aviation Ltd. 81.26 -31.12 (-27.69%) EDU New Oriental Education & Technology Group Inc. 45.67 -15.17 (-24.96%) DJTWW Trump Media & Technology Group Corp. 24.80 -4.18 (-14.42%) DJT Trump Media & Technology Group Corp. 35.32 -4.71 (-11.76%) IREN IREN Limited 10.80 -1.19 (-9.92%) Most Active RGTI Rigetti Computing, Inc. 12.02 +2.19 (+22.26%) NVDA NVIDIA Corporation 139.35 +1.64 (+1.19%) INTC Intel Corporation 22.22 +0.73 (+3.37%) TSLA Tesla, Inc. 417.27 -9.23 (-2.16%) AAPL Apple Inc. 222.14 -7.84 (-3.41%) Trending Tickers AAPL Apple Inc. 222.12 -7.86 (-3.42%) TSLA Tesla, Inc. 414.67 -11.83 (-2.77%) NVDA NVIDIA Corporation 137.84 +0.12 (+0.09%) PLTR Palantir Technologies Inc. 72.45 +0.68 (+0.95%) TEM Tempus AI, Inc 42.78 +7.62 (+21.69%) Top Economic Events Terms and Privacy Policy Privacy Dashboard Ad Terms Feedback \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: Here are the main news highlights from Yahoo Finance:\n",
+ "\n",
+ "1. **Federal Reserve and Trump**: Bank of America CEO Brian Moynihan suggests that the Federal Reserve may need to reconsider its approach in response to actions by former President Trump. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "2. **Stock Market Movements**: Stocks have risen as Trump holds off on implementing significant tariffs. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "3. **Elon Musk and Trump**: Elon Musk has positioned himself within Trump's circle during the inauguration. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "4. **Apple's Challenges**: Apple's January performance has worsened due to analysts cutting forecasts based on iPhone sales weakness. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "5. **Trump's Tariff Plans**: Trump is considering imposing historic 25% tariffs on Canada and Mexico. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "6. **Cryptocurrency and Trump**: Bitcoin has retreated as Trump did not include cryptocurrency in his initial executive orders. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "7. **Schwab's Earnings**: Schwab's earnings have exceeded estimates due to record inflows. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "8. **International Paper Deal**: International Paper is set to receive EU approval for a $7.1 billion deal with DS Smith. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "9. **US Energy Mergers**: There is an expectation that US energy mergers may slow down in 2025 as deal sizes shrink. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "10. **Goldman Sachs Leadership**: Goldman Sachs is promoting its next generation of leaders and shuffling management to strengthen its Wall Street businesses. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "These summaries provide a snapshot of the current financial news and market trends as reported by Yahoo Finance.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mWebScraper\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: Here are the main news highlights from Yahoo Finance:\n",
+ "\n",
+ "1. **Federal Reserve and Trump**: Bank of America CEO Brian Moynihan suggests that the Federal Reserve may need to reconsider its approach in response to actions by former President Trump. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "2. **Stock Market Movements**: Stocks have risen as Trump holds off on implementing significant tariffs. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "3. **Elon Musk and Trump**: Elon Musk has positioned himself within Trump's circle during the inauguration. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "4. **Apple's Challenges**: Apple's January performance has worsened due to analysts cutting forecasts based on iPhone sales weakness. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "5. **Trump's Tariff Plans**: Trump is considering imposing historic 25% tariffs on Canada and Mexico. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "6. **Cryptocurrency and Trump**: Bitcoin has retreated as Trump did not include cryptocurrency in his initial executive orders. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "7. **Schwab's Earnings**: Schwab's earnings have exceeded estimates due to record inflows. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "8. **International Paper Deal**: International Paper is set to receive EU approval for a $7.1 billion deal with DS Smith. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "9. **US Energy Mergers**: There is an expectation that US energy mergers may slow down in 2025 as deal sizes shrink. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "10. **Goldman Sachs Leadership**: Goldman Sachs is promoting its next generation of leaders and shuffling management to strengthen its Wall Street businesses. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "These summaries provide a snapshot of the current financial news and market trends as reported by Yahoo Finance.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n"
+ ]
+ }
+ ],
+ "source": [
+ "output = run_graph(\n",
+ " web_research_app,\n",
+ " \"Please summarize the main news from https://finance.yahoo.com/ and include the sources (URLs).\",\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 16,
+ "metadata": {
+ "id": "z6fQqA6VLT0Y"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Here are the main news highlights from Yahoo Finance:\n",
+ "\n",
+ "1. **Federal Reserve and Trump**: Bank of America CEO Brian Moynihan suggests that the Federal Reserve may need to reconsider its approach in response to actions by former President Trump. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "2. **Stock Market Movements**: Stocks have risen as Trump holds off on implementing significant tariffs. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "3. **Elon Musk and Trump**: Elon Musk has positioned himself within Trump's circle during the inauguration. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "4. **Apple's Challenges**: Apple's January performance has worsened due to analysts cutting forecasts based on iPhone sales weakness. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "5. **Trump's Tariff Plans**: Trump is considering imposing historic 25% tariffs on Canada and Mexico. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "6. **Cryptocurrency and Trump**: Bitcoin has retreated as Trump did not include cryptocurrency in his initial executive orders. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "7. **Schwab's Earnings**: Schwab's earnings have exceeded estimates due to record inflows. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "8. **International Paper Deal**: International Paper is set to receive EU approval for a $7.1 billion deal with DS Smith. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "9. **US Energy Mergers**: There is an expectation that US energy mergers may slow down in 2025 as deal sizes shrink. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "10. **Goldman Sachs Leadership**: Goldman Sachs is promoting its next generation of leaders and shuffling management to strengthen its Wall Street businesses. [Source](https://finance.yahoo.com/)\n",
+ "\n",
+ "These summaries provide a snapshot of the current financial news and market trends as reported by Yahoo Finance.\n"
+ ]
+ }
+ ],
+ "source": [
+ "# Print final result\n",
+ "print(output[\"messages\"][-1].content)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "MM3pPUsaLVqP"
+ },
+ "source": [
+ "### Document Writing Team\n",
+ "Now let's create the document writing team. Here, we'll grant different file-writing tool access to each agent."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 17,
+ "metadata": {
+ "id": "NIWgweiPLg0j"
+ },
+ "outputs": [],
+ "source": [
+ "import operator\n",
+ "from typing import List, TypedDict, Annotated\n",
+ "from pathlib import Path\n",
+ "\n",
+ "# Create temporary directory and set working directory\n",
+ "WORKING_DIRECTORY = Path(\"./tmp\")\n",
+ "WORKING_DIRECTORY.mkdir(exist_ok=True) # Create tmp folder if it doesn't exist\n",
+ "\n",
+ "\n",
+ "# Define state\n",
+ "class DocWritingState(TypedDict):\n",
+ " messages: Annotated[List[BaseMessage], operator.add]\n",
+ " team_members: str\n",
+ " next: str\n",
+ " current_files: str # Currently working files\n",
+ "\n",
+ "\n",
+ "# State preprocessing node: Helps each agent better recognize current working directory state\n",
+ "def preprocess(state):\n",
+ " # Initialize list of written files\n",
+ " written_files = []\n",
+ "\n",
+ " try:\n",
+ " # Search all files in working directory and convert to relative paths\n",
+ " written_files = [\n",
+ " f.relative_to(WORKING_DIRECTORY) for f in WORKING_DIRECTORY.rglob(\"*\")\n",
+ " ]\n",
+ " except Exception:\n",
+ " pass\n",
+ "\n",
+ " # Add \"No files written.\" to state if no files exist\n",
+ " if not written_files:\n",
+ " return {**state, \"current_files\": \"No files written.\"}\n",
+ "\n",
+ " # Add list of written files to state\n",
+ " return {\n",
+ " **state,\n",
+ " \"current_files\": \"\\nBelow are files your team has written to the directory:\\n\"\n",
+ " + \"\\n\".join([f\" - {f}\" for f in written_files]),\n",
+ " }\n",
+ "\n",
+ "\n",
+ "# Initialize LLM\n",
+ "llm = ChatOpenAI(model=MODEL_NAME)\n",
+ "\n",
+ "# Create document writing agent\n",
+ "doc_writer_agent = create_react_agent(\n",
+ " llm,\n",
+ " tools=[write_document, edit_document, read_document],\n",
+ " state_modifier=\"You are a arxiv researcher. Your mission is to write arxiv style paper on given topic/resources.\",\n",
+ ")\n",
+ "context_aware_doc_writer_agent = preprocess | doc_writer_agent\n",
+ "doc_writing_node = agent_factory.create_agent_node(\n",
+ " context_aware_doc_writer_agent, name=\"DocWriter\"\n",
+ ")\n",
+ "\n",
+ "# Create note taking node\n",
+ "note_taking_agent = create_react_agent(\n",
+ " llm,\n",
+ " tools=[create_outline, read_document],\n",
+ " state_modifier=\"You are an expert in creating outlines for research papers. Your mission is to create an outline for a given topic/resources or documents.\",\n",
+ ")\n",
+ "context_aware_note_taking_agent = preprocess | note_taking_agent\n",
+ "note_taking_node = agent_factory.create_agent_node(\n",
+ " context_aware_note_taking_agent, name=\"NoteTaker\"\n",
+ ")\n",
+ "\n",
+ "# Create chart generating agent\n",
+ "chart_generating_agent = create_react_agent(\n",
+ " llm, tools=[read_document, python_repl_tool]\n",
+ ")\n",
+ "context_aware_chart_generating_agent = preprocess | chart_generating_agent\n",
+ "chart_generating_node = agent_factory.create_agent_node(\n",
+ " context_aware_chart_generating_agent, name=\"ChartGenerator\"\n",
+ ")\n",
+ "\n",
+ "# Create document writing team supervisor\n",
+ "doc_writing_supervisor = create_team_supervisor(\n",
+ " MODEL_NAME,\n",
+ " \"You are a supervisor tasked with managing a conversation between the\"\n",
+ " \" following workers: ['DocWriter', 'NoteTaker', 'ChartGenerator']. Given the following user request,\"\n",
+ " \" respond with the worker to act next. Each worker will perform a\"\n",
+ " \" task and respond with their results and status. When finished,\"\n",
+ " \" respond with FINISH.\",\n",
+ " [\"DocWriter\", \"NoteTaker\", \"ChartGenerator\"],\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "dCTcop8WLtmt"
+ },
+ "source": [
+ "### Creating Doc Writing Team Graph\n",
+ "\n",
+ "Integrates document writing, note-taking, and chart generation into a unified flow."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 18,
+ "metadata": {
+ "id": "EVdwBalILuGx"
+ },
+ "outputs": [],
+ "source": [
+ "# Create graph\n",
+ "authoring_graph = StateGraph(DocWritingState)\n",
+ "\n",
+ "# Define nodes\n",
+ "authoring_graph.add_node(\"DocWriter\", doc_writing_node)\n",
+ "authoring_graph.add_node(\"NoteTaker\", note_taking_node)\n",
+ "authoring_graph.add_node(\"ChartGenerator\", chart_generating_node)\n",
+ "authoring_graph.add_node(\"Supervisor\", doc_writing_supervisor)\n",
+ "\n",
+ "# Define edges\n",
+ "authoring_graph.add_edge(\"DocWriter\", \"Supervisor\")\n",
+ "authoring_graph.add_edge(\"NoteTaker\", \"Supervisor\")\n",
+ "authoring_graph.add_edge(\"ChartGenerator\", \"Supervisor\")\n",
+ "\n",
+ "# Define conditional edges: Move to next node based on Supervisor node's decision\n",
+ "authoring_graph.add_conditional_edges(\n",
+ " \"Supervisor\",\n",
+ " get_next_node,\n",
+ " {\n",
+ " \"DocWriter\": \"DocWriter\",\n",
+ " \"NoteTaker\": \"NoteTaker\",\n",
+ " \"ChartGenerator\": \"ChartGenerator\",\n",
+ " \"FINISH\": END,\n",
+ " },\n",
+ ")\n",
+ "\n",
+ "# Set entry point\n",
+ "authoring_graph.set_entry_point(\"Supervisor\")\n",
+ "\n",
+ "# Compile graph\n",
+ "authoring_app = authoring_graph.compile(checkpointer=MemorySaver())"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "wwvokJqgLwSO"
+ },
+ "source": [
+ "Let's visualize the graph:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 19,
+ "metadata": {
+ "id": "oax44hlALyP_"
+ },
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAABAoAAAIJCAIAAACjgkjvAAAAAXNSR0IArs4c6QAAIABJREFUeJzs3Wdck+fCBvAnEPaeIhvC3oiIe4HbWmeVYrUV69ZqteqLq3V10B63tbZqbd2r7llwgCIiCLIhTBGQGSBAQtb74fFwKNYNuTOu/wd/IWZcUiu5nnsxJBIJBQAAAAAAQFEqpAMAAAAAAICsQD0AAAAAAIDnUA8AAAAAAOA51AMAAAAAAHgO9QAAAAAAAJ5DPQAAAAAAgOdUv/76a9IZAABARt14VpxUV9kiEf1dWcJkqJhraD2qq5Tx2xqqTFN1zYz6mloBz0Rdi/S3EABAzmD0AAAA/uFSeeGy1FiOgF8n4CdyKqr5zXWClkahoE7Ir27hyf5tTgu/VsDP5nKOl+ReeVYkkkjS66tJf1MBAOQGA8eiAQAA7UlTg4mG1v7CjG6GZq56RqTjdJgbFcVxNeXfuPUwVNcknQUAQNahHgAAAFXazP2JnTzbwctEQT9ANwuFAkqspaLaLBQ66BqQjgMAILswuQgAQNmJKUkml7PMyV9RuwFFUVpMpj5TXZWh8mtRBpvLIR0HAEB2YfQAAECpnSjJHWJuK6aU6GdBIqcixMxGhcEgHQQAQBZh9AAAQHmdesrmCPhK1Q0oigowNL9fU57XWEc6CACALMLoAQCAkpJQVJ2AL1LWnwL7CzMGm1kHGJmTDgIAIFtQDwAAlFFtCy+qsmSQmTXpICTVCVoctPUYmGUEANAGJhcBACijbXmPnXWUfQMfdRWVipZm0ikAAGQL6gEAgNKpE/CnWDtba+uRDkKYlipzZ97jAixCAABoA/UAAEDpaKkyzTS0SKeQCaMs7FPra0inAACQIagHAADKpbCpfklqjJTfNDMjpeJZ6Ts/XSQS3bsT1RmL5Zx1DYeY23T4ywIAyC/UAwAA5XKvusxX31Sa73ji8G+fTh6mpq7xzq8QsXTWji0bOmkNcSKnspzX2BmvDAAgj1APAACUyzhLp/GWLGm+Y1pqko2NvZGRyds+USQS0TfSU5O8fQM6IRpFUVQFr+lG5ZNOenEAALmDegAAoFwYlKSTLsMLBIKft20eM6TbwCDW3Bnjc7PTKYqaNmnItUtnnjwpDPK2GNzLhZ4gJJFITh09MOXD/n272Qzu5TJ3xvjMjBSKok4dPRDkbREXezM8bGQff+u7d25wG+qCvC0qK8rOnT4c5G2xbOH0Do/tpmekzlDt8JcFAJBTTNIBAABAengi4bTEGz/7DeqMF/9l5/dH/9jz+fzlhobGl84dMzW3oChqwZdrFs76KPSTWQNDRmlqadHN5PuNKy6cOTJtxgJv38DHyQ8O/LqtorzU3cO3oCBHVVV1764fZi9YIRC2dAvoqaLKnPtFxM/bNq/btMPS2tbUzKLDY3fR1GYp/R6vAACtUA8AAJRIo0igx1TvpBe/f/emq7vPpzMXURQ1duJU+k41dXWKovoPHuHXLYi+53b0lb9O/LF6/ZYPxoVSFMVtbKAoys3dh6KogrwcDU2t77b81sXCqvVlRUKhmprakBFj1dTUOiO2UCK++qz4Q0uHznhxAAC5g8lFAABKxERda7vvgE56cWdXj/TUpF92fs/n81rvzMpIoSjK1d2r9Z4Dv26zsXMcPXbK8wekpxgZmXTpakXXg4HBI9p2A4qisjIes1w8OqkbUBQlEUtuV5V00osDAMgd1AMAACUikkgq+U2d9OLLIjZPmPLp779um/RBn7THifSdWRmPbe1ZOjrPj2Crqa7MTEseNnJc6/qHrKxUVw9viqLqOLU11ZUeXv7tXjYrI8XNw6eTMlMUJaaoQWbWnff6AADyBfUAAECJqDIY32Q+aBGLOuPFdXT0lq/6bt/hSw31ddt/+oa+Myv9savb/4YOSooLKYqytLKlv2xubkpLfujq7k1RVH5eNkVRjizXtq9ZU11Z8azMzc27MwLTNFRVR1jYd97rAwDIF9QDAADl4qxrWM7r+AGEFkELfcPDy5/l5CpoEdB3FhWyzcy7tj6MniPUegbCudOH+Hxely5WFEUV5uVQFOXAcmn7snm5mRRFmXbp+BXJrXK4nJuVmFwEAPAcliYDACiXCNfuNS28Dj9/+MAvW1KSE4YM//BJYX5qSuKyiM0URakx1bS0daKun2c5u9XVc8KmzbFzdNY3MDx97ICTs1tGWvLurZspimpubqQoKj8vx9DI2NjErO3L6urqUxR15OAebn29iqrqsJHjOjo4lcypdNcz6vCXBQCQUxg9AABQLjyRsFbA7/CX1dHTf1b6ZMt3a+7FRC1f9d2k0BkURTEYjEVL1zU2Nn6/YcWtvy9RFKWtrbPh+z2c2poZH488dujXOYtWmpia52SnUxRVkJ/j4OjS7mXdvfw+GBealpL4/cYVOVlpHR6boig7bf3eJl3f4IEAAEqBQZ9QAwAAyuPL1Jg59t4mGpqkg8gEAzUNZuecEwcAII9QDwAAlE58TXkpr6m/qeXLHrB6+Zx7MdEv3t/Fouuz8rIX7zc0NDpzJb6jY/4LbkPdmKGB//pbhsbGnJqaF+8fMHjYuk07XvaCp56yQ8xsWLo4Fg0A4DnUAwAAZcQTixqFgpf9bk11Ja+5+cX7BQLBv54/oKqqSh9c0NnEYnF56b8vIxYIW9T+7cQ3LW1tI2PTf31KZn3Nzaqna9z+vW8AACgn1AMAAGVU0FifyKlQ8v3+JRLKRF2DgZlFAABtYGkyAIAyctDRrxO03K56SjoIMaW8xhaJEN0AAKAdjB4AACivBkELR8DXZv7LfCHFdqfqaYNAMM3OjXQQAACZg3oAAKDU0uurK/jNPgb/PjtfIdULWigG5aCtTzoIAIAswuQiAACl5qlvklpXVcZrJB1ESv4szrLV1kM3AAB4GdQDAABlt8jJT19NXVNFNaWuknSWTsRgML7OjO9pbKGK9QYAAC+HyUUAAPDcpqwEisGY4+AllkhUFOUzdLNIeLe6TFNV9YOujhKJWJWB62IAAK+i+vXXX5POAAAAMqG/qZWDjp6ZhnZGQ82hJ9l1ghaWrkE1vzm3kcMTiw3VNKpbeBkNtS0yf7uM13i/prymhWevox9TXSoUi0d0sdNQUVWYzgMA0HlwEQUAAP7HRktPlcHoYdQlzNrVUUffSF1DKJFk1tcWNtXrq6k3iASxVU875PatZ0WRB/d17Gs+v81UbxIJJRLKQVtfn6k+3pIVauOio3y7MwEAvBtMLgIAAAI4HM6ECROioqJIBwEAgH/A6AEAAAAAADyHegAAAAAAAM+hHgAAAAEMBsPV1ZV0CgAAaA/1AAAACJBIJNnZ2aRTAABAe6gHAABAAIPBMDAwIJ0CAADaQz0AAAACJBJJXV0d6RQAANAe6gEAABDAYDAsLS1JpwAAgPZQDwAAgACJRFJaWko6BQAAtId6AAAABDAYDC8vL9IpAACgPdQDAAAgQCKRpKWlkU4BAADtoR4AAAAAAMBzqAcAAEAAg8EwNTUlnQIAANpDPQAAAAIkEklVVRXpFAAA0B7qAQAAEMBgMMzMzEinAACA9lAPAACAAIlEUllZSToFAAC0h3oAAAAAAADPoR4AAAABDAaDxWKRTgEAAO2hHgAAAAESiSQvL490CgAAaA/1AAAAAAAAnkM9AAAAAhgMhru7O+kUAADQHuoBAAAQIJFIMjMzSacAAID2UA8AAAAAAOA51AMAACDD09OTdAQAAGgP9QAAAMhIT08nHQEAANpDPQAAAAAAgOdQDwAAgAAGg+Hq6ko6BQAAtId6AAAABEgkkuzsbNIpAACgPdQDAAAAAAB4DvUAAAAIYDAYBgYGpFMAAEB7qAcAAECARCKpq6sjnQIAANpDPQAAAAIYDIazszPpFAAA0B7qAQAAECCRSHJzc0mnAACA9lAPAAAAAADgOdQDAAAggMFgdOnShXQKAABoD/UAAAAIkEgkz549I50CAADaQz0AAAACGAyGp6cn6RQAANAe6gEAABAgkUjS09NJpwAAgPZQDwAAgACMHgAAyCbUAwAAIACjBwAAsgn1AAAACGAwGNbW1qRTAABAewyJREI6AwAAKIuZM2eWlpYymUyxWFxbW2tsbMxgMAQCwZUrV0hHAwAACqMHAAAgVZMnT25oaCgtLS0vL+fz+WVlZaWlpaqqqqRzAQDAc6gHAAAgPUOGDGGxWO3u9PPzIxQHAADaQz0AAACpCg0N1dbWbv2yS5cuYWFhRBMBAMD/oB4AAIBUDRs2zM7Ojr4tkUgCAgLc3d1JhwIAgOdQDwAAQNqmTZumo6NDUZSFhUVoaCjpOAAA8D+oBwAAIG1DhgyhBxD8/f0xdAAAIFOYpAMAAEDnqm3hFTbVN4vFpIP8Q6/Pwsr/+st38rh7NeWks/yDriqTpWOgw1QjHQQAgAycewAAoLA4LfxIdlJWQ623vkmdoIV0HPnAZKjkcGt7GHdZ5RpIOgsAAAGoBwAAiqma37w07e7Yro4Wmtpv8HD4h8z62ge15dt8+6ur4EwGAFAuqAcAAIppTNzFpU7+Gjhx7F0VN9XfqS7b6TuAdBAAAKnC0mQAAAV0pCQnxNwW3eB92Grrd9HQul31lHQQAACpQj0AAFBAqXVVBmpYXPu+NFWYOVwO6RQAAFKFegAAoIBaxGIjNS3SKeSeqYYmpwVLugFAuaAeAAAoII6AL6FkaydTeSSUSJrFQtIpAACkCvUAAAAAAACeQz0AAAAAAIDnUA8AAAAAAOA51AMAAAAAAHgO9QAAAAAAAJ5DPQAAAAAAgOdQDwAAAAAA4DnUAwAAAAAAeA71AAAAAAAAnkM9AAAAAACA51APAAAAAADgOSbpAAAAIBMKczKO7fwxLyNVKBRY2jmOCpvRe+ho0qGe+3bhjOqK0g37T2rp6JHOAgCg4DB6AAAAVGVpyaZ509MS7mloaapraBTlZBTlZJIO9ZxIJMrLSCkvLqzncEhnAQBQfBg9AAAA6t6Ni82NDT0GD1+4cQuDwchMemBgbEo61HOqqqpr9xzm1td1sbIhnQUAQPGhHgAAAMVvaqIoysjMnMFgUBTl3q0HfX9hTsbq6eOdfbqt++UIfc+azyYUZKV/f/SSlT3rqykjyooKAvqHZCY9EIuFjh4+E2d94eLtTz9SKBRe+GPv7YunOVUVxmYW/UaN+2DaLCaTSb+mlaOzvYtb8r07Lc3NX3y3Y9eapc1N3C2n/zaztKZHM5ZMCNEzMt5+9mb4IH+xWExR1C/X43X0DCrLnv65ZVNmUgJDRcXRzfOTL1dZ2bMoikq+d/v0bztK2DnqWlrePfqELvzKxLwrRVFbVsxPvBM1ZEJYRuL9Z0+LP5w+Z1z4fHLfaQAAWYfJRQAAQHkF9qYo6sapw9dO/EF/Fn9zJQW5Af2Du9o5ZDy8/+386SX5uRRFSSSSHasWn/51O5/XzPL0bWpsOP3r9l82rGx91tP83NT7sQH9Q3x69fft1b/viA8pioq9ep7+3ZvnTlAUNXjsZDV1jW79gplqaq1P/Pmb5Ukx0Ra2ti7efgXZ6Vo6uhRFPbx9/adlc4pyMp19/PWNjO//fXnDnKlN3PrWZ904fdjIrEu3fsFeQX064hsGAKCwMHoAAACUR/eeY2fMO7t/959bNt+6cDps0Qq6MLyJlVv30Zf893+3NvrciSvHDn4esTHxTlTinb/tXDzW7jmkoaXd1MhdO2Ni3PWLo8Jm0M9SUVGJ2PWHtaMz/WXIxI9vnD4cc/mvcTPmCYXCO5f/UmUyQ8aFUhS1+Lsdc4b35NY9X3jwhJ1DUdQXm7ebWljxmpo0tbUpijq8/QeJRDL/68ieISNFItFPy2Y/vh8bdeb4B9M+p5/VM2TEgg1bOuE7BwCgaDB6AAAAFEVREz9ftOH30x4BQU/Y2d8tmhF99vgbPlGFqUrfoEcA2OkpFEUlxURTFKWprX361x1Htn9/dt8uDQ0tiqLyM1LpB1s5Ord2A4qirOxZHt17VTx9kpOSmHz3JqeqMnDQMCMz8xffzr/vQIqiIpfMunftgpqGBkVRz54UVZaW6BsaBQWPoNcq9Bs5jqKorJSE1mf1DBnxft8eAABlgdEDAAB4zsHVM2LnweunDv/x04Y/t37bfcCQt3q6roERRVHNDQ0URXGqKyiKyk5+mJ38sO1j1NQ16Rua2jrtnj504scZD+NirpyrqSijKGr45Gn/+i4zV67X0tG5ee7k7q+/Onvg52X/+aW+rpaiKH0TM3rhBEVReoZGFEU11tW1PktTW/et/iwAAEoL9QAAAP5h6MSwmEtnCrLSC7LSjUzNKIqi3mw1QnVFGUVRWnp6FEVp6+pRFPXZ8m+Cx01u97DCnIx/fbp/38EmXbrG3bjIb252dPd28vT914epa2p99tXXIz8O3//duvSH9w5t/TZs4QqKouprq1sfU1tZSVGUrqHRW/yxAQCAojC5CAAAKIqiWnjNx3/+T03lM4qiairL6Rt6Bkb6RiYURZUWFzQ1cimKKshOLy3Kb/dcIb+FoiheU9OlQ/soinL3705RlJtfD4qirh0/WF9bQz8sJyXx1RlUVVUHj5vCa2qSSCTDPvrkZQ+rqXzWwmvuYmUzZf6XFEWVFReYW9uamHetr6lOvBNFUZRA0BJ97jhFUZ4BPTviewMAoFwwegAAAFR89NULf+y98MdeLR09XhNXIpF4dO/l4O7JYDCsWS4leTnLp4w0NutSmJ3+4r5Ga8I/6mJt8+xJcRO3XlvPYNTH4RRF9Rv54Y1Th54W5n05McTawbm+tqai9MmG3087uHq+IsagMZPO7tulra9PryL4Vyd+/k/qg7tOnr50UXHv1oPBYEyas3jP+hU7Vi928vKrKi+tKnvaxdp24JhJHfcdAgBQFhg9AAAAyt7FIyh4uKGpuUgksLRznDRn8dIfdtFT+Rds+I+rX/fmxoa6mqrRU2daOzi1e24XK+uSfDZFUQH9Q9btPULvYqShpb3q5z8HffiRuqZWfmYqj9fUM2Skjp7+q2PoGxkHhQwPHjel7U6m7VjasZhq6o/u3mpubBwyIezjBSvoVdELNmyxsndipyU3cbm9h32wavefWjrtlzcAAMBrMSQSCekMAADQwcKTosZ0deiiod2p70Ifi7bt3E36ADLFk1pfXcZrWusWSH9ZXV0dHx9/48aNwsJCkUh0/vx50gEBADoeJhcBACgUHo+Xnp7O4/NIB1EclZWVcXFxN27cKC4urqioEAgEYrE4KSmJdC4AgE6BegAAIJfEYnF+fj6Xy/Xz8ysoKNiyZYuFhUVERERycvL+/fsFYSNJB1QQSUlJ4cc2VFRU8Pl8VdXnJzy07qAKAKB4UA8AAGRdRUVFTU2Nm5tbZWXltm3bNDU1V69enZaWtmnTpidPnlhYWJibm9vY2PTp04eiqJ49e/bs2TM8KUoKwSKPXZHCu5BlaGjAUVERCoWt3YB2+PBhGxsbW1tbOzs7tAUAUCRYewAAICv4fP6zZ89sbW25XO7PP/8sEAgiIiLy8/Pnz5/fq1evtWvXVlVVJSQkODs7Ozk9Xx/8wQcflJWVicVidXV1c3NzXV1dLy+vgICAE2ZMKaw9UHj02oOJYp3Nmzez2WyhUNj6Wx9//PGTJ0+Ki4uLioqsrKzs7OxsbGzs7OzoG5aWliRzAwC8B9QDAAACampqjI2NhULh/v37a2trV6xYUVtbO2rUKB8fnz179nA4nKtXrzo7OwcEBEgkkldfnO7evXvbL8Visb6+vtay8DkBfVAP3lNqfTW7uvL7gIEURf30009XrlzhcDgURenq6t66dav1YSUlJUVFRU+ePCkqKqJvVFRUtBYG2/8yNTUl+YcBAHgzmFwEANCJ6EkpDAbj5MmTJSUlS5YsEQgEAwcONDU1PXfunFAolEgk3bp1oyjKwMDg3r179LMMDQ2nTJlC335ZN3j69Gl2dnZWVla7/qChoeHl5dVkqZhbCUlfXl5ej7nLr127tnTpUjU1taioqOLiYp1/bplqbW1tbW3d9h6hUNhaGFJTUy9dulRcXNzU1ERPRmpbG/T1X7PZKwCAlKEeAAB0DPpjenR0dEFBQVhYmKam5oQJE0pKSmJjY9XU1IqKiuhPkEwmMyoqSlNTk6IoTU3N2bNn009XUXnNQTSFhYVZbejr67u6urq5ubV9jJGR0fjx4+fOnSudtQfKwMfH53hcHH0YXGNjI4PBCAkJyc3N3b17t6+vL73e40VMJpPFYrFYrLZ3NjU10ZORnjx5EhcXd/z48eLiYhUVFbontB1noP96AAAQgclFAADvIjk5OTc3d9iwYfr6+vPmzUtNTT1//ryRkdHmzZsNDQ0///xzNTW1yspKMzOzd36LnJwcugnQowQWFhZubejp6dEPCw4OrquroyjK1tZ2yZIl/fr1k9q5Bwqv3bkHrSQSyf79+/Pz8zdt2lRbW7thw4bevXtPnDhRLBa/tua1w+FwiouL6dpQ/F/6+vovdoa3fWUAgHeD0QMAgFcpKCjIzc3t1q2bqanpxo0b79+/v23bNhaLdeHCBTU1NfoT2/Lly83NzbW1tSmKioiIaH3uW3UDoVBIN4HMzEz6hpOTE90Ehg4d6ubm9rIrynp6ek1NTX5+fps2bTI2Nu6IPzS8BoPBCA8Pp28bGBh8+OGHJSUlFEXl5+evWLFi+PDhn3/+OZfL1dLSarff0YsMDQ0NDQ19fHza3llRUdHaGRITE+nblpaWrVWBrg1YAA0AnQGjBwAAFL1WmP5EbmZmtmfPnpiYmJUrV3p7e69fv765uXnJkiXm5uYZGRlGRkZdu3bMtP6mpqbWmULZ2dmFhYVubm6urq7u7u70jTe/Wrxly5YlS5a0vQejBx3iZaMHr1BYWFhaWtq7d++MjIxPP/00NDR0yZIlJSUlLS0tjo6O7xOmpKSkdXiBHmqoqKiwtbVtu2OSnZ0dFkADwHtCPQAA5cLj8fLy8jQ0NJycnGJjY//888/hw4ePGzdu+/btOTk5y5Yts7e3T0pK0tbWdnZ2fu2l37fC4XDoaUKZmZnZ2dlVVVWtM4VcXV1b9yrtEKgHHeId6kE7JSUl1tbWjx8/3rBhQ0BAwMqVK5OTkysrK3v27Nk6PeydCYXC4uLitjsmFRUV0QugW2uDg4ODlZWVgYHBe74XACgP1AMAUFglJSUMBsPKyio6Ovrq1av9+vX74IMPDh8+fO3atbCwsGHDhmVnZzc0NLi7u7fbiKajVFRUtM4Xys7O5vF49GJid3d3V1dXW1vbznjTpqam33777aaHxaee3VEP3tP714O2BAKBmppaVlbW77//7u3tHRYWdunSpadPn37wwQcdNSTVugC6tTY0NDSkpKQwGIzWQ9xaZyhpaWl11JsCgCJBPQAAucflcrlcroWFBZvNPn78uL29fVhY2NGjR48dOzZr1qxRo0bFx8dzuVw/Pz8TE5NOTVJSUtJaBrKysphMZut8IVdXVwsLi85769TU1ISEhBkzZuTk5MTFxd306DrO2gn14D11bD14UXFx8ZUrV1xdXQcOHLhjx47S0tLZs2fb29vTRaID34jD4bQe4tY6Q0lXV5fuCSwWq3VhQ8eOmAGAPEI9AAB5wuFwDA0NCwsLL1++7OjoOHz48IsXL0ZGRk6ZMmXu3LmZmZmZmZndunWzt7eXTp78/PzWxQMqKiqlpaWtZcDNzU0KC4UTExM9PT1VVFRmzZo1duzYsWPH0vevy4z3NzSzQD14P+n1NS1i0VxHbym8F4fDefDggZ2dnaur68qVKysrKzdu3Ni1a1c2m92xE89aVVZW0j2hvLw8NzeXvt21a9fW1c/0UEMHjmwAgFxAPQAAWSQUCplMZl1d3eXLl1VVVT/66KPExMR58+Z99NFHS5cuTUhIePz4cd++fV1dXVtaWtTV1aUWLCsrKycnJz09nR4fsLGxabvZaCdNUnpRfX29vr7+p59+qqGhsXPnzhevNH+fnainptHNEKtU38vVZ0Xe+qYTrFhv8NgOxmazTU1NDQ0NFy1aFB8fHxUVpaure/PmTRaL1UnT0mitC6Dp9QzFxcVVVVV0T/Dw8DAxMaGbg5GRUedlAACyUA8AgDwulxsbG8tkMkNCQtLS0pYuXcpisXbv3s1ms8+ePRsYGDhgwIDm5mY1NTUmU6rbMdObjdKLiekhAldXVx8fHxaLRY8PdOwMkDfx119/bd269ZdffnFzc2toaHjZ8tbYmrKoiuJh5nZSjqdgTpTkLmL52usQPthYKBTSR61999138fHxR44cUVNT27t3r7+/f69evTr73QUCAd0TKisrs7Ky6OYgFArpztC6mMHOzg6LGQAUA+oBAEiPSCR69OhRXV1dcHBwWVnZsmXLDA0Nd+3alZaWdvTo0f79+w8bNqyhoYHP55PanJHH42VlZRUVFSUlJWVlZdGbjdKLiekbRFKVlJT8/PPPXl5eoaGhjx49cnZ21tXVfe2ztrCTGwUtIV068UqzYjv9lN3LpOt4SwJDB68lFosPHDhQWFi4YcOG2trab7/9NjAwcNKkSe9wLtu7aWhooDtD62KGoqIiXV1dui24u7ubmprS05OkEAYAOhbqAQB0isLCwurq6oCAgLq6ujVr1ggEgp9//vnZs2dr16718/ObO3duQ0PD06dP7e3tX3bal3RwudzWzUYzMzPLy8vd3Nx69uzZtWtXNze3Tprz/YZu3LhRVVUVGhoaGxvL5XKHDBnytstGt+el1LbwumrqWGnpqjJw5u5rcLlcisEQMajylubsxrrRXe1HdnUgHer1xGLxzZs3CwoKZs6c+fjx440bN4aGho4bN66mpkbKx+TRixmKiorq6+tTUlLo5tA6wmBvb0/fwMkMADIO9QAA3ktzc3NxcTGLxWIymZs2baqtrf3xxx/z8vKWL1/u7++/evXqxsbG5ORkOzs7a2tr0mEpiqLq6uqysrKePHmSkJCQlZXF4XBaNxdyd3eX2prml+HxeHfv3g0ODk5MTDx58uSEp9aWAAAgAElEQVTUqVO9vLze5wXvVD29XVXKFbUUNTV0XMy3xmvmMZlMpppU54a9ldraWrFIzGAwqNp6BqdekpCmUVmroqIiFAovXbpEOt1byMvLq6+v9/f3v3DhwqZNm9avXz906NDMzEwDAwMipyy3rmEoLCykb/B4vLaFwd7e3sbGhuxlAgBoC/UAAN4Uh8NhMpm6urrnzp17+PDh7Nmzra2tJ0+erKqqun//fk1NzTNnzjg6Ovr5+ZFO+g8cDqd1cCAzM7OxsdHNzS0oKMjKysrNzU1GSguXy21qajI3N580aVKvXr2+/PJL0ok62JAhQ44fPy7li9lvJT4+/uuvv66srGx3v0QiSUxMJBTqfQkEgpqami5dupw9e3bfvn3z5s0bMWJEbGysmppat27dpL9yhsblctsWBlVV1Vu3bhkaGtKTkejOYGdnhx2TAEhBPQCAf8Hn8zU0NBITEx88eDBw4EB3d/fly5cnJibu2rXLzc3t7Nmz6urqgwcPls0LfhwOJycnJy0tje4Dzc3N9OIBGpELqC9Db29/8ODB/fv379u3j+xcps7T2NiYnZ3drVs30kFeY9u2bUePHqXXAdPkuhu8iP7/Oioq6vTp08OHDx8zZszVq1cbGxsHDx5MfCei8vJyepyB7gxFRUXV1dV2/9XaGd5k1Q0AvCfUAwCg2Gx2enq6j4+Pg4PDf/7zn9OnT3///fd9+/Y9duwYl8sdM2aMubl5c3OzzG5L0tDQkJGRkZmZmZGRkZGR0dzcHBISoq+vT/cB2bwGGR8fv2/fvsmTJwcHB+fl5bFYsrj+VQl99tlnqamprV9qaGjcvXuXaKLOlZ6efu7cOS8vrzFjxuzbt6+xsXHq1KkyMsjT0tJS9F+tnUFTU7NPnz6qqqr2/2VlZUU6KYCiQT0AUCLV1dUFBQUODg4mJiZ79uyJjo5eunRpUFDQjh07amtrP/30U1tb2/LyckNDQ9kcFmjV2NjYWgYyMzPr6uo8PDzc3d09PDw8PDxksw/QH3fOnDnDYDAmT5589epVMzOzgIAA0qGkYevWrSEhIe+5iEI6iouLFyxYUFpaSg8djB8/Pi4ubuHChcOGDSMdrdMVFhbevn27T58+Tk5OCxcu1NbWXrNmja6urtR2Q3oTVVVVT58+zcvLK/yvZ8+e2f+To6OjhoYG6aQAcgz1AEAxlZeXa2pqGhoanjlzJjY2du7cuc7OzosXL25ubl69erWNjU1KSoqurq6Dg4Ps/OB/BaFQmJ6enpGRQf+qp6enqalJlwF3d3cZWT/wMpWVlQ8fPhwxYsSDBw9u374dGhoq44E7XFBQ0N27d6V8ZsU7O3PmzJYtW5qbm42Nja9fv15WVrZjx468vLz58+f379+fdDopqa6ufvToUY8ePfT19YcOHWphYbF//34mk1lWViZr9VsoFBb+U35+vrGxcWtbYLFYNjY22C4J4M2hHgDIPR6Pp6mpmZKScuvWrd69ewcGBq5ZsyYpKen777/38vKKiopiMpk9e/aUu8tpWVlZrX0gPz/f09PTw8OD/lUuNlOvra01MjKqqqqaOnVqWFjYJ598QjoRGTwer7q6Wr5mgKxYsSIqKurhw4et97DZ7F27dtXW1i5YsKB79+5E0xGQnp7u5uamqqo6evRosVh8+fJlkUj0+PFjX19f2by+UFZWRk9JKiwsbGxsfPDgAY/Hax1boG/IxT8jAESgHgDImfLy8oSEBCsrq27duh0+fHj37t0rV6784IMPLl++XFVVRV/nE4lEb7tBviwoLCxMT08vKyuLiYnJzMx0dnZu7QMuLi6k070poVDIZDJnz55dV1d37NixlpYWdXV10qFI4nK5DAZDR0eHdJC3M3HixFOnTrW7MzU1defOnaqqqgsWLPDw8CAUjbDq6moTExOhUDhnzpysrKzY2FihUBgVFdWtWzczMzPS6V6Ky+W2ji3QN4qLix0cHOzt7dv+KrMrrACkCfUAQEbx+fzc3Fx1dXUXF5dbt27t27cvJCRk+vTpZ8+eTU5OHj9+vI+PT1VVla6uroyvE3iFysrK9PT0tLQ0epTA1NTU09MzMDDQwcHB3d1d7hrOo0ePfvnlly+++MLd3T05OVnWNnglJTw8fOHChYr03YiPjz9w4ICent6CBQtwBZo+DX3NmjVPnz49ePBgeXl5VFRUr169HB0dSed6DYlEUlBQUFhY2PZXAwMDe3t7f39/IyMjBwcHR0dHQ0ND0kkBpA31AIC8lpaW+vp6U1PTjIyMEydOuLu7T548+cSJE5cuXQoLCxs6dGh+fj6Px2OxWHI3QaidpqamtLS09P+SSCSenp5eXl70KIGcbll45coVgUAwZsyYS5cumZubBwYGkk4kQwQCwZw5c/bt20c6SMeLjo7euXOnj4/PokWLZGSrH1nA5XL37t3b3Ny8atWqlJSU27dvDx061M3NjXSuN1VeXk4vd05PTy8oKMjPz6coytHRka4KDg4ODg4O5ubmpGMCdC7UAwACysvLb926ZWJiMmTIkEuXLm3cuHHOnDnTp09PSUkpLi7u3r27rC3+ex9ZWVnp6empqalpaWn0xCfP/5Lrn7IPHz7s3r377du3b9y4ER4e7uDgQDoREHDhwoVz5855eHgsWrRIXtZeS019ff1ff/2loqLyySef/P3337GxsRMnTpSLDaza4nA4+fn5dFUoKCgoKChobGyke0JrYVC2zQZA4aEeAHQiPp+fkZFBUZS/v398fHxkZGT37t1Xrlx59+7de/fuhYSE+Pv7NzU1aWtrk07akcrKytLS0lJTU+mJQ05OTp6ent7e3l5eXgrwGVoikdDnKkycOFHxzjbucHw+n8FgKPzqi8OHD2/fvn327NkzZswgnUVGcbncmzdvamhoDB069PDhww8fPvzss898fHxI53oXjY2NdE9oLQzPnj3r37+/urq643/Z2tqSjgnw7lAPADpGS0tLTU2NhYVFQUHBgQMHLC0t58yZc/369RMnTowZM2bMmDFlZWX01hkMBoN02A7G4/FSU1Pz8/Pv37+flpamoaHh5eXl7e1NTxxSmEuqe/fuPXXq1NWrV4VCoUQikfeJXtIxderUVatWubu7kw4iDbt27Tpx4sSiRYsmTJhAOotM4/F4Dx480NbW7t69+3/+85/MzMwvvvjCy8uL3oSNdLp3IRAICgsL2Wx2/n+VlpbSYwssFou+YW9vTzomwJtCPQB4R/X19devX1dVVR03btz9+/eXLFkyadKkL7/8Mjc3Nycnx8/PT752cnxbRUVFqampjx8/Tk1NLS4u9vb27tu3r62trZeXlyLNw66trT179mxISIiNjc3p06cHDhxoYmJCOpTc4PF4ixcv3rNnD+kg0sPlcrdv3/7kyZOwsLC+ffuSjiMHJBLJo0ePtLW13dzcNm3a9Pjx4/Xr17u6utbV1RkYGJBO9+6EQiE9tpCXl0ffePLkSWtboKEwgMxCPQB4PTabXVNT06NHjydPnkRERBgZGW3fvj0rK+uvv/7q3bv3gAED5Pei15ujhwgeP35MVwJDQ0Nvb28fHx9vb2852nX0DQmFwqqqKgsLi6+++srOzm7WrFkKPz0GOlBJSUlkZGRLS8vSpUudnJxIx5EnbDZbU1PT2tp69erVqampO3bsoI9yt7CwIB3tfYlEota2QBOLxRoaGiwWy8nJicVisVgsxb6oBHIE9QDgHzgcjqGhIZ/P37VrV0NDw7p164qKipYvX96rV6/FixdzOBx6yFjhywCtpKTk8ePHKSkpdXV1MTExdB+gK4FcX9h7tVOnTkVGRh46dMjZ2Zl0FvmWm5uroaGhtJOwHzx48NNPP3l6ei5dulTujn2QBSUlJdra2sbGxl9++WV6evqRI0dMTEwKCgoUYAkTTSKRsNnsvLw8+te8vLyamhq6J7QWBhz2DESgHoCyu3v3bmFhYWhoqIqKSnBwsLa29oULF5qbm8+cOePm5hYQEEA6oLSlpKTQleDx48daWlo+Pj6+vr7+/v4sFot0tE5UV1d38OBBa2vr8ePHP378WE5XTMqa6dOnf/XVV3K3U03HOnfu3E8//TR79uywsDDSWeRYVVWVtra2trb2559/npubGx0dLRaL2Wy2HG2Z+iaam5vpntBaGIRCYWthcHZ2dnR0lNMNoEG+oB6AshCJREVFRY6OjkKhcO3atfQJPiKRaMmSJfb29kuWLGEwGPTQAemk0lZdXZ2SkkL3gdTUVHp8wNfX18fHR+Hn2QuFwpSUlICAgLNnz9bV1U2ePFlJxoWkgD4qa/PmzaSDyIQDBw6cOnUqIiKiT58+pLPIvYaGBl1dXaFQ+Omnn9KXcmpra8vLyxVyBTyHw2ktDJWVlUlJSbq6uk5OTk5OTi4uLnRtIJ0RFBDqASgskUh0/fr10tLS8PDwxsbGQYMGeXt779u3TyAQREdHK/m/qsXFxY8ePUpOTn706JGNjY2mpibdB5TqqjmbzQ4LC1uzZs3o0aNJZwHFV15evnnzZiaTGRERgRkjHYVe91VZWblkyRJjY+Pt27eXl5dzOBwFG1Voq7y8nM1ms9nsnJwcujnQbcHZ2Zm+0aVLF9IZQe6hHoAi4PF4eXl5Hh4eEolk8eLFZWVlJ0+ebGpq2rx5s7u7e1hYmFgsVlFRIR2TsKysrEePHtGtQEdHx9/f38/Pz9/f38bGhnQ06ZFIJH/88UdiYuL27dsrKyvNzMxIJ1JYZWVlEonE0tKSdBDZcvv27c2bN48bN27OnDmksygauioUFxdHRER07do1MjKypKSkublZ4RcR5ebmstls+lc2m93U1NRaFVxdXZ2cnDAoCm8L9QDkD/1Z/++//378+PHMmTP19fWHDh1qYWHx+++/MxiMe/fusVgsBdjmokOkpKQ8fPiQbgX29vb+/v50K1D4WUMvio+P9/X1FQqF+/fv/+ijj/A3pLMtXLgwNDS0d+/epIPIol9//TU9Pf3jjz/u0aMH6SyKia4KbDZ79erVPj4+ERER+fn56urqynC8cUNDQ2tVqKmpiYuLMzc3d3Z2dnV1dXFxcXZ2xr9+8FqoByDrhEJhTk6OoaGhpaXlrl27Ll++/NNPP7m5ue3cudPIyOijjz5SU1MjnVGGiESipKSkxMTEpKSkpKSk0aNHm5ub061AOS8gCYVCJpP55Zdf8ni87du3K8wZbbLv22+/XbJkiXL+rXsTNTU1q1atsrCwWLduHeksCq65uVlLSysxMXH9+vUTJ0785JNPsrKyzMzMlOcqSXFxcW5ubnZ2dk5OTm5ublNTk8s/Kd5hnfCeUA9AFuXl5V27ds3JyWno0KHbt29PSEhYsWKFl5dXenq6iYkJrny009LS0loJUlNTu3XrFhAQQP9KOhpJ5eXl27dvDwkJGTx4MOYRgWw6f/785s2bN23aFBwcTDqLUuByubq6uteuXfvpp5/Wrl3bt2/ftLQ0Nzc3pbpwUF9fn/NPLBbLxcWFHl5wc3PT19cnnREIQz0Awp4+fdrS0uLg4BAVFbV3797g4OBZs2ZFRUUVFhYGBwfjUMmXEQqFCQkJ2dnZt27dys7Obq0Efn5+pKORd/fu3T59+ly6dInJZA4bNox0HGVUWFhYVlbWq1cv0kHkgEAgWLVqlUgk2rhxo5aWFuk4SqShoUFPT2/Hjh2HDh26ePGimZlZVlaWAq9pfgV6oTM9vJCVlaWjo+PWBlbSKyHUA5C22tra69ev6+rqjho16sSJE4cOHZo9e/aoUaPYbDZFUcq8m9CbePToUUJCwoMHD1JTUwMDA/v37+/u7u7t7U06l0wQiUTNzc0DBw5ctWrVuHHjSMdRapGRkTY2NlOmTCEdRG7cunXr0KFDkyZNQqElgl6rQB+qcPXqVU1NzZqaGmNjY9K5yCgrK8tqQyKRuLm5ubq60r8qw/oNQD2AzlVeXm5hYVFSUvLDDz/o6Oh8++23SUlJf//999ChQ/38/Oh54aQzyrrMzMwHDx4kJCQkJCR4e3sHBgb26NHD39//nV9QLBbz+fwOzdjx1NTU3vzvBpvN3rVr1+rVq7W0tLS0tDCPlrhTp0717dsX8wDfVkREhIqKysaNG0kHUV4NDQ1aWlpMJnPChAlqamqHDh3CD6nq6uqsrKzs7Gz6V3rfWFdXVy8vL1dXVzs7O9IBoeOhHkBHEolEycnJHA4nODg4LS0tPDx81KhRa9euLS8vz8vL8/T0VMJDx95NSUlJfHx8QUHBhQsXbGxsevToERgYGBgY2CE/qPh8fl1dXUfE7ET6+vpvsqq1sLDQ3t7+559/9vT07N+/v1SiAXSiK1eufPfdd9u2bcNEQeJyc3MdHR0piurfv39ISMg333yDS1r0+g26KuTn5z969Ki6utrd3d3Dw8PT09PDw6Nr166kA0IHQD2A99XS0nLgwIG6urrly5dnZGRs3bo1ODh48uTJjY2NGhoa+Jf0zbW0tMTFxcXFxd2/f18ikQQFBQ0YMMDX11dXV7dj30gx6kFGRsbChQu/++67wMBAKeaC1+NwOLGxsThs7p1xudwvvvjCz89v4cKFpLMARU89SkxM7NOnT2Fh4f/93/9NnTp11KhRpEPJCi6Xm5mZmZGRkZ6enpGRwePxPNrAugU5hXoAb4e+WEtR1IIFC3Jycq5fv97Q0HD06FE/Pz9s4P1uMjIy7t+/HxcXl5aW1qtXr169evXs2bNTjyqT63pQV1cXFRU1fvz45ORke3t7jEfJoMuXL8fFxW3YsIF0EPn2+++/p6enb9iwAZvDypScnJyCgoJhw4adPXs2OTl55syZmIvfVm1tbUYbKioqHh4ePXr0cHR09Pb2xl9meYF6AK+Rl5eXnp4+evRoFRWV3r17d+3a9fTp0xRFJSQkuLi4GBgYkA4ol5qammJiYmJjY0tLS1taWnr27NmrV69u3bpJ593ltB4IhUKxWDx8+PCIiIiQkBBy0eA17t69q6WlJbW/zwqsoKBg6tSpu3btwkQjGdTS0nLt2jVjY+M+ffps27bN2tr6ww8/xIB5OxUVFRkZGcXFxffu3UtNTbW2tvb29vb29vby8mKxWKTTwUuhHsC/iI2NvXfv3owZM0xNTcPDw21tbdeuXctgMPh8voaGBul0ciw/P59uBVlZWf369evbt2+fPn2kX7FerAe7d+++ePFiu4d9/PHHU6dOXbRokbq6+o8//kg/zNHRcfjw4a2PycvLW7hw4bp164KCgpqbmydMmDB58uTp06fTmywdPHiwsLBQTU3NxcVl2bJlRkZGMTEx3377bWRkpKenZ+uLLFu2rKWlZfv27W3fvW094HK5W7dunTx5soODA376grIJDw8fMmQItoGSZenp6efOnRs/frybm9tff/3VvXv3Th0Bll9sNjs1NTU1NTUtLa2srIzuCT4+Pl5eXhgKlin4QQvPN3Q7c+bMpUuXpk+f3r9//4yMDDs7O/pglH379rU+Et3g3dy7d49uBZqamv369Zs7d64MXlhlMBhTp05te4+Xl9eLD9u7d6+vr+9rF5/l5eWtW7fOwcFh2rRpFRUVKSkpOjo675CqsbFRR0fn8OHDnp6ezs7O7/AKIH1//PHHhAkT3u2/OLxo3759kZGRa9euXb9+Peks8O88PT1bL3k0NDQsXLjw2LFjYrG4tLQUu3W35eTk5OTkRG883dTURPeEkydPrlu3Tk9Pb/DgwdbW1v7+/g4ODqSTKjvUA2VUX18vEAhMTEx279598uTJrVu3+vr6GhsbL1y40NfXl6KoWbNmkc4o94RCYXR0dHR0dH5+fpcuXfr16/fJJ59YWlqSzvVSDAYjNDT0tQ/j8Xg//vjjDz/8oKqq+oqHxcTEiESiDRs20CVTIpG87WajYrH4hx9+MDY2njlz5uzZs9/quUAQl8vdt2/ftGnTSAdRKF999dWlS5cmT558/Phx0lngNaZNmzZt2jR6/+hVq1bZ2tpGRkbSl+FIR5Mt2traQUFBQUFB9JdPnjzJycm5f//+kSNHamtr/fz86IM+2w41g9SgHiiL7OxssVjs7u5OV4ItW7aYmJgMGjRo6tSp9Ae4gQMHks6oCLhc7s2bN6Ojo+/duzd48ODg4OBNmza9+pO0fNHU1Hz69OnJkydfPdWBz+czGAw1NTX6y3c4iKCgoMDOzm7y5MnvERYIaG5uXrFiBekUCmjUqFGurq7jxo07c+YMTvaQfSoqKlpaWsePH3/69Ck9oLphw4b58+f369ePdDQZZWNjY2NjExwcTO9+lpycnJSU9P333+fm5vr/l5+fH+aXSge+ywpLKBQmJCSIRKK+fftev379999/nzt3LkVRU6ZMmTdvHv0Yd3d30jEVRH19/a1bt65cuZKRkTFo0KBx48Zt2bKFdKi3VllZ2XrbzMzsXx+jqak5d+7cyMjI7t27v2LQvFu3bufOndu6deuCBQv09PTa/W5dXV3b9xIKha23BQJBfX29iYkJi8XCRSN5ZGZmNnLkSNIpFJOTk9OJEycCAwPv3Lmjra1NOg68ESsrK3r20YYNGwoLCymKOnPmDJPJHDlyJD7pvoyhoeHAgQPpq5YtLS2PHj169OjRr7/+mpycPGLECDMzM/ogIPTkzoO/mgqlpaXl9u3bIpFo+PDh58+fj46ODgsLoygqJCRk6NCh9GOU9pT4ziCRSK5cuXL16tXU1NSJEyd+9tln8ru7q1gsppcU086fP/+yH139+vWLi4v78ccft23b9rJXCwwMnDZt2pEjRxITE8eMGTNp0iQtLa3W333xUFgnJyeRSKSqqioUCvFXVK7dvXtXU1MzICCAdBDFpKam9vDhw379+p07dw7/p8gXZ2dnegFV7969z5w5c/v27eDg4PT0dFwHeTV1dfW2c5DorcAPHDgwb968gICA3r17+/n50fOioQOhHsg9iURy+fLlsrKymTNnPnz4MCoqatKkSRRFjR8/fvz48fRjVFRUSMdUNDExMVeuXLl+/fqIESMmT57cbtcdecRgMNauXdv65asnRM2bN2/u3LkHDhwYMmTIyx4zZcqU/v37Hzt27Pjx47dv346MjGz9NPPZZ5/Z2tq2PvLAgQMikYjP52tra7dtESCPzp49O2LECNIpFFxMTMyQIUOOHDnyslE+kGUWFhatY/h37tz54osvTp48aWhoiGvhb4I+bW3GjBkURSUmJiYnJ2/bti07OzsoKKhnz55BQUF2dnakMyoC1AO5JJFIrl27lp6evnTpUj6fHx8fT0/X6927d+/evUmnU2S5ubnR0dFHjhzx9/cfMWLE5s2bSSfqMAwGo/XyzGvp6uouWbJkzZo15ubmr3iYpaXll19+GRwcvGrVqpMnT7auMPbw8Gh7wezkyZM8Hg+TJRRDcHAwNumXghs3bkyfPn3Xrl0dfqo6SNPcuXOnTJmioqIiFAq/+OKLTz/9VH6HoKUvICAgICAgPDycx+PFx8ffv3//2LFjAoGgT58+ffv27d27N66NvjPUA3lCn627ZMmS+vr6mJiYvn370nPBsdudFJw/f/706dN8Pn/q1KmXLl3Cj+Ru3bqNHDny4MGDr32kr6+vs7Nzbm7ui7/V0NBAfyfxj7jCaHssBnSq/fv39+rV68GDB6SDwHsxMjKib0yfPj0mJqZHjx7FxcXGxsb4KfPmNDU1BwwYMGDAAIqiSktL7969e/LkySVLlowcOdLDw2PAgAEWFhakM8oZ/EiWdaWlpQcPHmSz2RRFXbhwgd5v3sDAYNOmTRjBlwI2m/39998HBQU9evToq6++Onbs2OjRo/GvNi08PPxlcxtEIlFTUxN9u7Gxsby8/MVN/ZqamtTV1TGerkj4fH7bk1KgU6mqqh48eLDdcSUgv4KCgpYuXUpPEBg1atStW7dIJ5JLlpaWkyZN2rZtW0JCwqhRo4qKisLDw0NDQ3/++eesrCzS6eQGRg9kkUgkunXrlo2NjYuLy6FDhzQ1Neniu2nTJtLRlEhsbOyff/7Z0NAwduzYu3fvYouJF2lqai5btmzZsmUv/lZCQsKWLVuCgoKMjIzi4uLq6+s//PDD1t/l8/n0ptfSzQud7smTJ9evXw8PDycdRFm4u7tPmjRp/fr1bRcOgbyzs7O7ffs2/Vn2l19+6dGjh7+/P+lQcqlHjx49evRYvnx5Tk7OnTt39u3bl5mZOXjw4MGDB2MO5KvhE48MKS8vr6mp8fDw2LBhQ1NTE30VYfny5aRzKZ1z5879+eefVlZWs2bNwgYsr+bm5kYvhW9HX1/f0dExLi6OwWA4OjouWLDAx8eHRECQKk1Nzbb7X4EUfPjhh9nZ2cePH8chIQrGzc2NoqgBAwb8+OOP33zzTZcuXXCV6p25uLi4uLhQFFVWVhYdHb1jxw5dXV0bG5thw4Z5e3uTTieLGBKJhHQGZVdZWWlmZvbgwYNvvvlmzZo1PXv2JJ1IeR07dmzv3r0DBw785JNPFPhQdz6fX1dXJ/33pc9Opjcwfe2D9fX1ccgowBv64osv5s+fT38AAsXD4/EkEsn8+fMjIiJeceAMvLnq6urr169fu3atqqpqwoQJgwYNsre3Jx1KhqAekFRWVrZ48WJfX9+IiIiGhoYXT48CqTl8+PCuXbtmzZo1btw4AwMD0nE6F5F6IBaLuVwufUT3m0A9kEePHz8WiUSYCCF9aWlpkZGRb7JVAMivlJSUu3fvzps3r7q62sTEhHQcBVFWVnbnzp0TJ07o6uqOHTt29OjRampqpEORh3pAwKlTp2JjY7du3VpaWtrU1IQrAWSdOXNm165do0aNmj9/voaGBuk40kCkHnC53Lda0o16II8iIyNtbGymTJlCOogy2rhxo6en57hx40gHgU53+vTptLS0NWvWYM+3DpSWlhYVFXX06NFhw4aNHz9eyY9aQz2QnuvXr/v6+nbp0mXPnj1DhgxhsVikEym7e/fuHT161LxGy4kAACAASURBVMLCYv78+YaGhqTjSI+U64FEIpFIJG/7Ywz1QB5dvXrVwcHB1dWVdBAl1b1794cPH5JOAdJw/vx5FxcXen0CdKyLFy+eOXNGU1Nz6NChY8eOJR2HDNSDTldXV2dgYDBv3jwDA4N169bhE48saGxsXL9+fVNT0+rVq7t06UI6jrRJsx4IhcLGxsZ3mK+FegDwtq5evZqVlbV48WLSQUB6QkNDf/rpJ0tLS9JBFE1eXt6RI0euXLkSHh4+efJkZdvQHPWgE5WWlq5bt27ixInDhg1raWlRV1cnnQgoiqJOnDixc+fOtWvXhoSEkM5ChtTqgUQiEQqF7zaPE/VAHl26dKl3796txzyB9A0ZMuT48ePGxsakg4CU5OTk/P7775s3byYdRDHx+fzTp0//8ssvI0eOnDFjxsuO+lE8mLXWKaKiouj1LnPnzh02bBhFUegGskAgEMyaNauqqurOnTtK2w2kSSKRYI2XUtmzZ09zczPpFEptypQpx44dI50CpMfFxYXuBn/++SePxyMdR9FoaGh8/PHHt2/fdnZ2/uSTTzZv3lxbW0s6lDRg9KAjSSQSPp/ft2/fNWvWtD0ECmRBTEzMV199tWvXLhxlIBaLhUJhZ7/L119/PXPmTGtr63d7OpPJxKo7ufPrr79OmzZNSZb4yyYulztq1Kjbt2+TDgLSVlZWNmPGjCtXrpAOoshOnz4dExPj6Oi4aNEi0lk6F+pBh9m5c+eoUaO6du2KGREyaN++fampqVu3biUdRFmkpKTo6+sr8NkRADJr3bp1gYGBo0ePJh0EyEhMTMRVsE518ODB8+fPT5s2TYEvBOPiXMeIiIjQ0dFxcHBAN5BB//d//2dsbIxuIE2+vr7oBspGKBTu37+fdAqgQkND4+PjSacAYrhc7rZt20inUGTTp08/fvx4SkrKvHnzSktLScfpFKgH7+X+/fs//PADRVGbN2/+7LPPSMeBf/Hpp58OGjQIe4FL06RJk6qqqkinAGnjcDjHjx8nnQIoNze32NjY+vp60kGAjAEDBhgZGYnFYtJBFBmTyVy7du28efNmz5594sQJ0nE6HurBu2toaPjzzz9nzZpFOgi81Nq1a2fOnDl06FDSQZTIuXPnJk6caGpqSjoISJuGhsbs2bNJpwCKoqhevXrFxcWRTgHETJs2rbGxsaKignQQBefl5XXhwoW6urqFCxeSztLBsPbgXSQkJEgkEj8/P+xHJMtmzZo1c+bMHj16kA4CACBVFy9eTEhI+Oabb0gHAZJ+++03gUAwd+5c0kEU3717944cObJ27Vpzc3PSWToG6sFbS01N3bVr1549e0gHgVfZsGFD3759Bw0aRDqIcklOTqYoys/Pj3QQIKCkpOTGjRuYZikLqqqqwsLCrl27RjoIEFZWVqalpWVoaEg6iOLj8XiTJk369ttvvby8SGfpAJhc9NaYTCa6gYw7evSolpYWuoH0LVu2zM7OjnQKIOPZs2f3798nnQIoiqJMTU3t7e2LiopIBwHCunbtKhKJpLCTNWhqal64cCEyMpLD4ZDO0gFQD97Czp072Wy2u7s76SDwKiUlJceOHVu2bBnpIEqnpKRk/fr1ODFXaVlaWoaGhpJOAc/p6enl5+eTTgHklZWVhYeHk06hLA4ePLhgwYKsrCzSQd4XJhe9qd9++23w4MGOjo6kg8BrrFy5ctasWfgvBQDKbPfu3RoaGvhcCBRFXb9+3dzcHNM+pWbhwoVr1qyR63UIGD14UzNnzsQnTtl37do1FRUV/JciYunSpRjCVmZsNvvq1aukU8BzLBYrLy+PdAqQCUOHDkU3kKZ169Zt3ryZdIr3gnrwehcvXsSJWvLi4MGDS5YsIZ1CGSUnJ3M4HCaTSToIEMNms2NiYkingOccHR0xuQhanTp1CkuDpMbU1HTgwIFy/dER9eA16uvrL1y4sHjxYtJB4PWio6OtrKzMzMxIB1FGFhYWmzZtIp0CSHJ2dh41ahTpFPCclZVVeXk56RQgK9zd3Xfv3k06hRIZO3bsgwcPsrOzSQd5R1h7AIpj8eLFEyZM6NevH+kgAADk9ezZMyYmRk1NjXQQkAn5+fmWlpaampqkgyiL+Pj4s2fPfvvtt6SDvAuMHrxKXV3dqVOnSKeAN1JXVycSidANSFm5cuXTp09JpwCSkpOTL1++TDoF/I+RkVFtbS3pFCArHB0d0Q2kKSgoSFVVVU4HEFAPXuXUqVM4k1xeJCUlaWhokE6hvGJiYkxMTEinAJLy8vLoc/FARqAeQFt///33d999RzqFcunZs2d0dDTpFO8C9eBVLC0tP/74Y9Ip4I0kJSUFBgaSTqGkhELhH3/8getSSs7Ly2vIkCGkU8D/GBoaKsYJTdAhrK2tU1NTSadQLr169bp06RLpFO8C24y8yogRI0hHgDeVkZERHBxMOoWSYjKZLBaLdAogzNXVlXQE+AddXV0ul0s6BcgKNze3DRs2kE6hXExMTHR0dNhstpOTE+ksbwejBy9VWlqKZf5ypLCw0N7ennQKJcVms7FtEaSnp8fGxpJOAf+jrq7e0tJCOgXIEBwKJH39+vXLzMwkneKtoR68VHFxcXp6OukU8Ebq6+s9PT0NDQ1JB1FS1dXVWJcM6enpd+/eJZ0C/gf1ANqZN28elqNImb29fVlZGekUbw314KVsbW0///xz0ingjdTU1ODjKUHu7u4rVqwgnQII8/b27t+/P+kU8D8aGhp8Pp90CpAhT58+bWxsJJ1CuZibmxcWFpJO8daw9uClLC0tLS0tSaeAN8LhcDB0QJC+vr6+vj7pFECYu7s76QjwD2pqagKBgHQKkCG//PKLqakp6RTKxdjYOC8vj3SKt4bRg5dKTk6+ePEi6RTwRpqamiwsLEinUF4JCQnbt28nnQIIe/To0e3bt0mngP9BaYd2LCwsmExcF5YqfX19dXV10ineGurBS5WUlDx8+JB0CngjPB4PU2wJqq+vLy8vJ50CCMvKykpISCCdAv6Hz+djchG0NXPmTHmc6CLXtLW1DQwMSKd4a/9eIgWChpaWBqmHkS2Ojub6+kGNjaWkg8DrNTdXGxpqy+x/LB0dBZ+lFhQU5OnpSTpFp/t/9u4zrqnz7QP4lT0JG8JeCgiKoLjFiVZxDxwVZ9Va66rW1lZbV7dtta3+3VpbW62zbq3VKk6c4EAEZG/Czl7neXF8UooEQYGThOv78UXIOPkFOXfOdc49VKpyrVZOdQrT1bq1i4uLyGR3wxZIo5EyGBrT/B+h05k8ntMrvJAgtHI5Llf6iioqyqqq8mQy8zuZbb50Ol1xcaFp7oYAwOHYMpm8F++nEQTx4r2PH29NSfmdxeI3SzaEXteDB8rkZNXYsaZYoMvlxdHReBnKEsTFLS8ouMlk4jcrMg8XLsiYTOjdW0B1kJoIQk+nc6Oijr3Ca6XS7LNnx/J4dk2Qy/KVlGitrRlMJo3qIC2ITkesXStZtcqR6iC1UKmqOnb82Msr6sWHjHZBa906Kjh4XBMHM2lJSWk5OYWRkd2oDoJejk6/IpPdGzJkIdVBanHwYDTVEZrcnTt3bt68OW/ePKqDNLnQ0CleXjg5T+0eP06tqpJ17dqe6iDouezsPxgM+pAhJtcESaUFsbFfvfLL+XynqCgc7ITMA0EQW7e+M2SIKa6jFRe30dhDOPbAqNTUrKtX71GdAtULm810csKTSZSpqKjIysqiOgWi2IMHT7HNNCnW1kKBoJZuA6jFev/9ddnZ5jcHv1mj0Wju7s5Up2gwHMBuVLt2/mIxzv9lHtRqbVFRKdUpWq6OHTviktUoLKxN69ZeVKdA/6qqktXWfRi1XNnZBSoVTuPR3O7eTaQ6QoNheWCUl5erl5eFjyhFqFHY2NjguhMoMNCX6ggIobrs2LGGz+dSnQKZAexcZNTDh8n79p1au3bzG2/M6tZt4pAh73zyyY+VlVIAWLLkm5iYRlgjVqfTxcc/qXEnQRCnT8fOmvVpnz5Tu3WbOHHi+ydO/PP67/X6ak2LEDn2YONGo10YUQuRkJB05cpdY49+/fWOP/+8UP2ep0/Tw8Oj63iJwaNHKfU55bl27ebw8Oha/x06dK7Wl/TpM3XDhl9eumWELIOVleCff26RO0XnzuMHDZq9YsUP6ek5r7Y1lUrds2fMqlWbqt85ceL71ffWHTsOhYdH5+YWvvjyY8cuRka+VVBQTP4olcqSktJeLQlqdHj1wKiEhKQtW/7Q6fQDBnR3cXGUSMqePElr3H6ca9duSUx8duDA94Z79Hr9J5/8dO7cVW9vt+HD+xIEceNGwvbthwYPjqB8KZMX0yJEwrEHCAASE5/l5hZFRHQ09oTvvvs5PDzY3b1hKxieOPHP6tX/+/vvnRzOS+aM6tu3s4vL8+lBtm8/FBDg3atXOPlj27atG/SmCFmk999fFx4eDAAjR/b38nLNzy/+66/rly7d3rBhWXh424ZujcNhd+7c7ubNBMM96ek5KSmZt28/7NnzeTtw/Xq8j4+7m1stne85HJZQyKfTn5+nnjDh/YiIjngR0kRgeWDUuXPXtFrd9u1r2rXzb/SNEwRBo9FePB/2228nz527OmFC1HvvTWEwGOQ5++zsgmaoDchIdTzhlTssvnTLyNx16dKlTZs2VKdAFAsK8qv70F+hUH766cbt21eTjVs91b/l6dmzo+GgZOfOw4GBvjNnjq3/G70CbNyQecnOLmjfPgAAhg7tHRraBgBmz46eOfPTjz/ecOzYRh6vwf2OevUKj429k5qa2aqVFwBcvnwbAGJj75J7YlWV7NGjlJiYYTVeRe44gwZFDBoUYbhTrda82ofC3bApYHlQu8TE1CdP0saMGVhHbbBt24HDh8/rdPrIyK6LF08lF80+fvzigQPnUlOz+Hxut26h778/zdbWGgD+/vvGsmXff/vt0l9/PfH4cerUqSMKC0vOn78OAOHh0QBw/PgmJye7PXuO+fp6GGoDAGAwGN7eboY3VSpVmzb9fvbsVZVK4+XlMnny8IEDewDA77+f/Ouv65MmDd20aZ9EUhYY6LNixRzDC+/cebRx4+/JyRl2dtadOrV99903HRxsAWDcuMV+fh5+fh77959RKlVnz25NTc3aseNwfHwSAAQHt1q0aHKbNn4AsGrVphppXV2dtFrtli1/nDx5uby8ysfH7e23x/Xp0/nFD/vJJ3OGDOnd5P9niDpCoVAoFFKdAlGsffvAup/A43EzM/P27Dk2Y8boFx999Chlw4ZfEhOf8XjcXr06vvfeVJFIeOLEP199tQMAIiPfAoCVK+cOG9YXAPLyir7/fk9c3AMOhx0Y6DN37oSgoFbG3rewUPK//+2/du2+VCr38nKdPn1k9YMSg5UrN166dPvXX7/09HQFgEOHzu3de7KoqMTV1WnQoJ6TJw/ncNjl5ZWRkW8tXDj56dP0S5duBwb67Nix9pV+WwhR4OuvFz958qz6PTY2onnz3nz//XXnz18fPrwfAJw6dXn37qM5OYUODjajRkVOnz7KcIL/2LGL+/efzsjItbIS9OoVPnfuhJ49OwDAjRsJZHkQG3uXRqMZegzGxT3Q6/URER1f3HHc3cUnT14CgJs39zGZzKFD3yktrTh48NzBg+fEYoeTJzfXccBT4xjjp58+7tSpHRW/TkuG5UHtbt16SBCESGR0NZmkpDQul71gQczTp+m//37KwcGWPE318GGKt7drVFREaWnF/v1nZDL5hg0fGV719dc733134jvvjPf0dFEqVYWFktzcojVr5gGAg4NNUlJ6eXllTMxQY6fW9Hr9e+99lZdXPH36KDs76zt3Hn/88QaFQjViRD/yy/XXX4+vWPG2Vqv7/POtK1du3LPnSwC4devBggVfRkVFjB8/uKKiat++03PmrN6792sulwMAN27EK5Wq9es/lMuVQqEgL69YpVLPnDmGTqcfPHhuwYIvTpz4H5fLmTFjVI20APDZZ1vPnLkyY8ZoPz+PM2euvP/+uu3b14SFtanxYVu18myC/yJkQuLj42/fvj1r1iyqgyAqxcc/qaiQ9u7dydgTeDzOBx+8tWLFj927h9boQpCWlj1nzmo/P4+VK+eWlVVu2fJHQYFk8+aVPXqExcQM27v3xIYNy4RCvqenCwBIJGUzZqzw8BC///50Gg1OnYqdOfPTX3/9ys+v9qZGq9U9fpw6duxAGxurixfjVqz40cPDJTj4P+XEkSPnT526/O23S8naYNu2A3v3npwwYbCvr3tGRt4vvxzPyspfs2Y++eSdOw9HR7+xZctKBgPH7yFz4u3tlppasyMo2d3o0aPU4cP7nTx5adWqTYMG9Zw7d8LDhymbN+8HgLfeGgMAW7ce2L79YGRkt0mThpaWVty4Ec9isaysBEFBfjduxE+ePLysrOLhw+To6DcOHDiblJQWGOh7/fp9kUgYEuJfVSWrsePQaDS9Xn/6dCyZ4Ztvlsyb93nHjkGTJg1ls1kvPeCpfowRGOjT7L9Iy4flQe3y8yU0Gi0np5bBNCR3d/HWrasYDMaQIb3T03PPn79BlgcffzzbcJGLyWTu2nVEpVIbusyOHz9o6NA+ho3Y2IhKSirIC3wAkJ9fDAC+vh6GJ0ilMplMAQBcLsfa2urixbj795NOnNjk6GgHAIMGRcjlyn37Thn2lvXrl9nb2wDAhAlR69fvqaiosra2Wrdu9+jRkR988Bb5nK5d248du+jGjfi+fbuQIb/4YpHhkuLgwRFRUc9XfQoK8pszZ3V8fFLXru09PV1rpM3IyD158tLMmWPffnscAPTv33XUqAVbtx7YsmVlrR8WWbCSkpKUlBSqUyCKPXmSlptbVEd5AACRkd0uXbr1ySc/7d37dfX7d+48QqfTN25cYWUlAACRSPjppz/du5fYoUMQOWV427atbWxE5JN37DhkZ2e9efOnZK/LqKheI0fOP3r0wvvvT6/1Td3cnA8eXE+2zCNG9IuMfOvSpVvVy4OkpLRvv909Y8Zo8vpncXHprl1HP/98Yf/+XcknODrafvnldsP227Xzf/fdN1/7F4ZQc5sxY0VkZNcadwqFAiaTWVFRRRDEpk37QkMDP/tsIQD069e1slK6Z8+xiROjpFL5rl1HoqJ6GYrkKVNGkDd69QrfufOwUqm6cuUul8uZN+/Ns2evxsbeCQz0vXEjoXv3UMMZzxo7jq+vu+F2UFArJpPh4GBrOMZ46QEPHmM0KSwPakcQBEEQhu+GFwmFfMNfvJ+fx8OHyeRtjUazf/+Z06djCwokXC5Hr9eXlVWIxc9Hy3XuXNf1L71eDwB0+r9d6H777dT27QcBYMCA7l9++d7Vq/e0Wu3w4e8anqDT6YVCvuFHHo9D3nBxcSC/5ORyZXp6TnZ2wdGjf1d/r8LCEvJG27atqnc3pNFo//wTt3fvyfT0HHL6s5KS8lrT3ruXSI4FNLywa9eQ06evGJ5Q94dFliQ0NNTd3b0eT0SWLDi4lYfHy4cdf/jhzHHjFv/002/Dhv371X737uNOndqStQEAdOvWnhzr3KFD0ItbuHbtfmFhSUTEZMM9Go22sFBSx5smJ2ds3XogMfEZ2WxWb9aqqmQffvg9m82aNev5QIW4uAdarXbFih9WrPiBvIdcPaCoqITslomNGzJTMplCp9PXuFMuV2i1Wi6Xk5WVX1xcOnnyv0MFunULPXbsYlZWfkpKpk6nGzt24IvbjIjouGXLH3fvPo6Nvdu9eyifz+vbt3Ns7N2+fbsUF5eSvY9IDdpxXnrAg7thk8LyoHb29jb1X+iOwWBotTqyqFi06KvExGezZ0eHhARcvBj3yy/H9Pp/l6Xh8+ua+Igskatfshg0qGdQkN8nnzxfPb6kpNzBwdZwep7EZNbSE4nFYlX/Fpw9O7pfvy7Vn0D2DiJ7A1e/f8eOQ1u2/DFxYtT8+W9KJOXLln1PFi0vkkrlAGBnZ224x9raSi5XyGTy+nxYZEns7e3t7e2pToEoFhISUJ+nWVkJVq6cO3/+59XXnZRK5ba2IsOPIpGQPMFR6xZKSsojIjrOnz+p+p3VjxtquH374fz5X4SHB69cOVcg4C1d+m31ZvnEiUve3m5yufLw4fMTJkSRnZcAYMOGj5yd//NX7e7uTLZ7hhMxCJmXL79c9PRpeo07c3OLAMDFxfHFr3Wyi3VRUSl5LFFjjyAFBPg4OztcunT75s2E5cvfJi8SHjt28dixi3Q6vXv3MMMzG7TjvPSAB48xmhSWB7Vr3z6AIIjNm/f/8MPH9X/VvXuJt249/OyzBeS4t6ysly9dTlRb0zIoyI/L5Zw9e5X8ijIszUb2wyO/MsvKKl1cHF86wZ8BeTZOqVRVH99sjEql3r376MiR/ZcsmU6O56sjrZOTHQBUVFSRVQ25MzOZTHJIQzNjsZjkWT1Eidu3b9+4cWPBggVUB0FUevQopbJSWv1owJiuXduPGTNg06bfDfc4OdlVVFQZfiwtrTA0X6TqjY9IJCwvr6pPm0baseOwu7vzhg3LyM5INc6JuLo6bd26cvv2Q1u2/PHGGz1sba3J4oTsqF3PtzBNAgGPwGWTUTW+vh5paTVXOSDXVurQoQ159F9eXnNPFIkE5M5YUlLu7OzwwlYhIqLDn39eBADyWkGnTm2tra0OHjzXvn2AYW+qjxq7eUMPeExWSEjjT4DZ1HBYVe06d25nb29z40ZCYmKq4c6bNxOkUlkdryJ3KsOQu/LySkOXoVrxeJySknLDE7hczsiR/R89Stm69YDhOTqdznCiq3Pndjqd7tChvwyPKhTKuj+Ip6eLWOxw/Pg/hmdqtVqNpvbpwxQKpUqlbtPGkL8KAAzvXiNt27ataTTa1av3yB/VavXVq/dCQvwbNGVhY9FotOQJP0SJysrKnJxXXFgHWYyHD5OvX4+v55MXLZpS/epBSEjA3buJSqWK/PHChZsAEBoaaDiaLy7+dwfv3LldQkJS9QlY6m4Jy8ur/P29ydpArVbL5YrqzXKfPp1sba3nzBnPYDA2bdpHHtzQaLQ//jhTz+2bLJlMIZOZZXLURObOXVNUVFL9nsTE1CNH/vb19ejUqZ2Dg62Li+O1a/cNj/79900ulxMQ4EOuikDWACStVmu4HRHRUavVdujQhqwimExm376ddTpdHaugvIjH40ok//b6e4UDHpP14EEy1REaDK8e1I5Opy9ePHX16s1vvfXJwIE9XFwcU1OzLl26NXfuxFqn5CO1a9eazWZt3Pj7qFH9U1Iyd+/+EwBSU7OMzQXeoUPQ8eP/fPHFttDQQJFISE4TlpCQtH37wcuXb3fpEqLX669fjy8vrySXY4uKijhy5PwPP/yal1cUGOiTnJzxzz+3Dh3aUMcJexqNtmTJtKVLv502bfnYsQN1Ot3Jk5ejoiLefHPoi0+2sRG1auW5f/8Ze3sbqVS+bdtBOp2empppLO3Qob23bj2g0+nd3Z2PHr1QUlK+du38V/p9I/MWFhbm5eVFdQpEsXbt/L28XOv5ZC6Xs2bN/Lfe+oT8ccaMUefOXZ0///MxYwYUFEi2bTsYHt62Y8dg8loug8H49tvdw4f3VanUY8YMnD07+urVe++++1lMzDA7O+vr1+/rdPrvvvvA2HuFhwefOHHp2LGL1tbC3347WVkpe/Ysu8Zc6SKR8J13xn/99c7RoyODglpNmDB4377T7733VZ8+nSWSsgMHzv7ww0e4YBMydyUlFeTYg0OH/oqLe5idnX/hwk2BgP/55wvJ3eHtt8etWrVp7drN3bqF3rr18NKlW7NnR/N4XC8v11GjIo8cOV9RUdWtW2h5eeXhw+e3bl3l6upEVtRcLocc2U+KjOz2558XGlQehIW1OXv26s8/HyUnO3qFAx7UiLA8MGrQoIhWrTy3bDkQG3tHqVR5eLgsXDh54sSoOl7i5GT/+ecLv/vu5w8+iA8J8d+6deWWLX/s33+m+j5TXVRUr8TEZ6dOxV65cnfYsD69eoXz+bxt21bv2XPsr7+u7dt3mstl+/t7jxs3c+TIfuSIgk2bVvz00+/nzl07cuS8p6fr2LEDax17UF3fvl02bFi2ZcuB7777WSjkh4UF1jraj/TFF4tWrdr00UfrPT1d33tvSnJy5r59pxYsiGGxWC+mXbZsllDI/+OPM5WVMj8/j/XrP8S5h1smOzs7Ozs7qlMgijV0ZeJ27fynTRtJ3vb0dN24ccVPP/22evX/+HxeVFSvRYsmk8cr7u7i5ctnb9q079tvdwcG+o4ZM9DdXbxr12cbNvyya9cRGo0WGOgzfvzgOt7onXfGSyRl69btEomEo0dHxsQM++KLbXfuPKrRXo0ePeDIkb/Xrdu9a9dnixdPc3Z2+OOPMzduJDg42PTt25nsTomQWTOMPTh79iqHwxaLHcaOHThlyghDJ+GhQ/solarffjt56lSso6Pt/PmTDDMUffTRLFdXpyNHzl++fMfJya5bt1DD4Qebze7atX3v3uGGN+rUqW1wcCsfnwZMWbFgwSSJpGzHjsO2tqLFi6e2auX1Cgc8qLHQau2Y+PjxVoDy4OBxVESiWEzMhzQaTafTKRRKvZ4QiYQ6nU6hUB09+iPV0ZBRZ85cuXbtHjkXm6k5eDA6OvoO1Sma1oMHD+7fvz916lSqgzStuLjlYrG/l1cvqoOYlokT32exmGq1RqlU02hgZSVQqzUajfbw4R+ojtbSbdt2gCCAnHvapEilBbGxX0VFHXul12bHxs6NisJv5AYYO3YRi8VkMhk0Gk2j0ZJjfFks1q5dn1EdrUUID4++c+cg1SlqERe3USzu7+VVy4lvvHpQk62t1fXr8YaLzrm5heTYNapzIWS6iouLHz9+THUKRA0+nxsfn1S9o06NGc0RQtRSKlUZGbk17pw8eThFcZAZwKHJNU2bNvLFgfajRvWjKA5CZiA8PHzevHlUp0DUmDq1ZpvJ1T4qdQAAIABJREFU4bAnTKirtw9CqDmFhATU6Cri7u48ZQqWB8goLA9q6tixbdu2rarvSK6uThMmDKE0FEImzdra2tPTk+oUiBq9eoUbpjsjubk5jR5dy/JJCCFKxMQMq9EJYvDgnra21sZfgVo6LA9qMXXqSHv756uGMRj0ESP61ZgnGyFU3aNHj/bu3Ut1CkSZSZOGGtYoYLNZdQ8URgg1s6Agv3btWhvOe3p4iMePx5OeqC5YHtQiPLxt27atyNseHi6TJg172SsQatEKCwsfPHhAdQpEmR49OgQEeJO33d3FY8bgpQOETEtMzDAXF0fy9qBBPW1srKhOhEwalge1mzJlhL29DZ1OGz68L5dr9iv2IdSk2rdvP336dKpTICpNmzbSykrAZrPGjcPaACGTExTUqn37APLSwbhxeH0PvYTJzVwk1+mrtDqqU4BLm1atO7XLyiroM3JAoar2NYabE5MG9mwW1SkQqp2Dg4ODg0M9nogan1Srk+mMLs3ebHzCgn07BJeVVUYM628KbSaLBnbYZiLTUKhSA9Dq8cSmFfXm0FvJmb2ieqv5PMp3UoIAJw6TTqP+14JqZULlweHc4qP5Ej0BeqhlKQYKjIjkA3zwNIPqHAAAblxOplzZ39HmHV83qrOYIjab6eBgS3WKFmfYsGF5eXnkbcMatARB3Lt3j+poLcL+7KITBSUsOk2pp748AACYMIQNsPBBCtU5AADEHHaeUj3A0WaWT30XcrY8QiFfbyJ/Gy1SpUa7NSPviqQySMTPkCupjgMAYP/BrDiAOBPYSa2YjCy5qp1IMM7NsbOdiOo4TSssrA3VERrMVMqD71KypRrtm+5OeL7HGKlWly5XTL3zZHuHADYde4X9h1qtlUjKqE7R4kydOnX9+vUqlQoADLVBly5dqM7VInyWlMmkQYyHky22mUZUarRpMuWc+OSN7VszW+RJSqlUXtvCp6g5FCrVs+8nv+nhtKSViMPAr+zalajUe7IKStSawWJ7qrM0ofv3n1AdocFM4k/225RstZ54Q2yPtUEdhExGO5FwiNj+7fvJVGdBCABg7Nixrq7/OTUrEommTJlCXaKWYm1SJo9B7+Noi7VBHUQsZqiNsLutaH4C9edKUYui1Oln3Ev6KMDTi8/F2qAO9hz2ZE/x2cKys4WlVGdB/0H9X+3jSlmlRtvX0YbqIObBk89tJxIcyS2mOghCAADjxo1jMv+9CNmmTZtu3bpRmsjy3SmrJAiihz3OWV4v/lZ8Xz7vTEEJ1UFQC/K/tNyJ7k71eCICAJjo4XS6oERlAmOokAH15UGKVNEir/q+OmsWM6FSSnUKhAAAoqOj3dyej4cRiUSTJk2iOpHleypV0LDRbAghk5FQKaM6BWpBbpVVOXJwzsMGUOj1aXIF1SnQv6gvD0rVGhcOh+oU5kTMZWuwPykyGWPGjGEwGADQunXrHj16UB3H8lVoda48PPJoADGXo9Jjm4maiVyndeKyrFmmMrbTLLQScHOVaqpToH9RXx5UaHV4sNsgegJyFbgXIVMxYcIEd3d3Pp8fExNDdZYWoVyt0eLBbkMQhD5foaI6BWop6DR6uswk5ikyI5VavQabNVOC1S1CLUu+UlahUQda2cZXFCdUlBSrFO2tHQY4eSRUSM4XZb/a7S7jR1+tkigDvAHgdbaTUCG5WVbYTmTf3U6cLC0HgvC3wvlqEUIIoWaF5QFCLYJUqxEyWYdyU08XZOgBGDRamVpVpdPQaPBXUda6lHsEwKvfdhPSaMJ1KfdedzsAAHAyP92WxaHTaHqCCBbZLWoVKlEp3HhCqn+FCCGEUIuA5QFClm9z2sMHlSUVGpVE/Z9L3jQAwyKEpnIbQKXXFajk5D0FxfIrJXnefJE9m/uhfwc+A+fxRAghhJoWlgcIWbJ8pexCcfaZgkwloaM6yytS6/XJ0nIAWJd8f5JnQCsBTuiJEEIINSEsDxCyWHsyn9yvKE6sspD1pK+V5j+skHgJrL5rF0F1FoQQQshiUT9zEUKvj81mOjriGNb/uFNWeKYww2JqA1KlTpNUWbY+NZ7qIAiZASsrgVDIozoFQi1dWFgbqiM0GJYHyBKo1driYos6Dn5NMq3mh2cJpRoLnABXA8SZwsxbZYVUB0HI1FVVyaRSXGoKIYrdv/+E6ggNhuUBQpbmQmH2u/GXClWWfFiw9smtZY+uUZ0CIYQQskBYHiBkUYpU8t3ZT/L+f+YfS6Ui9Jly6T/FOVQHQQghhCwNlgcIWZRKjbrIoq8bGJRolElS7FGGEEIINTIsDxCyHAVK2Q/PEqhO0XxO5Kfvz0mmOgVCCCFkUbA8AGlF5Z3Lsa+zhcLsnCf37jdeIoRe0bb0x0+l5VSnMCph+de35y1vxA1qCeJ4fnrZf9d6Q00N20yETFxywsP8zOxXfrlGo4m7+I9apWrUUMictPTyoKSwcP6w0Ud27nnlLdz8++KScZPuXL7aqLkQajC5TqPSm/TaZ5XJz4Q+Ho27TSaNrjesuoyaHraZCJm43eu+XzNnXm5GxitvYfnkt35avkqjtsC571A9tfTyQKvWaDSa19mCQiZrvDgIvTo+g1WiMd3z6JoqmTK/SOjj1bibLVDJ5TqTLoosDLaZCJm419/FcCdFLWjVZI1ave+nzXH/XFLK5S6enqNmTPUO9F8ybhIAZKWkxnTrDQA/HDto7+R08+9/Du/YVZxfwGKyWrULmvDu216tWwPAmf0Hf/thY8deEXJp1bPEJ1wud+zbM3d9/R0AnDtw6NyBQ05urt8f2kf1B0Ut1J95aWmyyibauF6tydh3NP/sJWVhMcfBzmviSI9RgwEg47cj+eevtFk8O3nTz1Up6RwHu8DFsx26diBfVXT1Vsavh6tS0zkOdm5DIgFA6OvZ6Nk+fHTt905vNPpmUXF+/q/rf3pyL4FGp/kG+k9evJDL59XaZv6xedu1M39VlJUKRKL2Xbu8OX+ulY01AKz/cPnd2KsDxoxKvHuvMDc3MLR9z8GDdn71LbaZCDWKxLv392/akpOezhcK23bqOOODJXu+//H6ub8BYMOyFQDQe9iQWR9/oFapfly+8tnjRLlUau/k3Gvo4GFTJjEYDACYFTlEIZOOmD7l6qmzZSUlo9+aFnvqTJlEAgBvDxwKAHM+Xd5z8ECqPyhqbi2oPDi2+5e/Dh2xF4vbdgrOSE7RqNUcDje0e9f46zf5QmFI104AwOFwAUCr0ei0Wv92wZKCwodxtzNTUjcc3s/mcsnt3I294u7j3bV/Pxt7Ozcfb582AelPnoo93b39W1vb2VP9KVHLdVmS20Rb1ms095asrnic7DEmSuTvW3zjbtJ3W21Cgqz8vHRKlfRZxuOvNnqMjnIZ3Dd1669JG7b33L8ZAHKOnXuybrNdeEib9+dUPctM3bYXAASN3bkIAHQEkSWv9OSLGn3LLdzm1Z8nJzz0aRNgbWub8ugxT8Bnszm1tpmyikorG2tnD7fctPQrp89KKyuXrPvSsJ3zh4+27dTRzden24B+IltbbDMRahRyadV3S5epFMr23btoVOqMpGQ2l+sX1ObR7bslBQX+7dvZOTr4BbUBADaHIykoFLu70xmMZ0+SDm3byeZyoiaON2zqxC+/dY3sW1FSHjFkUFV5xcVjJzQqVXjvXkwWw9FVTOmnRNRoQeVBdlo6AIyZNb1X1CCtRkMAsFisyYvmx1+/6eAinrd2leGZPQYNMNTK5NmvxHv3Q7t3I+9xdHVZs2uroVroN2LYzidP23ftOvm9+VR8LISeE3MFj6tKm2LL6b8eLrv/qP2XHzlFdAEAnqu44K/LquISKz8vrVzB5PPCN37OsbMBgKqnz/JOXwAARX7R0x92OvXqGvL5hzQaDQCkzzKlaZls68Y/iHfjCu3ZvEbfLMpOTQeAhV+scRCLlXIFl88DgFrbzOkfLiH/l5VyxdLxMfHXbshlMr5AQD7aNbJv9Sdjm4lQoyjKK1AplE6urku/+4bc+wCg38hhSfHx1wsKoiaOD+8dYXjyl7/uInfSjOSUFVNn3jh/oXp5MHXxwv6jR5C3J783/9bFS2Uq1azlHwisrKj4ZIh6Lag8CO3e9W7s1d9+2FhZUtpv1DC+0OgffZlEcnzP3oe3bpcWSWg0AICivHzDo2E9uhtqA2QiaDRgMhlUp6CYF79J2nGCILKPnOGKnaz8vFWSUml6VurWvWxba5uQNgAgy8gW+HqStQEA6BRKlsgKAHJPnNdrta3nTiW/kABAK5UJfRq/ZxEAyPUaAZPVFFtu4cJ6dr1+7u91730wYtrkLpH96nhm+pOnx/bszUh6WllRQeh1BEGUFBTy/XzJR7vW+VrUdGg0GuDAfcvl5u3l5OpalJe3bvEHw6fGBLQPqePJcRcvnT94JC8rS6NSA0BxtaMaAKh7B0evic02v2+oFjQ0ue+IYVOWLCQI2P+/rYtGT0y4frPWp8mqqlbOmPP34T8FQquIQQNdPD0BQCX/d8Qnj4/nKU0OQYBW29LHpzbREsLK/CJNeYWmsurquLdjR864994qBo/bYcMaJp8HANL0LKH3v/2FZNl5fE83ACh/+ITr5MB3dyHvJwhClpnTROVBsUpxq7SwKbbcws1ctrT/6BH52Tn/W/XZR5OmF+Xl1fq05AcPV7899/616+6tfCKi3hBaWwOASvlvm8nl85sxNfoXQRAEVgeWi8Vmf/TT98Hh4Qk34tbOmb/+w+XGpiI9tff3jStWZaWmBXfs0G1APxqNplT8Zx4LLh7YNCW1+rWmc6BEC7p6AAADx46OGDzw8I6fz+4/uHnN5xtPHCHv1+v1hufcvhRbJpGE9+616Ku1APDn7l8yk1Ne2sQS1baAECW0RJP8Eeq1WgAImD/DNjRYUynlujgZrhVoZXJVUYng/w/6Cb1elpHtOrgfAKjLyjn2toaNVCWn6eSKphiXDAB0oLHotKbYcgvH5nKnL10c9eaEXV999/jOnb0bNi7+5gvyoept5sU/j+u0uimLFwyMHgMABdm55ZISbDMRagaOri4f/fTdk/vxW9d+eTf26oUjxwZPHEc+RFT7Rvjr4FEA+HTLjx6t/AiCuHzyDK0ehSOhx+Ky5WpBVw/UKlVpcTFPIIxZOI/HF0grKmWVVVyBAABKCorUSiW5FIhSLgcAJ9fnZz1THj4CAL3x6eR5AiEA5GdlkV+ZWq22GT8TQv8a5eLbFJvlOjsAjVaZnMb3cLUO9jfUBuSlAwAQeruTPyryCvUqtcDbAwBY1iJFQRHx/1OOZvx2BAAETVMehNs6hVo7NsWWW7jS4mK1Uuns5jrh3dkAkJ+VDQAvtpkKmRwAHFxcAEAhk+ekpgKA3vhss9hmItRYCnPzAKBNWOjA6NEAkJ+dDQA8gQAA8jKzyT0UABRyOQDYu4gBIO1Jkl6n09U5H/TzLWRlGbaAWpoWdPXgxvkLv3z/g39IO7VKrZDLXLw8re3tAMDJzbUoN2/p+Mk8K+GgcWMCQkIA4K9Dhwtzc0uLitOTnhq+yWrlGxRAZzAe3rqzLGa6Qir7+KfvnT3cm/eTIQQA0NfR/ce0B42+WQaH49SnW+7xczQGXRTYSvos07ptgHPvbgAgS88GAMPVA7JaEHi7A4BTz87Jm35+/MWPTr26FV2JK7p8AwCqd0NqRMEie8MIB9SIDmze/vDW7VbBQXmZWQDQpkMoAFjb2dZoMwND29+Nvbr9i68D27dPS3paWV4BAPmZWcZ6QmObiVCj0Ov1Xy1YzGKx3Xy8k+LjASCoQxgAtG7b9sKRY4e377pzOVatUn/928+BYe3vXbm2euY7Yk/PxDv3yNcW5OSK3d1q3XLrkLZ5mZnfLv7Q2cPNw89v9vIPm/3DIYq1oKsHVjY2Ynf3xLv3ctLSOvbqueTbr8j7313zqZd/64qy0rLiYqG1tU+bgFnLP7R3dn5wIw5osHT9Olcvr7QnScYKaCdX15kfLbV3ds7PzCT0ehaX07wfC6HnHlWWODbNBD5BS98RR0YUXria9N3W8oREvrsreb80PYspFHAdn89NSVYLZA3gET3Uc8KIktsJjz7bQOj1Tn26c50dmYIm6YN+sWkGXSBXLy8mi33/2g2FTDZgzKg3580l76/RZg6IHj144jg6nZ5w86a3f6vF33whEFk9jX9obLPYZiLUKFQKZZsOYRVlpfevXROIrKYsXkBOA9D9jciB0WP4QkFOappQJAKAaUvf69grorS4JPnBg97Do6YsXsDhcZ/cvWdsy+PmzArt3lWn0+ZnZllXu2KMWg5arT1EHz/eClAeHDyuGRKsT81h0midbHHyrPoqVmkO5RXv6RhIdRATcubMlWvX7n322UKqg9Ti4MHo6Og7zfBGFWrVOwmXJGrTXTi5KQgYzOmebYa7NknHqhri4paLxf5eXr2a4b3q8MXTTHs2q721kNoYZiRPofqrqGxLmD/VQZrbtm0HCALefrs5vsobRCotiI39Kirq2Cu9Njs2dm5U1I9NkKtxKPX66LjHywMaeYV4y3aioKSnvfVgZzuqgzSJ8PDoO3cOUp2iFnFxG8Xi/l5eUS8+ZGadiwiC+Oa9pcYepdFqr3YAwL9d21FvTWvKaM9dOHr8zuXYWh9isdgajbrWhwRWVvPWrmziaMjCWbM5s73bfpFcVynydOPuvJN/v3i/lb9vVXJarS/ptOWrRuwUdHvuR9K0Wrrq1RGg19GdDJ7RqYSHiH2apzYwU2qVav2Hy409amVjU1VeXutDJt5m2jo4zF6xrImjIdQcUh49PrJjd60PadQalvE5MactXezs5tqU0Z77+dvvC3Nqn7iMyWQaGz4UOXpkx149mzgaairmVx48jLtt7FEbB4dyiaTWh9icZrp+nZeRaSwhl8dXKuS1PkSOgkDoNfVxdLtWml/H8sk+k8d4jhlSywPGp0fnODbmH2e7NUsJTW3fJcYD0I13PuHRGf0cau87i0h6nb6ONlPs6VGQlV3rQybeZjob6TONkNmpLC0ztguwORxjc5UCgEIma8pc/0p+8DgrJbXWhxxdXWosoWAQ1qNbE+dCTcjMygM6nb73xmWqU9Rl8nvzcSlQRCGVrq55YNjWImiCdYvrj+vQmMWGNYvjyMHpuuvC5fOwzUTIlHXs1dPEd9IvftlJdQTU3FrQ0GSEWoIP/Dt6N83yyabGhcNf5NfeisWmOghCCCFkUbA8QMiiCJmsbWH9Ots6Ux2kaVmz2Ova9ehg60R1EIQQQsjSYHmAkAWKsHcRMS35tLovX+TEaZJpUhFCCKEWzszGHiCE6uMNZy8GjbYvOyVbKaU6SyPjM5h+Auuv2/agOghCCCFkmbA8QMgyRTp5thLY/JKVdLW09mklzJGIyV4f0tOD1yIGVyCEEEKUwM5FCFksb4HoQ/+OHjyhLdPsF6Zl0+hOHN4cn7ZYGyCEEEJNCq8eIGTJOAzGzg79b5YW5Cvle7OTqrQaqhM1GB2gq53L2z7BOoJw5+FSwQghhFDTwvIAIcvX1U4MAN58q4O5qUwaXabT5Cikcq1WRegAAGi0/1+SjDCF22wanU2nO3B4AULbNFlFBxvHyZ6BbDqjaX9HCCGEEALA8gChFiTMxjHMxlGr1zPp9LvlRc+kFV3txJ58q1MFGVmKqqFiHw+e0BRuny3M1Oh0fZzcrSx68iWEEELINGF5gFDLwqTTAaCjjVNHm+eLBgwRexseNYXbg5y9XvXDIYQQQuh1UT802ZrFYNFpVKcwJ3QAD57ZjzRFCL0aWzaTRcM2swFoNHDl4ZUo1Ez0hN5PwKU6hZmxYjDY2KyZEurLAwc2K1ehojqFOclTqdlYUCHUUtmzWLlKNdUpzEmeUs1j4NgV1Ez4DGaxSlumNr95ICiUIlN48PG8pwmhvjwIEPL+f0giqpdKjbajDc7tiFALFSjk6wg91SnMiUyrC7PGOa9Q8+lmJypSYXnQAEIm3V/IpzoF+pcJlAdWAjc+52xBKdVBzMPjCmm6XDlEbE91EIQQNUJshNZM1oXiMqqDmId7ZVVFKk2kky3VQVALMs/P7becIr0ey/h6+SWrINrNieoU6D+oLw8AYI6Pq6+AezyvOEeh0hN4KaF2RSr1/fKqhEr5jyGtqM6CEKLSe63d7VmscwWledgz07hCpfpWWWW2UrWurS/VWVCLc7RL0KqnWSlVinKNluosJkqp02crlDsz8qd7uvS0t6Y6DvoPU5m5aIqX+EJR2ZF8SbFKrdKZRIWgJwhCTzAYJlFBefK5Mp0u0tHmx/ZYGyCEYI6v65mC0uMFkgqNTqbVUR0HAECvJwAIOt0k2kw3Hket1w9wsl3g5051FtQSCZjMv3qEbEnLO5NV6sJlp0oVVCcCANBqdQwmwxQGL1qzGBUaXUdb4ZLWHoFW2K3I5JhKeQAA/Z1s+zvZEgAynUl81Z07d/X+/aRly2ZSHQQAgAl0rkns0QghUzFYbDdYbKcHkJtGm3n48F95ecXz50+iOggAAJtGxykcEOXm+LrO8XWVanVgGn+MM2asWLHibV9fD6qDAEGAFRMnDDBdJlQekGgAQtOYYoKjJ5garYmEQQihWtFNps1k6/Qsrc5EwiBkOoQmcxxMU6r5NBrupOilTOIqMEIIIYQQQsgUYHlgFIvFssH5QxFCqH44HLZIJKA6BULIKHd3ZxMZHYRMHP6VGKXRaMrLq6hOgRBC5kGlUldWyqhOgRAyKienEKdbRfWB5YFRHA7LwcGG6hQIIWQeeDyOrS1ecUXIdPn5uePVA1Qf+FdilEqlkUjKqU6BEELmQaFQlZXhFVeETNezZzl49QDVB5YHRnE4LHt7vHqAEEL1wudzbW1FVKdACBnl5+dBo5nGHKvItGF5YJRKpSkpwasHCCFUL3K5sqyskuoUCCGjnj3LJgiTWHkWmTgsD4yi02lsNovqFAghZB6YTAa2mQiZMoGAR3UEZB6wPDBKryfUag3VKRBCyDxotTpsMxEyZTKZguoIyDxgeYAQQgghhBB6DssDozgcNk5sihBC9cTn83ApSYRMGQ5NRvWE5YFRKpUaJzZFCKF6kssVuJQkQqYMhyajesLyACGEEEIIIfQclgdGsVhMkUhAdQqEEDIPHA7byopPdQqEkFEuLg64ajKqD/wrMUqj0VZWyqhOgRBC5kGlUldVyalOgRAyKj9fgqsmo/rA8gAhhBBCCCH0HJYHRtHpdA4Hl/hBCKF6YTKZ2GYiZMqEQlwWDdULlgdG6fV6lQqX+EEIoXrRarXYZiJkyqRSXBYN1QuWBwghhBoHTqmOkCmj0Wi4k6L6wPIAIYRQ48Ap1REyZQRB4E6K6gPLA4QQQgghhNBzWB4YhUOTEUKo/hgMBofDpjoFQsgoXJkE1ROT6gCmC4cmmxE2m+XoaEd1CoRaNJ1Op1KpqU6B/mVlJdDrsScJ+heuTEKJDh3aUB2hwbA8QJZArdYUF5dSnQIhhExIVZUM+5kjRLl7955QHaHBsHORUSwWy8bGiuoUCCFkHjgctkgkoDoFQsgod3dnOh0P/NDL4V+JURqNpry8iuoUCCFkHlQqdWWljOoUCCGjcnIK9Xo91SmQGcDyACGEEEIIIfQclgdGcTgsBwcbqlMghJB54PE4trbYIRMh0+Xn546di1B94F+JUSqVRiIppzoFQgiZB4VCVVaGHTIRMl3PnuVg5yJUH1geIIQQQgghhJ7D8sAoOp3GZuPErwghVC8MBh3bTIRMGZ/PpdFoVKdAZgDLA6P0ekKt1lKdAiGEzINOp8c2EyFTJpcrCVwLA9UDlgdG4dBkhBCqPx6Pa2srojoFQsgoX18cmozqBf9KjMKhyQghVH8KhbKsrJLqFAgho9LScGgyqhcsD4xiMhlCIZ/qFAghZB7YbBafz6U6BULIKCcnOxx7gOoDywOjtFqdVCqnOgVCCJkHtVojlyupToEQMqqoqBTHHqD6wPLAKLx6gBBC9cdms/h8DtUpEEJGOTra4NUDVB9YHhiFVw8QQqj+1GqNXK6iOgVCyKji4nK8eoDqA8sDozgctr09zlyEEEL1wuNxbWxw5iKETJevrxvOXITqA/9KjFKp1CUlOHMRQgjVi0KhLC/HmYsQMl1pabk4cxGqD1zhsqYZM5YnJDwFABqNRhDEyZOXCIJwc3M+fnwT1dEQQsjkxMR8mJiYSnZoptFo+/adAgA3N6fjx/9HdTSEEABAx45jDUc1Eya8T/YvGjWq/4oV71AdDZkovHpQ09SpI21sRIavOgCg0+mRkd2ozoUQQqZo0qQhQiGfRqMZmk06nR4V1YvqXAih5zp3bkfeMOyk7u7iyZNHUJ0LmS4sD2rq3buTr6979Xu8vFzHjRtEXSKEEDJdgwf38vJyrX6Pt7fb+PFR1CVCCP3H5MnDq48LIgiiR4/QGrstQtVheVCLSZOGWltbGX7s3TtcLHagNBFCCJmuN98cyufzyNt0Om3QoB62tjhGGSFT0b17mL+/l2HOIjc353HjsIBHdcHyoBZ9+nT28XleVXt7u40fP5jqRAghZLoGD47w9BSTt3183MeOfYPqRAih/4iJGWY479mjR5i3N146QHXB8qB2MTHDyR2pd+9wJyd7quMghJBJi4kZLhDwGAzGgAHdq199RQiZgh49OgQH+xEE4ebmNHEiXjpAL4HlQe369Ons4+Pm7u4cHY2jDhBC6CUGDerp4SH28HAeM2Yg1VkQQrWYNGmYtbVVt26hnp546QC9hBlPbLo9Pe9ehZRFo2U2zTqdumljdXri7fQCSC9o9I2LWAwWnR5kxZ/q6ezIYTf69hFCqIbNabkJFTIGnZbTNG2m9u1JBEFMS8pqio3bsRl0Gr2tSDDDS2zNMuNvLoSMuVdWeTivRKLWFCjVTfMODPanC66zmSNuPGqKrfsJeXqC6GZnNd7duSm2j5qTWTaycq12/O0nw8T2/RxtnTgsc1wfnAa0crVGotYd3W4iAAAgAElEQVTMTUj5IsintZBPdSKEkMWqUGvG334yxtUx0snW0VzbTChTa0vUmml3kzaEtPLic6lOhFBjOl1QcqqgpKONVU8Hax7DTHt2EHkKdb5SMzc++X+h/lSHQa/F/MoDHUGMiUtcFuDJMfOFwZ25bGcuO1gk+PJp1gI/t1Ab7K2LEGp8Mq1uyt2kTwK9GDQa1Vlei5jLFnPZwSLBJ4npKwK9/PGsCrIU+7IL75VLJ///+H7z5Svg+Qp4fAb9nfjkzVghmDPzO8L+PiV7sqezudcG1U3zEu/KbPz+SwghBADrU3NiPMTmXhtUN9XTeWt6PtUpEGocGTLl/QrpGDdHqoM0mhBrYaCQfzCniOog6NWZ30H2ZUmFO49DdYrGxKbTq7S6VKmC6iAIIUtDEERsSYUH36LaTAGTWaBU5yqaZAQFQs3sYaXMks54khw5rOullVSnQK/OzP4icxWqIJGAbXE7UmshLwu/6hBCjS1drgi3FlKdovEFWPEy5UqqUyDUCIpVags76Ul2BaRb0BXLFsjMjrP1APmqJhrRTyWZTq/S66lOgRCyNDoCCtUaqlM0viqtXkWY4xBrhGoq02i1FvfHTKfRUmXYJ8KMmVl5gBBCCCGEEGo6WB4ghBBCCCGEnsPyACGEEEIIIfQclgcIIYQQQgih57A8QAghhBBCCD2H5QFCCCGEEELoOSwPEEIIIYQQQs9heYAQQgghhBB6DssDhBBCCCGE0HNYHiCEEEIIIYSew/IAIYQQQggh9ByWBwghhBBCCKHnsDxoTNKKyjuXY6lOgRBC5gHbTIRMXHLCw/zMbKpToOaG5UGjKSksnD9s9JGde6gOghBCZgDbTIRM3O5136+ZMy83I4PqIKi5YXnwr9THia/zcq1ao9FoGi8OQgiZLq1Gk/40+bW2gG0mQk1JUlBQVix5nS0oZLLGi4PMCZPqAE0u+1nazi/X5aRnaLVadx/vYVPe7NKvL/lQ8oOHB7bsyHiazGKzWrUNvn/1+uT35r8xbiwA3Lzwz4k9e/MyMrlCQViP7hPmzhbZ2gLArMghrdsFObq63I29qlaq/EPaTlmy0MnVtaSoaMm4SQCQlZIa0603APxw7KC9kxPVnx4hhBqmjjYz9tSZU7//UZiTa+tgb+fklJ6UtOzH7/3btdVqtSd++e3yydPlEomdo1PEkEHDpkxiMpnYZiLUFG7+/c/hHbuK8wtYTFardkET3n3bq3VrAFArlUd3/3L9rwsVpaXuPt5qpUqn067avtnKxrqitOyPzdvuX72mlMndfH2GTn6za/++AJCRnLJi6sxBE6Lzs7JSHjxmcznhvXtOmPsOl8/b/sU318/9DQAblq0AgN7Dhsz6+AOqPzpqJpZfHvCthIV5eV7+rVUKRcbT5I0rVjvucPENCsx+lvblgiUalcovqI1MWnX/6nUGkxHWszsAnP3j4N4NG1kcjm9Qm7JiSezJ0ykPH67ZuY0n4APAg5u3+FZW4b0j8rOy4q/fLM7L//zXXRwON7R71/jrN/lCYUjXTgDA4XCp/ugIIdRgxtrMS8dP7vhyHYvF8g9pm/Ms7Wl8grufr3frVgRB/LR81d3YK1a2Nn7Bwbnp6Ye378rLzHx39afkBrHNRKhxaTUanVbr3y5YUlD4MO52ZkrqhsP72Vzuzq+/vXb2vMjG2r9dcMqDRxqNpvewIVY21tKKytWz5xbl5jm7u7l4eaYnJm1csUoureo3Yji5wbP7D4o93bv075N49/7fh4/ptLq3li31C2rz6PbdkoIC//bt7Bwd/ILaUP25UfOx/PLA3snpf6f+pNFoAHBm/8HfftgYd/Ef36DAm+cvaFSqwRPHTVrwrk6n+3TG25nJKcW5+RwO949NW7l8/tpd21y8PAiC2Lz6s+vn/r504tTgCdHkNtfu3OLs4Q4An0yfnZ709NnjxID2IZMXzY+/ftPBRTxv7SqqPzRCCL0iY23m5VNnAGD+56s7RPQol5QsGj0uPzOTRqffjb16N/aKl3/rT7f8xOHx5DLZpzNm3/jrwpBJE739W5PbxDYToUbUY9CAnoMHkrfXf7j8buzVxHv3gzqE3Th/kclmf/X7HpGt7Z3LsRuWfVKQlQ0AR3fvKcrN6zdq+PSli2k0WvaztBXTZh7YvL330CHkRpw93D7/eQeHx6ssL184fOyV02enLV3cb+SwpPj46wUFURPHh/eOoPQTo+Zm+eWBWqk8f+jI1XN/S/IKCNADQFFuPgDoCQIAaDQ6ADAYDDqDTlbkCXG3NBqNjaPDP8dOkFtQyOQA8CzxiWGb9i5i8oZ3oH960tPC3LyA9iEUfT4EAMBg0Pl8PPWIUCMw1maCngAAGp0OACwOmyAInVan0+ruXbkGAFw+//D23eQWyOsAaYlJhvIA20xKsFgsGo3qEKgJlEkkx/fsfXjrdmmRhPwvLsrLbxMWCgRBIwg6nQEADCYLADRqNQCQO6lSLt/302ZyCzyBQFpRWZSTR/4osrXl8HgAILKxcXAV52dmlxUXO4jFVH5ICyIWO1AdocEsvzz4YfmnCdfjHFzEnfr1riwri792Q6VUAEDvIYPPHzpy+vf9GckpCpk0/clTkZ1tq3bBF4+eAIDivPzT+/6ovh02h/PixtlsDgDocHQd1bRanUKhojoFQpbAWJs5IHp0yqPHmz5d3aZDWPazNJ1W165LJy6fV15SAgBP4xOexidU3w6LzX5x49hmNie1Wk2eAkOWRFZVtXLGnDKJxLdNYHCHDs+eJGUmp6jkSg6P12fYkIvHTiyf8pZXQOsnd+MBILx3LwAok5QAADmQoDo2l61SKWvcyXq+k2qb8TNZuIKC1xogTgkLLw8qSssSrsfZOTp+/dvPHB7vacKD+Gs3CIIAALGnxxvjo4/t/iXl4SM6nRYYFjpp4bsCKyu+UAAAXSP7zVu78hXeUa/XN8HnQC9BEAQNz5Ih9NrKJSXG2sxuA/pf/PNE0v34x3fustjs7m9ExiycBwBkmzn9gyX9Rw1/hXfENrPpEATQsTqwOLcvxZZJJOG9ey36ai0A/Ln7l8zkFHInHTP7rduXY8tKSipvlolsbd+YMHZIzARyJ60sVX+z71dXb88aW6uqqHzpOxIE7qSvjvyvMTsWXh5o1CoAsLa3I6+apTx4BAA6nR4ACnJyj//8q09gwKfbNrFYLMNLAju0B4C7V649S3xCDsRJf5rs6ulBbqEOXIEAAEoKitRKJZvL1Wg01TeLEEKmT228zbxy+mzS/fiBY0dPWbKw+ksCQ0Nv/v3PuT8OdeoTQc7wlpzw0L99u5e+F7aZTQ3Pm1gkpVwOAE6uLuSPKQ8fAYBerwOA/Ru3VJVXLPxyTac+vau/pE1Y+7gLl47u/nn2io9YLJZWo8lMSa3PUGOeQAAAeZnZAIB76Ksx093QwssDG3t7K1ub9KSnn7+7kMlkPrp9FwAKs7IJgiiXSAiCkFZWnvz198KcXBrQWrUL7j10sJu3d8TgQVfOnF09a65n61ZarTYvPWPi/LmGccnGWNvZOrm5FuXmLR0/mWclHDRuTJ/hQ5vrgyKEUCOwdXQ01maWFBYBQE5G5oHN24ryC/h8fodePUK7d4uIeuP8oSO5GZmLx0509/GpLCsvystb+/N2nwD/ut8L20yEXkFASAgA/HXocGFubmlRcXrSUwDIz8oCgJLiIgB4ePP204SH5RKJjYND76FRHn6+o2ZMi79+88ZfFxLv3ndydSnMzqExGOsP76+113R1rdu2vXDk2OHtu+5cjlWr1F//9nNzfUpEMQu/7shksd77+nO/oDapj58U5uS9tez97m9EymXynGdprdoGd+rTu0xScnj7rqtnzl05c3b3N98d3rELAGYu/yB6zkxHV5es1Gcl+YWBHUK9WvnV5+3eXfOpl3/rirLSsuJiobV1038+9JyZVucImRqW8TYzIuoN7wD/pHv3j//y283zFy4eO/HtkmUJN29xeLzlm3/sO2Iom8tNe5KkVCq6RvYTWAnr83bYZjYpbBgtkk+bgFnLP7R3dn5wIw5osHT9Olcvr7QnSRqNZsikifZOThePnTi7/+DNv/85u//gmtnzKsvK3H19PtmyMbR7V7VClfYkicsX9HhjIFGPfn3d34gcGD2GLxTkpKYJRaJm+XyWxkx3Q1qtnaIeP94KUB4cPI6KSHXJVqg+Tkx/18e1Ubb2NOGBnZOjo4sLANy5fGXDshUBoe0/2fxjo2y8QU4UlPS0tx7sbNf8b20Zzp27mpycMX9+DNVBanHwYHR09B2qU6BGEBe3XCz29/LqRXWQBkiRyr9Kzp7l7dIoW7sbe7VNxzC+QKDX63//6X9n9x8cOWPq2FkzGmXjDXI4TxIltuvrYNP8b21Gfv31OJvNHD8+iuogNUmlBbGxX0VFHXul12bHxs6NiqLgm7qJfJeSzWXQO9pYvf6m1Erlg7jbYT27MxgMtVL5+buLniU++fin9UHhHRojaQMo9fr1qTnHurZt5vc1QTqdbvbslTt3fkZ1kFrExW0Ui/t7edXSRFh456I6PEt88tXC9zUqlZ2jI5PNkuQXAEBI185U50KvQqXSlJZWUJ0CIUv218HDv3z/I4PJsHdy1qjVZRIJjUZr1zmc6lzIqMpKKY+HMz63IJtXf3H70mUOj2vr6FBZViGvqrKytfEKaE11rhaNIIhHj1KpTtFgLbc8cPF0Hxoz8fbFS4U5uSwOxy+4Tb+RwyKiBlOdC70KM714h5AZCenSue+IoQ9u3i4pKhSIRB179Rw8cRwuX2DKsGFsaSJHj9DrdckPHkryC6ztHbr07zN88iSBVSNcl0CvzEx3w5ZbHvCFVmNmTh8zczrVQVAjMNPdDyEzIvb0eGvZUqpToAbQ6wk6HRvGFiS4U8fgTh2pToH+w0x3QwsfmoxaCCwPEELoBQQANowIUclMj0+wPECWwEx3P4QQajoEAdguIoReAZYHyBJwOGwHB1uqUyCEkAkRiYRCIZ/qFAi1aAShDwt7+Qp0pgbLA2QJpFJ5RUUV1SkQQsiElJSUq1RqqlMg1KJptfpHj1KoTtFgWB4gS6DX6+l0/GNGCKF/YcOIEOV0Oj2DYX67ofklRuhFZrr7IYRQ09FqdUwmg+oUCLVoOp2OwTC/3RCPqJAlMNPdDyGEmg42jAhRzkx3QywPkCXAk2QIIVQDXlZFiHJmuhu23GXRkCXh87k8HofqFAghZELs7Kz5fB7VKRBq0fR6fVCQH9UpGszMChqCIBxYLKpTND4+nc7A2alfQ3FxmUajpToFQiZHT4A92wJPAwkYdLxc+FI5OQVUR0Avx2fQmRa3PgUdaK5cNtUpTIJCocrMzKM6RYOZWXngxuMkVsmoTtH4spUqMQd3pFenUqk5+AtE6AXuPM7TKgXVKRpftkIlxoOPl1Eq1Vz8LZk8OzarSKWhOkUjK1apCYLqEKbBTI9PzKw8YNBo4TbCEovbkTg0mif2jXkNZrr7IdTUBEyGv5BXaXHX1rh0mhe2mS+DDaNZ8OVzdRZ3KF2q1oTaCKhOYRLMdDc0s/IAAGI8nY/kS6hO0ZhO5ku62VvbsC2w01SzUas1bPwFIlSbiR5OR/Isqs08klf8hpMdxwwnA2lm2DCahU52Ij1B3C+3qJU9D+ZJ5vi4UZ3CJJjpbmh+5UEbK8E8H7dt6XlVGh3VWV6XSqc/micJsOJPcHeiOot5w2voCBnTwcZqiqfzzox8ubaWNpMwq9OWKp3+YG5RZxvRcFcHqrOYATM9bdkCrQnyyZCrbpZUWMBlhHyl6tuU7AOdgqgOYirMdDc0yyFrYbbCRa3cf88uSpYpgkV8iar26+YajYZGozGZzf0Z616osqpKZmUl4NFp+Sq1FZM53MV+qNi+eQNaIHt7a4GAT3UKhExUd3trNp1+IKcoTa4MEPJK///cilan02l1zf/VpVKpOJzauwbp9DqtppZIAiY9R65y4LCGi+0HOts1S0yz5+EhxindzMXXbX23pOV99jTTncVUa7V8XnNPOVXH9PwEQahUai73JX9LjhzmvTJpVzvRtjB/G0ucEeHV6PWEp6cL1SkazFz//0KshSHWwnKNNkuuqrXWvnTpVn5+8cSJQ5ozVUlJ2Tff7OJy2T17dhgwoEetz9m163BERMfWrb2cOWxHDothcfMVUCIlJXPgwNp/4QghAAi3tQq3tSpVa3IUKgJoAKBWq1es+OGbb5Y2c5Lvv/+5oEAycmS/7t071PqE+PgnN/658c47E/9zL40Qs9kO2GY2RHx8Ek5sakbm+LpGW/OnfvDd2m+X0pt3aq7vv/+5tLRizZp5xt735s34JzfTpk8fXcdGWHT42N+La4Zz/DepsrIKmcz85ocw1/KAZMNi2ljX8hEqKqqGBPu2GtG3uQNZC/4nKc3IyD16+8GtA2enTBk+dGifGk/54b0p16/fb28tbO5sFk0mUwgE+C2I0EvYsVl2bBYASKXytPyic1tXNX8G6/9r7z4Dmrr+PoCf7BAg7A2yQdyKe++Be9tqrVpH3Vbr+KvVVuterbhHq7XuiRtx4UJUEFH2nmGGFbLH8yJ50CIiI8m5F36fN6VJ7r3fgDm5v3vPKCr+8Ob9+biU/L7v162b9/kLWvdqP7ptU6FQaGtrpf94DQk0jORSXFxaUFB08/hG/R9a8CGel5a9efa6c+d2VXkPofWgbrHOdsYCgYODjf7jkRpJP4YNsMgTiyU8Xr6HhzOWo5uYGCmVSolElpSUvmPH31On/u/Zs/BKr+natS1CaMWKnVgSNkgk/fgBgMXr1++TkzNatfLGcnSlUqVSqQSC8ps3gydNWhYVlfj5a7hcI5lM/uFDAo6ADYRSqZRIpAYGbNxBQI0sXrwFIYq3t6v+D52bWyiVyhBCKSlZEyYs/dLLmjZ143KNFArSD/vUM4FAaGREvs7PDa08kMvlo0YtatrUDVcAe3trCoWCEKJQKOXlwujoxN9+279w4e+fv3LkyL5nz97CkbEBKi8XwtgDAGqivFx0/PgVXLUBQohKpaobSaVSmZCQtmLFzn//vfH5y5yc7D58SNy37wyOjA0BXDQhC7lcERj4bPz4gaamxlgC8Hh5EolE/XNaWvbYsYureXH1z4LPkfST2NDKg8TEjBs39mEM4ObmWOkRoVDs77/281d269Zu4MBuOTn5AoFQX+kaLJJW5wDoWVxcSlmZ4NCh9Rgz0Okfuy5QKJTc3MIjRy4sWFDFNZRJk4bMmjVOfV0T1Ba0iqTw4sXbtLTsXr06dO/uiytDbm6hQPCxc3xaWvb48UuqfKWxseGhQ78+f/5Wj+lIj6SfxAZVHjx//tbGxpzBwDm/rKOj7adDwezsLJ8/P/2lF1tYmNrYWA4d+qNA0ACXgtYbkUjs6Gj76TkHAOBzgYHP0tN52Dv0M/87pYlKpeJwDPbtq+IaCkKIxWI+evQqJCRCX+kaDoFASMb5UhqV1NSss2dvu7s7fXVSIJ3KysqrVISnpGQNH17FuCCEkK2tpa9vM7FYoq90pKdSqUxM8NwXqo+GUx5s3HgwP59vZmaCN4aVlZm6TKTT6WZm3OqH+asvngUH/xMREauvgA0Qn18ia3CLwgKgdZGR8QMGdMWdAsnlSvUPSqWSxWKGhV26e/dINa8fNKjb27cxd+481VfABoLPL0YIZnkiLrlcLpcr/P3X4A6CEhPTK9Y/USqVZmZcFxf7GzcOfOn1bDZr4cJN4eHResxIYhkZPCgPsMnJyf/++1GjRvXDHQS1aeNDp1OtrMxfvjwbFHT85MlrmZm5X91KfVfx6tX7esnY0BQVlZqZcXGnAIDoli+fgTsCQggplQqVSmVhYRoeftnW1jI1Neurm8yb982QIT30kq7h4PNLzc2hYSSoadNWU6lUD48muIMghFBmJg8hZGZmYm9v1bp106Cg45cu/Vn9JkePbrhyJQiGKdcEST+JDaE8EApFcrmCOHdRHRxs7tw5rP55584V27Ydq+GGcXEp79/H6zJaw8Tnl5ibY75rBABh3bjxaPv247hTfLR796rw8MuBgUcRQgsXTq75ZZFz527HxqboOF3DAQ0jYf3zT8CyZdOqWT5Vz1gsVljYpaCgY9evH7C0NM3K+vo1TYTQ778v/tIyauBTRUUl2Du21AFR/nXWmVgsGTBgpqOjLe4gHx069HEqcU9PZw8Pp9Onq5iX43OrVs2i02kwDqG2SPrZA0APUlOzbGwsV6z4AXeQqvXu3fHBg5c8Xn5NXjxpkt+9e89httMagvKAgLKychUKxYQJg1u29MKd5aOjRzdU/GxlZf75bOxfcu/ec3//L46uBAghqVQmlcpgaDIGoaGRFZfqiWnx4qmHD1+o4Zp5Pj7uDAbjhx+qHqUHqiSVyl1c7HGnAIBwpFKZhYVpx44tcQepzowZY2p+A2HRoiktWnjqOFEDQaNR7OxgXTkCKSgomjt3A41GwzsQuXo9e3ZISsqo4YsHDuxmZ2f58uU7HYciseLiss6dW+NOURekLw969erA5RJ9BeJly6bv2vV3DV/MYjEXLpwC4xBqLiEhFRY9AKCS6OikH35Ya2xsiDvIVwwZ0uPMmdqtALNhw4Hk5JqewTRa798nwG1VQnn58t316/txp/gKHx/X+/dDav76ceMGkfT0Vz+ys3NLS0nZJYTE5cGbNx8WLdqMO0WNjBzZNzY2uSZjlNXatGk6cGA3hFBOToGOozUEPF6BnZ0l7hQAEIhIJI6PTz11ahvuIF9nYMAeOrTX48evar7JunXzzp27k5mZo8tcpAcNI3Hw+SWRkXHDhvXGHeTrTEyMfX2b1fbDtWrVbqVSqbNQJEbejyGJy4Pz5+/s3r0Cd4qamjBhyN9/X6n569Vr7M2Zsz4/n6/LXA1Bdnaevb017hQAEAiTySDCTG411KqV18OHobXaZPXq2YQackZAPF6enR00jPiFhr5bu3YvxnXKa0sikaal8Wq1yYIF365atUdniUiMvOcnJC4PduxYTqfTa/BCQhg1ql9ubmFpqaBWWwUE7I+MjIP1R6qXk1Nga0vK6hwAXZg9ez25llLp1as9n19Shw2nTFmZk1OjYc2NjXoVIFgsEjuJROrh4XzgwC+4g9RCy5ZeeXmFtdrE0dF2+/ZlOktEYjxePkmHAJGyPEhNzTp16jruFLVmb28VFFSLLn1q/fp1kUikJ08G6CYU6RUVlXh5ORN5pBcA+hQc/Hr27PG+vs1xB6kFIyPDjIycOnQW+vffbc+fRwiFNZr4oVHJzS1s1YpAc+M0TsXFpc+ehVtYmOIOUjtyuaKwsLgOG16//igw8JkOEpGYRCIl6X1OUpYHP/20tVevDrhT1NrgwT2Cg1/XYUMTE+OSklKYzq9KycmZTCYDdwoAiKJXrw7t27fAnaLWmjZ1rduaBmPHDuDzS+RyWJ7pPxIT04k/Kr1hKykpGzNmcb9+nXEHqTUOhy0Uiuuw4YgRfRIS0iIiYnQQiqzevIlydibKqly1Qr7yQCgUnTq1lTiLoNVcu3bNQkIi6jZ8Z9Gi74yMOHW7/96wJSSkeXg4404BACFMnLgUd4Q6cnKyy87Oq9u2jo62w4bNhWFan4qPT/XycsGdolFTKJQPHvyFO0VdMBgMFquOF90WLJjcpo2PthORVUlJmUQitba2wB2kLshXHggEQiMjsl4UadbMPTo6qW7burg4cDjsyZOXazsUucXHp3l5QXkAANq+/fiiRd/hTlFHxsac4uKyOm9+9+6RzMxcGKZVITEx3cOjCe4UjVdo6DuEEIVCwR2kLvj8YiaTWZ89LFmyRaGAG3ooMTHd05OsH0OSlQdnz9765x/yjTqo0L27b3JyZp03Z7NZv/wy786dp1oNRW4JCamennCRDAC0YsUP3bq1xZ2ijqyszGi0en0ftW3rk57OKyqCW6wI7h7gdfz45bCwaPIuWU2hUCws6hV+/fp5CxeSY955nYqPJ/H5CcnKg7Cw6Jkzx+JOUXdSqbSgoKg+e2ja1LVfv87FxWUZGbWbd6yhYrPZnp5w9wA0do8fvyL1tfPCwhKZTF7PnXh5uYwfv1Qmk2kpFFnl5RW2adMUxh5gUVoqaNHCY968b3AHqbuIiNh6jqY1MzMh12RNOsLnl7RsSdZV3klWHuzcudzUlIs7Rd0ZGLBForqM+PkUk8kwNTVeuHAzLJr24UOCVCplMEgzvy0AunD9+qPg4Neknr+rqKjUzEwLbfv9+8djYpKl0kZdIYSHR8NC8rhwuUadOpF7FeGYmGQfH7f67+fDh4RLlwK1kYis7t8P8fFxx52ijshUHoSFRfF45J7i2syMq62v8GvX/N+/j9fKrsgrLCza17cZ7hQAYEajUZcvn4E7Rb3QaDRtrV7SqpX3+/fxdR7o3AC8eRPVvj2ZZrZtMK5cCdq582/cKeolPT27W7e2Bgbs+u+qRQtPCoVy8OA5beQin/x8vkgkIeM8OmpkKg8WL95iamqMO0W9REUlarE/4oABXRFCp0413iURwsKiyDW/OwC6MHRoLw7HAHeKegkMfNaihdbuwvv6Np8z59dGew8hPDy6XTu4boLBs2fh8+eTuFsRQujOnafu7lobTTt27MC5cydpa2/kQvaPIWnKg5ycgtWrZ2mlosUoLS3b2dleu/tUKFQPHtR6tbWGAcoDAI4fv/zsWRjuFPWSk5OvUCgdHGy0uM8bNw6kp/MEAqEW90kKBQVF5eUirX/RgJrYvXsl2c9SAgOfDxrUTbv7fPz4VSNcuInsN/FIUx7Y2lr6+fXCnUILtD7Z3LRpo2xstHNTnlwiI+N9fZuTur81APV38WKgt7cr7hT1Eh4e7efXQ+u79fBokpyc8e5dnNb3TGRv3rzv3Zt8y4Y2AM+fh5eWCnCnqJfo6KTmzT20Xlv27t3xwoW7je06ZkxMcufOJB6FQpryICwsKjY2GXeKegkOfnvRDHUAACAASURBVG1sbMjlGml9z+qb8gsXbtL6nons4cOXHTqQb3VYALRILJb4+6+xsjLHHaRejh+/MnSoTq7+tGrl/eef/zSqewhBQS+7dGmDO0WjU1JS9ssv/rr4ftcnf/9/hw/vo4s9b9iwsF+/LrrYMzHFx6cqFAp7e2vcQeqONOXBjRuPExLScKeol9u3n/j59dTd/lesmLF69R7d7Z9oHj9+DRfJQCPHZrPIPrFvSEiEnZ2Vi4uDjvb/11+b8vP5jWfJ+eDg1717d8SdotHh8fIXLZqCO0W9hIVFKZXKjh1b6u4QV64ENZI52RvA+QlpyoOOHVs2b+6BO0XdlZWV5+QU9O+vw+rZycnul1/mqjvy6u4oBJGUlMFk0p2cyDonAABaERT04sSJq7hT1MvNm491vZqNq6tjfj7/1q1gnR6FCIKDX/fs2R53isaoaVO3UaP64U5RLxcu3F269HudHmLMmAH+/qcbQ3+/x49fkb1KJ0154OfX083NCXeKutuz558xY/rr+ijqQVEHD56Pi0vR9bHwev48fNiw3rhTAIAZj5dfUkLi7s537jylUCht2vjo+kDe3q6hoZENfk3l0NDIIUO6407RGD16FErqDg4nTwY4OFh7e2thuYPqbd/+c+vW3vVcH5bgMjNzuVwjsg8JI0158OTJG/J+9tLTeW/fRo8cqadLC7/9tqDBXyc7f/7OwIFanl0BANIZNarftGmjcKeou02bDq9ZM0c/x9qwYaFEIktJydTP4fRPJpNduXJ/wABoGDG4c+dZeno27hR1xOPlX7x4d9Gi7/R2xKioxNDQd3o7nJ5du3a/U6dWuFPUF2nKg5iYpEePXuFOUUcHDpxdvXq2Po+4dOk0hFBDLRJCQyOdne21tYgSAOTF5RqZmJB1NZi9e08tXz5DnxNB2tpaqlSq3btP6u2I+nT9+qMRI3QyrhR8Vffu7cjbwWHLliN79qzS5xF79epw8uR1pVKpz4PqTcP4JJKmPOjZs727uyPuFHVx4cJdMzNuhw46HO7zJVKp7OrV+/o/rq4FBb0gey9PALTi1av3f/zxD+4UdfHoUWh6es7IkX31fFw3NycbG3OyT0BZpYcPQ0eOJP1JCUmNGNHH1ZWUpyhbthzt2bO9/mc4OHDgF6VSGR+fqufj6lpISES7ds0sLExxB6kv0pQHPj7uZJwVKz+f/9dfV1aunInl6KNH97e3t8JyaN0pLRU8ePASehYBgBAyNzcJCYnAnaLWxGLJ2rV7d+5cjuXokycPZzIZQUEvsBxdR6KiEsvKyps319rK06BWXr9+HxWViDtFrd2+/UQoFI0bNwjL0el0elxcSnDwayxH15GzZ28PH94QBkaSpjxACO3ff0Ymk+FOUTsLFvx++PCvGAN06tQaIfTjj79hzKBdp0/fnDx5GO4UABCCh0eTXbtW4E5Ra3Pm/Pr33zjXaWGzWU2buv7001aMGbQLGka88vL458/fwZ2idpKTM86evbVx4yKMGYYP70PGsupL0tKyMzNzunVrhzuIFpCpPEhLy37yJAx3ilpYs+bP6dNHE2Fx+59/nka6lutL4FsQgE85OtrijlA7q1f/8e23Q728XPDGcHKy+/HHiXgzaEtBQVFYWNSgQTBnETZdurRp164Z7hS1IBZLvvtu1alT23AHQfPmfdNghkqePn1j8uThuFNoB5nKgzlzJhoZcXCnqKlLlwKbNLEdPLgH7iAIIeTh4azu45uTU4A7S71cv/5w4sQh+hzLCADBXboUePJkAO4UNXXmzE1PT2eCnMiqZx7cvfukWCzBnaVeLl++N2vWONwpGjVzcxNyjYibP39jQIA/7hQfyWTyixcDcaeoF6FQFB2dNHbsANxBtINM5YG7uxNZ5op6+DA0NDRyzhwCXZpis1kIocWLt2Rn5+HOUnd//vnvd981kNIcAK3o2LHVtWvkmIEgIOBhYmL69OmjcQf5j0WLJo8atRB3iroTCMrPnLmNq/s4qHD/fkhsLDlWHJo0admqVbMsLc1xB/lo1Kh+XK4R7hT1cvDgOT+/XrhTaA2ZygOE0KlT1z98SMCd4isiI+NOnbq+YweeUXfVO39+15Mnb0g6m9j583cGDepmasrFHQQAAmnSxO7s2Z3E/1A/fvzqyZM369bNwx2kMjqdfvfuEYRQRgYPd5a6OHjw/Ny5BLoU1WhxuUZ//kmCacRmzVq3YsUP+p+q6KsGDeqGENq791/cQeqitFRw82bwt98OxR1Ea0hWHnh7u+7ffwZ3iupkZ+eePHkN76i76k2a5KdUKk+fvok7SK1dvBi4YMG3uFMAQEQiEaG7x7x7Fxsc/Jrgo6iDgkJINw0Un1/y9m3MpEl+uIMA1LFjy9mzJxB8ApVt247NmDGGyMMkevXqcOzYJdwpau2vv64sWaK/deX0gGTlQceOLX/+eYZUStCPH4+XP3Pmul27VuIO8hV0Oj03t+DZMzKN8z506PygQd04HAPcQQAgHDabNWDADxKJFHeQqr19G7N37+n16+fjDvIVM2aMefqUTK0iQmjXrhNTp47AnQJotG3rw2AwcKf4oiVLtnTt2rZLlza4g1SndWvvIUMIMWiz5tLTecHBr0eOJNPgk68iWXmgHoHAZGo+fr16TV29+g/ciTRSUjI3bDhw+/Zh3EFqZOnSaSQa4CsQCN++jZk1azzuIAAQ1MqVP1RM5N+tG4Fust26FXznzpPjxzfiDlIjK1b8gBB6/jwcd5AaiYtLUSqVBJkAAyCElErl7Nnr1T+PGDFv2LC5uBN9tHjx5hkzxvTo4Ys7yNc5ONgghObN2yAUinBnqZHTp2+qm46GhHzlAUJo2bJtvXt/7+s7TiAQsliEqNTT0rLnzdt48OB63EFqwde3uXqwcsUjffpMnzhxKdZQVdu8+ciYMf1xpwCAuEaO7DdsWO+ePb/z9R0nFkv/+usK7kQIIfTixdurV++vXj0Hd5DaodPpBw6crfjfbt2+XbDgd6yJqrZp0+EpU2CqBgKhUql9+3bq1296+/bjs7Pz6XQ67kQas2atW7JkaqtW3riD1IK//5rff/94vXXs2CUDBswID4/GGqoKz5+H83j5BL8nUwdE+bdbcz16TBEKxRQKhUKhKJVKuVyBOxGKjk5au/bPO3fIcd+gkunTRz958qZnz/ZDhswuKxPQ6dQXLyK6diXQP/S3b6OVSiVBZkIEgIBGjZpfUFAsFksRQhQKRaVSKRT4G8aHD0OvXr1/7Bg57ht8qlOnVhU9yAcM+EEikaWl8ZKSMtzdnXBH++jWreB27Zo1b+6BOwj4aNCgWYWFxRX/K5XKysuFhoaYJ2RfvfqPuXMnubo64o1RWzQabfPmJQih0NDITp1apaZmUiiUGzceEW3gxPHjl7dtW4Y7hfaR6e6Bn9+PbduOEYkkFAql4kEaDfNbePXq/ZYtR65c2Ys3Rp21adPU17fZ4MGz8/OLEEKFhcU3bjzCHeo/1q/fv3DhZNwpACAoP785mZl56tpATaVS0emYG8bLl4OePQv391+DN0adde/uixDq3n1yUVEpQig3tyAw8BnuUB8plcpff92/ZMlU3EGAxoIFG7t0+fbT2gAhpFIpsd9AGDFi/rRpo4h2Sl0rAQEPO3SYoD7xe/cujs8vwZ3oo4MHz3Xt2tbKikBTxGoLmcqD27cPtW7tXTHwQI1Op+FLhAIDn1++fI8I6w7Wx5QpKwsKitQ/U6nUqKhEHo8oq6edPXtr4sQh6p6IAIDPbdmy1NXVkUr92JhTKBQWi4Ux0okTV+PiktetI1DH6zoYPXphRdGlVCrv3XuOO9FH+/ad2br1J9wpwEf79v0ybtwAE5P/zNzPZrMQUuGKpFAoFi3adPDgOuwrlNfT48evVCrNrzE7O+/+/RDciTSysnKTkzNmzmyYKxKSqTxACJ04scXPr6exsaH6f6lUKsZZAi5cuBsc/IrsN5VGj16YkZHz6SO5uYX37xPiizA7O+/MmVuTJw/DHQQA4mrd2vvs2R3du7f7dFF5jOXBnj0ny8tFpBtvUMnEiUsrNYzFxWWPH7/Cl+ij0NDI2Njkfv264A4C/mPZsulLl35vb29d8QiFgu0Uhc8v6dLl202blpD94lrPnlM/naxSJpMT5z7eunX+kyc32ME/JCsPEEJr1/44d+5EW1tL9T10XDGOHLmYkpK5eTPpr9/IZHIDA5ZKpar4ZSoUirt3CfHxW79+32+/LcCdAgCio9Ppu3ev/OYbP/U9bpVKpVTiGXuwfv0+Kyvz+fMJNHVS3UgkMnXDWPFIaakgIOAh1lAa0DAS1tChvf39V3t7u6h7wkgkkk9v6+lNcnLGxIlLX706X3EtlaRGjZqvVKo+PT+hUCgZGTkREbG4o6GrV++7ujq2adMUdxBdIV95gBCaMGHIjh3LnZxs1QOU9R9g+/bjKpVy5cqZ+j+01t28eXDz5p/Gjx/s7u5ka2tJp9NUKlVeHv/Vq0i8wQICHjo52ZK6xyQA+jRnzsTVq2c5OtqoL1vqP8CSJVs6dGjRMObSuXbNf/36+SNG9HV1dbS2tqBQKFQqNT4+LT09G2+wfftOT5w4pEH2dW4YnJ0dTp/eMWRId0NDAyznJ2/fxqxcuTso6Lj+D611167t379/zezZ4319m1tbm7PZTKVSWVhYfPPmY7zBZDLZtm3H1679EW8MnaJUeQE+KuowQsXNm0/QZ5RzmXlxAmGRrBYXvRIT0rhcI2sbC13mqqxcIBSJJT52Vq4c9jhHKyaOawO1dSkrL6ZMVCaTV79skkQsKReKSkvKRSIxl2vo4Girt4SVqVRJSRnuHk20uEs7NsOQSmtratTNwkSLu/2qixfHjx//Rp9HBDoSGrrG1tbL2bmn3o5YKJHdzCnMFktza7wWpFwmT0xMb+rjpuNolRUWFBkw6E0szXyMOWMdrPR89DoQKhRXsgrShOJ8mbz6V4qE4vJyUVmZQCyW+jRz11fAKsikMl5OQZMmdlrcpz2LacKgdTHntvpvv3mdEghynjzZ6ucXUKdtM548mefnp9fpQIJy+R/KhGKFsuafxLzcwpKSMk899/tXqRKT0ns29zCm09qbGXcw4+r16HXyKL8osqRcpFTmSKr73cplcqFQVFYmLC8XqlTIu6mrHjNWlptTYGhoYKS9mzNGVCqHTmtmzBllb6mtfdZEaOg+W9t+zs5VLLtOiIlNkwSiue8Selua2LNZXka1GGrcqWMLXeaqWimNwnWyEcmVKULRqJdRu1q4+XCJe/8uSySe8zahu4WJDYvhachWfeVihrHmvyoVwnHZ41OdzJtrd4cUhHhi6cP84gf5xeuaOmt35wBo3Ut+iX9SVjOuoaMBy8WwFosYdrM202WuqgkYNGNDTolcniAQjgz5cLSdlzWLqf8YNRRZIlgTndLJjGvPZrkZfW0tdjPjr7xAn2y0f98gUyg5m5kXXiyY5ozvkhBRKVSqnyKTHAwYBjSauxHbRVXjTyKOfzYqpbJThxYUhLJEkpu8wheFpYs9iDufqUqlWhmVbMqgG9JoboYGzpyv/m4xtGxV0/Yfl4pQkVQeVyac8Cr6SBsvUyb+k3P8CWJKhXuTMjf4kGdk/f//C27G5fSzNt+fkv2Ds21bUyJ9f/y/NKF4U2zaCi8nUtzi0AMXDhshFJxfvCUu/X/e2rw1AYB2Pc4vvsYrWOhO3K/2yj75au9sbrImKuVXHxcHA5wTKH3JK37pqYzcNd5wjUDDlcNGCN3KKTyRlgMVwqdUKtWid4ntzYx9jDGvXVBb6i+7e7n8A8lZ89wccMep2oqoZFcOux0hT5/0z5mDEEJtTY2Wvk/c3dLdlIl5zV/MZ40KlWpPUsYER+savJagJjla/5mYKVdiGyRdjZ0JGWMdyNH9SZ96WZmqkOomrxB3EACqViyVncrInUTahpFNo46yt9yZkIE7SBVkSqV/ctZkJ3JP56ILQ20tYsrKw4rKcAchkKOpvOZcDulqgwoDbcx5YumTguIavFbf/k3PtWexoDaohMugD7U1306AxhPzieO7EgGDQjXAvbRZfdApFDMmI7SoFHeQyuIFQqFCacLAf4OIgNwMDYLyi3CnAKBqz/ildmzi9sypCQsmgy+Tp5SLcAep7HlhqS3Jf7e648xhP4CG8RP38oq8jMhaG6i5ctj384j4Nw3KL/L8ar++RsnBgJ1ULs6TVD9WVOcwn5dniKRNOES8+1wrTTjsVKEYd4rK0oRi16/35GukHNhMsVKJOwUAVcsWSRzZpG8YPY3YyeWEaxgzxGJHQnZ5IgIHNqvoawO1G49CicyWxTTCuvRq/TkYsMrkhPuyEysULCrFkoW5/wxhNTXmJOFuPDGXB6VyuYKIvXJqrZh4TapArpTjW6+R4BhUagruzx4AX1IgleGeGkALpEqVQIFn+YVqFEkJ11YTB51KyRRLcKcgColKyZfXdJ4iwmJQKWkiwt3Ek6pU+TWeA6oRkigUQtyNJ4l79QAAAAAAAAC0C8oDAAAAAAAAgAaUBwAAAAAAAAANKA8AAAAAAAAAGlAeAAAAAAAAADSgPAAAAAAAAABoQHkAAAAAAAAA0IDyAAAAAAAAAKAB5QEAAAAAAABAA8oDAAAAAAAAgAaUBwAAAAAAAACNxlseSCWSu+cv/rN7L+4gQEOpVH54/ebm6bO4gwDQeBXk5Fw+9ve9S1dxBwEawvLylw8evbh3H3cQoFcvHzw6vGGLoKQUdxCgkZaQcOvfM6VFRbiD6Akpy4P4d+95aRn13ElZScm/f+x7/+qVlkKB+hKVl29dtCzw3CXcQQAgH5lMFvrwkVQiqed+Yt9GXj1+IiMpSUu5QH1Fvwnft/bXiBchuIOAGsnNyIwJf1v//Vw4eOTpnbtymUwboYAWHN649ez+wyJBOe4gekK+8uDvHbs3/LggKzUVdxAAACCKNd/94L/mV5lUijsIAI3Xy/sPl02Y/Cb4Ge4gANQX+coDUXljKd0AAKCGoGEEADv4GIIGg447QO0c3bz9ReB9hNAfq9YihHoNHzpr9QqEUMSLkMvH/s5MTGYasFt27PDNwrkW1tbqTbJS087tPxQTHqFUKtyb+YyfPdOrdcvP9xwd9vbc/kOZKSkcI6MWHXxnrFjGZLP1/v7ISi6X3/jndPDN28UFBeZW1j2GDh4+dTKdTkcIzeo/1LNlMyt7u7Anz6RiiVerFlOXLba2t1dvKBSUnT949NWjYLFQ5Ozhjvt9AEBKS8d9U1RQgBCaM3AYQujHdWu6DxlYfcNYzVOfun/52p1zFwrz8i2srXoO8xv5/RS9vzkS+1LDmBqfsPb7mYMnjeelpydERjHZrPa9uk+aN5fNMVBvmJWadnbfwZjwtzQa3creDvf7ADUS/+798a07EUKBFy4FXrhk7WC/+9LZiq+514+fiMoENo4Og7+Z0Hv4UPUm1Tz1qXwe79Qe/5jwdxQqxa2p13dLFzu4OOv9/ZFYSlz8hYNH4iPfUyhUr1Ytxv84y9XbCyG0Z+WahA9Rw6Z88+BKQHFhob2L86S5c5p38FVvpVKpbp059/Dq9aK8fBtHB35BIe73oVcku3vg3szHwtYWIeTVumXn/n3cm/kghN4EP9n18//S4hM9W7Xgmpm+vP9w448LhYIyhFB+Nu+32fPfPnth4+jg7OkREx6xeeGSpOiYSrsVCsp2LV+VHBPr066NvXOT1Nh4qA1qTqVS+a/59fLRvyRisXvz5sJyweWjfx3euLniBZEvX4UEPWzVuZODm0vEi5e7lq2Sy+Xq3tJbFi17cCVAJpE2cXfLTkvH+j4AIKu23boyWCyEUPtePTv372Nlb1t9w1jNU5/68PrNiZ17Svj8Nl06szmcwtxcTO+PlL7aMN49dzE3M6tTv94sNvv+5YDTe/epH8/JzPpt9vyI5yFsDseuiVNmcjK+NwFqwdjU1NXHGyFk28Sxc/8+bbt1RQjJZbKti35+cCWAwWB6tm6Vm807tnn73fMXq3+qkoO/bQp/+ty2iYNXy+YpcfEGhhwc74+sEj5EbZiz4H3oa3sXZ1snx8iXrzb+uCAtIUH9bCm/6Ny+Qy7enq06dUqNjd+5bGVedrb6qVN7/M/tO1SQk2vv6iISioRllVvIho1kdw/6jhoeGxHxIifH75uJ7Xv1UD94eu8BlUo1/9e1nfv3VSgUu35eFfny1YMrN4ZP/fbK8RPCsrK+o0fMWLEMIRRw8tTFQ8cuH/1rxZ4dn+42LztHIhJb29sv37UdISQWijC9P1IKe/Is7MlTZy/PdYf8WQYGwvLydTNmh9x7MHTyNy5enurXbDx+yMbJESH0y/TZKbFxSVHR3q1bPQ64mRIT5+juttp/N9fMrDA3d/GoCbjfDQDk891PC189fFwkkcxas8LQ2Fj9YDUNYzVPfbrbjKQUhFDHPr1nr10FDWNtVdMwql9g4+Sw6cQxloFBaXHx4hHjnt6+O235UhqNduHgEWFZWbfBA2atXklnMJ7eDvy0qACEZefs1Hfk8OMxca07d/7up4XqB0OC7ifHxDp7ea4/vI/JZsdHvt8wZ8GVYyf6jR758v6DLz3FYDI/3XNGYgpCaPHmDZa2tmKhqOIuE6iJE9t3yySS+RvWdRnQDyH08NqNv7btvHLsxE/bNqlfMH3F0j4jhyOEzuw9cPvs+ReB90dNn5qZknLv4mUGi7XukL9rU2+FQrHy26k56Zm4343+kOzuwedyMzLzs3lcU5NO/foghGg0Wg+/wQih2HcRCKEPr94ghAaOG6N+ca+hfgih2Ih3lXbi4OJsbW+fl529Y+mKuHeR8NmrlfCnzxFCbA7n8tG/z+w9cO34SRaLjRBKjo6teI2Fna36B5emXgih3KxshFBESAhCyO/biVwzM/Ue8L0JABqUahrG6tvMT7Xs1J5Gpz27e+/Ezt25WdnQMNbKVxtGrpkZy8AAIcQ1NbW0t5XL5EX5+SqV6l3IS4TQ+Dkz6QwGQgguFZPa+1dv1B2h1V0SvFq1tHN2EgoE6YlJ1TxVaSdtu3dGCO34acWLwCAGi/mFQ4EqFOTkpCUk0ui0lJi4M3sPnNl7ID0xASH0aS+Siv576ps/eVnZCKF3z0MQQl3693Vt6q1uJ5msxtWphGR3Dz5XWlKCEOJaWFAoFPUjxqYmCKHykjKEkKCkBCFkamGhecrMFCEkFUsqze/BYDL/57/72Jad70JC34WE+vbsPn/DOiaLheMNkU9xYSFCKC7iXdx/665K1z/UmEwWQkghkyGEigv4CCEbB3s9hgWgUaimYay+zfyUo6vrit07/t65+/7lgIfXbo75Ydqo6VP1/lbIqlYNI0PTMMrFwnKJSEyl0SxtbfUYFuhKWXEJQsjM0rziEWNTU15ahqC0rJqnKu1k5qrlBoaGjwJuHvj192t/n/p599aK8XugesWFfISQQq64ffb8p4+rT0UqUX825XIZQqiokI8Qsm7E5ydkLQ9UKqX6B66JCULo04UqivILEUJGpiYIISMTk6KCgpKiIiMTLkKouKAAIcTmGHzeQFvZ2/3Pf1fM24jDG7eEPXn24ErAkG+go0uNcIwMEULTVyzrN3pErTY0tTBP+/+/FwCg/lRKlfqHahrG6tvMSpp38N125p+nt++e2PnHpSPH2/fq4ejmqpe3QnrVNIyp8Qlf2srA0IjJZknFkhJ+kYm5me5jAu1TKZUVP6tr71J+ccUjxfmF6k9oNU9V2iGTzZ6+fKnft5P+2ror6s2bf//Yt3Q79DerEQNDQ4SQqaXFvhtXarWhqbk5Qkg95UPjRL7OReo/dnZahnpsq7Wjg4W1dSm/KOzJM/UjDwOuI4Sa+7ZDCDVr3xYh9CjghnrbwAtXEELNfNtV7E0m1aw5ou7u4tO2zcDxYxBCvIz6LrvWeDRt0wYhFHj+UsUJR/y79zXZ0NnTAyF068xZ9cKQErFYx0kBaLA0DWN6+lcbxurbTDX5/zeMOZlZNBqt9/ChLTu2Rwjl83LwvUWSqWfDeOnIMfUUDlJoGMnDwNAIIcRLT0cIKZVKuVzerF1bhNDT23fVfRbePg/Jy842NjVx8nCr5qmKHcpkUoQQPz9fKhbbONhPmj8bIcRLh/OTmrJr4mRiYV5cUFixEnwJvyinBr9AZ29PhNCLu/czkpLVMw3IpPVddJJcyHf3wLNFiwdXAi4f/etN8BOpRLrt9InxP846tGGT/9r1Hi1aFOTkFPBybBwdeo8YhhAa+f3UN8FP7567GPv2HYVCSYmNozOZY2ZORwixDQwQQgW8nMzkFHsX562LljIYTAdXl9iICISQ+nMLaqKH36CgS1eyUtOWjvvG0dW1tKg4Lzt744mj6onDqjH4mwlBlwNSYuKWjJlg59wkNyNLX5EBaGg8W7XITkvbuXSljZODk7v77DUrv9QwUiiUatpM9QCD2Ih3YqGomM9fMXGKe4tmXFPTyJehdCbT2csD9xsljTo3jGNmTt+2+OdHATfDnj63sLHO+KwnOiAst2beVBrt/as3q6ZMFwnKV/vv7jpowJ3zlxKjopdP+s7S1jbxwweE0LjZM+kMRjVPVQzGe/citP/YURcOHn3/6rVH82bq+f182rXB/UZJg0qlTpw7+8jvW//Z9ce9i5cMDA2zU9NadGhfMS75S1p27ODZqkVC5Ic13890cHURlgka29Rt5Lt70HVQ/4Hjx3KMDDMTk424XIRQ9yEDF2z81cHFJfHDB6GgvOug/msO/KkezmXv0mTtgb0tOvjy0tOzUlOb+bZde+BP9XQ6hsbGHXr3MjLhJkXHSERin3ZtS4r4b58/N+QaT126qHP/vrjfKGmwDAzWHNzbZ+QwJpudHBMrFos69+9raGz01Q25pqar9+9p1r6dQqHk5+W369FVL3kBaIAm/DirTdfOCoWcl5ZuYm5afcNYzVNN27axcXJACOVkZirk8uYdfNPiEz68/ty16wAACOdJREFUfuPi5fXzzi3mVla43yhp1LlhbNmxw4KN6x3cXIVlZcIyQesunfWSF2iBtb39zP8tt7Cx4aWlqZRKBpvFZLFW++/p4TdYLBQmfvhg4+Q0+5f/qfubVfMUQqjHkEEGHMOM5GSEkL2zM53BfPs8RFRePmDs6G8XzMP9Rsmk59Ahizb/5urjXcjLzUhKsXV0bNWpY002/Gnr792HDGRzOAXZOY5uLlWuDNOAUVQq1eePRkUdRqi4eXOdd74/lZGbIZT0tTLV9YF0KoRfyqRS5rs54A7yH1ezC96VCgZbm9fgtY3RupjUe91a6eFAFy+OHz/+jR4OBHQtNHSNra2Xs3NPXR9oc1yaBZPR2uTrp5JEdju30NfUeKSdJe4g/7E3KVOFUCczLu4gRFQolZ3Lyjvl66PrAwkEOU+ebPXzC6jTthlPnszz89urg1z/kS2W/PwhaZGbo64PpFNChWJfctaVTi1wB/mPUrn8+7DYlZ5NcAchqGvZ+QNszPtZ6Xz0UWjoPlvbfs7Ofp8/Rb7ORUBH/vzfOrFIWOVTVnZ2+TxelU+xDTiLt2zQbpL49x+uHj9R5VMKhUIhl39pUqllO7aob8sCAED9qVSq7T8t/9KzFErV19cQQl4tW4z+YZp2wzy4ev1N8JMqn2IwmOpO6p8zMDRctOk37SYBQM/i372/+vfJKp+SSWUM5he/96ctX6rd2RHFQtGfq3/50rPVnCwNnjCudVcy3QmE8gBoRL0JEwoEVT7l3aZVXERklU+pR2JpVym/6H3o6yqfUqlUFbMxVvms1sMAABqzL7VFCCFTS8viL0xsoot5sbNT074Uhm3A+dLFHeOqJqQCgFyK+YVf+sfPZLGkki8OGhaVl2s3iUIhr6ZNqOZkqUPvXtpNomtQHgCNI0G3cEfQaN+rx78hwbhTAAAaOwqFQpy26LufFlasxQtAo9KxT2+CfBINjY0JkkTXyDc0GQAAAAAAAKAjUB4AAAAAAAAANKA8AAAAAAAAAGhAeQAAAAAAAADQgPIAAAAAAAAAoAHlAQAAAAAAAEADygMAAAAAAACABpQHAAAAAAAAAA0oDwAAAAAAAAAamMsDqgpRKXgjaAEV+++xKhQKEVMRhwGVqlKpcKcAoAoMagNoFxGdQqEgwr0PYqYiCCqFwqTA94aGCiEWhYY7Rf1RWMT7m6pUiE0lXCrioFMpCPfpCeY/jzmTUSqT481Qf8UyuRWLiTtFZeYMegn5f7c6UiKTs2lUCgXOEgARmdJpJTIF7hT1xZfKLZl03CkqaxhfOjpSLJUb0xvACbF2WDEZuRIp7hT1VSKTGxHvb2rCoJfIFHIl7lNgoiqQyq1YDLwZMJcHLhyWUKHEm6H+BHK5myEbd4rKXDgsEfl/tzqSL5G24HJwpwCgah5GBgI56U9hhXKlC4dwDaObIbtcQfrSS0cKJLIWxtAwajCpVA9DdpGU3J/EAom0OSH/pi25nHzyV186IiZA44m5PPDhGhrQqLFlQrwx6iO5XCRToXamxriDVNaEY+DAZr0tLsMdhIju5RVNdrLBnQKAqvWxMksTSUj93fmKX+phxLY3YOEOUllHM26pTJEhFOMOQkR38/hTmtjiTkEgo+0tg/L4uFPUS2Be0XfORPybjnOwCsovwp2CiJ4UFHc053IZmG+94u/7tamZ6+visqjSctxB6iKuTPi8sGR7CzfcQaq2pqlzUrk4vAgqhP/4Oy1nkbuDm6EB7iAAfNHeVh63cgozRRLcQeriNb8sRyJd4dUEd5Cq7Wzp9rCgOFEgwh2EQEQKxfFU3t7WHmwa/rMC4uhjZdbNgns5Kx93kLqQKVXHU3kbfVwtmJi7qVSpgxl3pJ3l2cxc3EGI5WlBiUypmudmjzsIwt8xlEqh7G/tuT4mNYRfYspgcIjXSa5KMoWKJ5HYsZl7W3vSCNyFfWdL9y1x6SfScjg0mgmDqmjEY/IM6bQUgYhBpUxxsmlvxsUdB4DqcBn0nS091sekFEjlLhw2Kbro0imoWCqXKFUuHPbm5gS9aIIQMqDR/mzlsSY6JTCP72TApjeEceB1Z0ijJghEhjTqci8nd7ho8pmxDtYUCuVSVp5YqXIyIEfPNEMaLalcyKZS57s5+BC4G+0gG3MVQhez8kUKpTOHVd6Iu0OzqJR8iUypUnkacZZ7OeGOgwhRHqj95uOSIRQnlYv5JBk0Zs6ku3HYTXB3DquJ/3k3yRSKU4SSAqmMFCcZOmJAo/pZmzfjcohczgFQwYhO29XSI7VcnCwUF5OhYaRRkAWT4cphORgQvWFkUqk7WrinC8XJ5PnS0RFjOm2knaW3kQFM1fAlY+yt+lqaJQlFuWKpmAyjaTk06nBb86bGHCrh/6aDbcx7WHATysXZIqlY2XjLAzoFWTIZzhy2A2E6ZBKlPEAIOXHYTmQ42yYjRw7bEX63AJCQiyHbhXgzHzQMTUhyiQdgZ8qk+zIJN8KwYTCk09uYGLUxwZ0D/Bf0MgQAAAAAAABoQHkAAAAAAAAA0IDyAAAAAAAAAKAB5QEAAAAAAABAA8oDAAAAAAAAgAaUBwAAAAAAAAANKA8AAAAAAAAAGlAeAAAAAAAAADSgPAAAAAAAAABoQHkAAAAAAAAA0IDyAAAAAAAAAKAB5QEAAAAAAABAA8oDAAAAAAAAgAaUBwAAAAAAAAANKA8AAAAAAAAAGlAeAAAAAAAAADSgPAAAAAAAAABoQHkAAAAAAAAA0IDyAAAAAAAAAKAB5QEAAAAAAABAA8oDAAAAAAAAgAaUBwAAAAAAAAAN+peeyMmJkMmE+g0DAACElp7+tKgoGXcKAMitnmcXUmlpRMQJ7cUBoJEqKkqyte1X5VNVlwe2tl2ZTGMdpwKgUWjTxht3BKAdzs7DyspScKcAoCGwsupZtw1ZLLPmzedoOw4AjZG7u7e5ebMqn6KoVCq95wEAAAAAAAAQEYw9AAAAAAAAAGhAeQAAAAAAAADQgPIAAAAAAAAAoAHlAQAAAAAAAEADygMAAAAAAACABpQHAAAAAAAAAI3/A6sjcBo0KzNrAAAAAElFTkSuQmCC",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "# Visualize graph\n",
+ "visualize_graph(authoring_app, xray=True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "fSaSq94CL0me"
+ },
+ "source": [
+ "Now, let's run the graph and check the results:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 20,
+ "metadata": {
+ "id": "uL-dS1MhL1Ak"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: Error: FileNotFoundError(2, 'No such file or directory')\n",
+ " Please fix your mistakes.\n",
+ "\n",
+ "assistant: Error: FileNotFoundError(2, 'No such file or directory')\n",
+ " Please fix your mistakes.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: It seems that the documents needed for the analysis are not available. To proceed, I'll create a general outline for a research paper on the Transformer architecture based on common knowledge and existing literature. Once the outline is ready, we can save it.\n",
+ "\n",
+ "### Table of Contents for a Research Paper on Transformer Architecture\n",
+ "\n",
+ "1. **Introduction**\n",
+ " - Overview of the Transformer architecture.\n",
+ " - Importance and impact in the field of natural language processing (NLP).\n",
+ " - Brief history and evolution from traditional RNNs and LSTMs.\n",
+ "\n",
+ "2. **Background and Related Work**\n",
+ " - Review of previous models such as RNNs, LSTMs, and GRUs.\n",
+ " - Limitations of previous models that led to the development of Transformers.\n",
+ " - Key innovations introduced by the Transformer model.\n",
+ "\n",
+ "3. **Architecture of the Transformer Model**\n",
+ " - Detailed explanation of the encoder-decoder structure.\n",
+ " - Description of key components: Multi-head attention, positional encoding, feed-forward networks.\n",
+ " - Importance of self-attention mechanism.\n",
+ "\n",
+ "4. **Attention Mechanism in Transformers**\n",
+ " - Mathematical formulation and explanation of scaled dot-product attention.\n",
+ " - Role of multi-head attention and its benefits.\n",
+ " - Visual representation of how attention weights are calculated and used.\n",
+ "\n",
+ "5. **Positional Encoding**\n",
+ " - Need for positional encoding in a model without recurrence.\n",
+ " - Mathematical explanation and visualization of sinusoidal positional encodings.\n",
+ " - Comparison with learned positional encodings.\n",
+ "\n",
+ "6. **Training and Optimization Techniques**\n",
+ " - Discussion of training techniques like teacher forcing and label smoothing.\n",
+ " - Optimization strategies including the Adam optimizer and learning rate scheduling.\n",
+ " - Challenges faced during training and methods to overcome them.\n",
+ "\n",
+ "7. **Applications and Impact**\n",
+ " - Usage in NLP tasks like machine translation, text summarization, and language modeling.\n",
+ " - Expansion to other fields: vision, audio, and beyond.\n",
+ " - Impact on research and industry, including GPT and BERT.\n",
+ "\n",
+ "8. **Comparative Analysis with Contemporary Models**\n",
+ " - Comparison with models like BERT, GPT, and BERT-like architectures.\n",
+ " - Advantages and limitations compared to contemporaries.\n",
+ " - Evolution and modifications leading to the development of newer models.\n",
+ "\n",
+ "9. **Future Directions and Research Opportunities**\n",
+ " - Exploration of emerging trends and future research areas.\n",
+ " - Potential improvements and innovations in Transformer architecture.\n",
+ " - Discussion on scalability, efficiency, and real-world deployment.\n",
+ "\n",
+ "10. **Conclusion**\n",
+ " - Summary of key points discussed in the paper.\n",
+ " - Final thoughts on the transformative impact of the Transformer architecture.\n",
+ "\n",
+ "I will now write detailed content for each section and save the final results.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: Outline saved to transformer_architecture_outline.txt\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: The outline for the research paper on the Transformer architecture has been successfully created and saved as \"transformer_architecture_outline.txt\". If you need further details on each section or any charts for explanations, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mNoteTaker\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: The outline for the research paper on the Transformer architecture has been successfully created and saved as \"transformer_architecture_outline.txt\". If you need further details on each section or any charts for explanations, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: 1. 1. Introduction\n",
+ "\n",
+ "2. 2. Background and Related Work\n",
+ "\n",
+ "3. 3. Architecture of the Transformer Model\n",
+ "\n",
+ "4. 4. Attention Mechanism in Transformers\n",
+ "\n",
+ "5. 5. Positional Encoding\n",
+ "\n",
+ "6. 6. Training and Optimization Techniques\n",
+ "\n",
+ "7. 7. Applications and Impact\n",
+ "\n",
+ "8. 8. Comparative Analysis with Contemporary Models\n",
+ "\n",
+ "9. 9. Future Directions and Research Opportunities\n",
+ "\n",
+ "10. 10. Conclusion\n",
+ "\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: Document saved to transformer_architecture_analysis_1.txt\n",
+ "\n",
+ "assistant: Document saved to transformer_architecture_analysis_2.txt\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: The in-depth analysis of the Transformer architecture has been successfully written and saved across two documents: \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". Here is a brief overview of each section:\n",
+ "\n",
+ "1. **Introduction**: Explores the significance of the Transformer model in natural language processing, highlighting its advantages over sequential models like RNNs, and its role as a foundation for modern AI models.\n",
+ "\n",
+ "2. **Background and Related Work**: Discusses the evolution from sequential models to attention mechanisms, leading to the development of the Transformer architecture as a superior alternative.\n",
+ "\n",
+ "3. **Architecture of the Transformer Model**: Describes the core components of the Transformer, including encoders, decoders, and the role of multi-head self-attention and feed-forward networks.\n",
+ "\n",
+ "4. **Attention Mechanism in Transformers**: Delves into the self-attention mechanism, explaining how it allows the model to focus on different parts of the input sequence simultaneously.\n",
+ "\n",
+ "5. **Positional Encoding**: Details how positional encoding provides information about the order of words in a sequence, compensating for the absence of recurrence.\n",
+ "\n",
+ "6. **Training and Optimization Techniques**: Covers strategies like the Adam optimizer and learning rate scheduling, which are crucial for efficient training and performance.\n",
+ "\n",
+ "7. **Applications and Impact**: Highlights the wide-ranging applications of Transformers beyond NLP, including computer vision and protein structure prediction.\n",
+ "\n",
+ "8. **Comparative Analysis with Contemporary Models**: Compares the Transformer with other architectures, considering computational efficiency and adaptability.\n",
+ "\n",
+ "9. **Future Directions and Research Opportunities**: Discusses potential areas for improvement and future research, such as reducing computational complexity and exploring new domains.\n",
+ "\n",
+ "10. **Conclusion**: Summarizes the transformative impact of the Transformer architecture on AI research and its future potential.\n",
+ "\n",
+ "If you need further details or specific charts for any section, feel free to ask!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mDocWriter\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: The in-depth analysis of the Transformer architecture has been successfully written and saved across two documents: \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". Here is a brief overview of each section:\n",
+ "\n",
+ "1. **Introduction**: Explores the significance of the Transformer model in natural language processing, highlighting its advantages over sequential models like RNNs, and its role as a foundation for modern AI models.\n",
+ "\n",
+ "2. **Background and Related Work**: Discusses the evolution from sequential models to attention mechanisms, leading to the development of the Transformer architecture as a superior alternative.\n",
+ "\n",
+ "3. **Architecture of the Transformer Model**: Describes the core components of the Transformer, including encoders, decoders, and the role of multi-head self-attention and feed-forward networks.\n",
+ "\n",
+ "4. **Attention Mechanism in Transformers**: Delves into the self-attention mechanism, explaining how it allows the model to focus on different parts of the input sequence simultaneously.\n",
+ "\n",
+ "5. **Positional Encoding**: Details how positional encoding provides information about the order of words in a sequence, compensating for the absence of recurrence.\n",
+ "\n",
+ "6. **Training and Optimization Techniques**: Covers strategies like the Adam optimizer and learning rate scheduling, which are crucial for efficient training and performance.\n",
+ "\n",
+ "7. **Applications and Impact**: Highlights the wide-ranging applications of Transformers beyond NLP, including computer vision and protein structure prediction.\n",
+ "\n",
+ "8. **Comparative Analysis with Contemporary Models**: Compares the Transformer with other architectures, considering computational efficiency and adaptability.\n",
+ "\n",
+ "9. **Future Directions and Research Opportunities**: Discusses potential areas for improvement and future research, such as reducing computational complexity and exploring new domains.\n",
+ "\n",
+ "10. **Conclusion**: Summarizes the transformative impact of the Transformer architecture on AI research and its future potential.\n",
+ "\n",
+ "If you need further details or specific charts for any section, feel free to ask!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: 1. 1. Introduction\n",
+ "\n",
+ "2. 2. Background and Related Work\n",
+ "\n",
+ "3. 3. Architecture of the Transformer Model\n",
+ "\n",
+ "4. 4. Attention Mechanism in Transformers\n",
+ "\n",
+ "5. 5. Positional Encoding\n",
+ "\n",
+ "6. 6. Training and Optimization Techniques\n",
+ "\n",
+ "7. 7. Applications and Impact\n",
+ "\n",
+ "8. 8. Comparative Analysis with Contemporary Models\n",
+ "\n",
+ "9. 9. Future Directions and Research Opportunities\n",
+ "\n",
+ "10. 10. Conclusion\n",
+ "\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: # 1. Introduction\n",
+ "\n",
+ "\n",
+ "\n",
+ "The Transformer model, introduced by Vaswani et al. in 2017, has revolutionized the field of natural language processing by addressing the limitations of sequential models. Unlike recurrent neural networks (RNNs), the Transformer architecture enables parallelization, resulting in significantly reduced training times. Its capability to handle long-range dependencies more effectively has opened new avenues for various applications in machine translation, text summarization, and beyond. As a fundamental building block for many state-of-the-art models, including BERT, GPT, and T5, the Transformer has established itself as a cornerstone in modern AI research.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 2. Background and Related Work\n",
+ "\n",
+ "\n",
+ "\n",
+ "Prior to the advent of the Transformer architecture, sequential models like LSTMs and GRUs were the mainstay for processing sequential data. These models, while effective, suffered from limitations such as vanishing gradients and difficulty in capturing long-range dependencies. Attention mechanisms emerged as a solution to some of these challenges, allowing models to focus on relevant parts of the input data. The Transformer architecture further enhanced this concept by eliminating recurrence entirely and relying solely on self-attention mechanisms. This section explores the evolution of these models and how the Transformer emerged as a superior alternative.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 3. Architecture of the Transformer Model\n",
+ "\n",
+ "\n",
+ "\n",
+ "The Transformer model is built on a series of encoders and decoders, each comprising layers of multi-head self-attention and feed-forward neural networks. The encoder processes input data, while the decoder generates output sequences. The absence of recurrence in the architecture is a defining characteristic, allowing the Transformer to achieve higher levels of parallelization during training. Layer normalization, residual connections, and dropout are employed to stabilize and optimize the learning process. This section provides a detailed breakdown of each component within the Transformer architecture.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 4. Attention Mechanism in Transformers\n",
+ "\n",
+ "\n",
+ "\n",
+ "At the heart of the Transformer is the attention mechanism, which assigns different levels of importance to various parts of the input data. Self-attention, in particular, allows the model to weigh the significance of individual words in a sequence relative to one another. Multi-head attention extends this concept by enabling the model to focus on different parts of the sequence simultaneously, capturing diverse contextual relationships. This section delves into the mathematical formulation of attention and its implementation within the Transformer model.\n",
+ "\n",
+ "assistant: # 5. Positional Encoding\n",
+ "\n",
+ "\n",
+ "\n",
+ "Since the Transformer model does not inherently consider the order of input sequences, positional encoding is introduced to provide the model with information about the position of each word in a sequence. This is achieved by adding a unique vector to each input embedding, which encodes positional information using sine and cosine functions of varying frequencies. Positional encoding allows the model to differentiate between words in a sequence based on their position, enabling it to capture sequential information without relying on recurrence. This section examines the role and formulation of positional encoding in the Transformer architecture.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 6. Training and Optimization Techniques\n",
+ "\n",
+ "\n",
+ "\n",
+ "Training a Transformer model involves several optimization techniques to ensure efficient learning and generalization. Key strategies include the use of Adam optimizer, learning rate scheduling with warm-up phases, and regularization methods such as dropout. The large-scale datasets used for training Transformers necessitate distributed training across multiple GPUs or TPUs to accelerate the process. This section explores these techniques in detail, highlighting their impact on the model's performance and convergence.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 7. Applications and Impact\n",
+ "\n",
+ "\n",
+ "\n",
+ "The versatility of the Transformer model has led to its application across a wide range of tasks beyond natural language processing. In computer vision, Transformers have been adapted for image classification, object detection, and more. The model's ability to capture complex dependencies has also been leveraged in areas such as protein structure prediction and music generation. This section discusses the diverse applications of Transformers and their transformative impact on various fields.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 8. Comparative Analysis with Contemporary Models\n",
+ "\n",
+ "\n",
+ "\n",
+ "The advent of the Transformer model has prompted comparisons with other contemporary architectures, such as RNNs, LSTMs, and CNNs. This section provides a comparative analysis of these models, highlighting the advantages and limitations of each. Factors such as computational efficiency, ability to handle long dependencies, and adaptability to different tasks are considered in evaluating the Transformer against its counterparts.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 9. Future Directions and Research Opportunities\n",
+ "\n",
+ "\n",
+ "\n",
+ "Despite the remarkable success of the Transformer model, there remain several areas for further research and improvement. These include reducing the model's computational complexity, enhancing its interpretability, and extending its application to new domains. The integration of Transformers with other emerging technologies, such as reinforcement learning and quantum computing, presents exciting opportunities for future exploration. This section outlines potential research directions and challenges that lie ahead.\n",
+ "\n",
+ "\n",
+ "\n",
+ "# 10. Conclusion\n",
+ "\n",
+ "\n",
+ "\n",
+ "The Transformer architecture has fundamentally reshaped the landscape of machine learning, offering unparalleled flexibility and performance across a multitude of tasks. Its innovative design, centered on self-attention mechanisms, has set a new standard for model architecture in AI research. As advancements continue to build on the foundation laid by Transformers, their influence is expected to grow, driving further breakthroughs in artificial intelligence and its applications.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: The in-depth analysis of the Transformer architecture is divided into the following sections, each offering a comprehensive examination of various aspects of the model:\n",
+ "\n",
+ "1. **Introduction**: The introduction highlights the transformative impact of the Transformer model on natural language processing. It emphasizes the model's advantages over traditional sequential models such as RNNs, particularly in terms of parallelization and efficiency in handling long-range dependencies. The Transformer model is credited as a foundational element for several state-of-the-art models like BERT, GPT, and T5, establishing its importance in modern AI research.\n",
+ "\n",
+ "2. **Background and Related Work**: This section explores the transition from sequential models like LSTMs and GRUs to the introduction of attention mechanisms, which led to the development of the Transformer architecture. It discusses the limitations of previous models, such as vanishing gradients and difficulty in capturing long-range dependencies, and how the Transformer, with its self-attention mechanisms, overcame these challenges.\n",
+ "\n",
+ "3. **Architecture of the Transformer Model**: The Transformer model's architecture consists of encoders and decoders, with each layer incorporating multi-head self-attention and feed-forward networks. The lack of recurrence in the architecture allows for higher parallelization during training. The section elaborates on components like layer normalization, residual connections, and dropout, which are critical for model stability and optimization.\n",
+ "\n",
+ "4. **Attention Mechanism in Transformers**: The core of the Transformer model lies in its attention mechanism, which assigns varying importance to different parts of the input. Self-attention enables the model to weigh the significance of words in a sequence relative to each other, while multi-head attention allows for simultaneous focus on diverse contextual relationships. This section delves into the mathematical underpinnings and implementation of attention within the model.\n",
+ "\n",
+ "5. **Positional Encoding**: Positional encoding compensates for the lack of inherent order in the Transformer model by adding unique vectors to input embeddings to encode positional information. This section examines how sine and cosine functions of varying frequencies are used to provide sequential information, enabling the model to differentiate words based on their position without recurrence.\n",
+ "\n",
+ "6. **Training and Optimization Techniques**: The training of Transformer models involves optimization techniques such as the Adam optimizer, learning rate scheduling, and dropout for regularization. The section details these strategies and discusses the necessity of distributed training across GPUs or TPUs due to the large-scale datasets used.\n",
+ "\n",
+ "7. **Applications and Impact**: The versatility of Transformers extends beyond NLP, with applications in computer vision, protein structure prediction, and music generation. This section discusses the model's wide-ranging applications and its transformative impact on various fields.\n",
+ "\n",
+ "8. **Comparative Analysis with Contemporary Models**: This section provides a comparison of the Transformer model with other architectures like RNNs, LSTMs, and CNNs, considering factors such as computational efficiency, handling of long dependencies, and adaptability.\n",
+ "\n",
+ "9. **Future Directions and Research Opportunities**: Despite its success, the Transformer model presents opportunities for further research, such as reducing computational complexity and enhancing interpretability. This section outlines potential research areas and challenges, including integration with technologies like reinforcement learning and quantum computing.\n",
+ "\n",
+ "10. **Conclusion**: The conclusion summarizes the transformative role of the Transformer architecture in AI, highlighting its design centered on self-attention mechanisms. It anticipates continued advancements building on the Transformer's foundation, contributing to further breakthroughs in AI applications.\n",
+ "\n",
+ "The detailed documents have been saved as \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". If you need further elaboration or charts, feel free to ask!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mChartGenerator\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: The in-depth analysis of the Transformer architecture is divided into the following sections, each offering a comprehensive examination of various aspects of the model:\n",
+ "\n",
+ "1. **Introduction**: The introduction highlights the transformative impact of the Transformer model on natural language processing. It emphasizes the model's advantages over traditional sequential models such as RNNs, particularly in terms of parallelization and efficiency in handling long-range dependencies. The Transformer model is credited as a foundational element for several state-of-the-art models like BERT, GPT, and T5, establishing its importance in modern AI research.\n",
+ "\n",
+ "2. **Background and Related Work**: This section explores the transition from sequential models like LSTMs and GRUs to the introduction of attention mechanisms, which led to the development of the Transformer architecture. It discusses the limitations of previous models, such as vanishing gradients and difficulty in capturing long-range dependencies, and how the Transformer, with its self-attention mechanisms, overcame these challenges.\n",
+ "\n",
+ "3. **Architecture of the Transformer Model**: The Transformer model's architecture consists of encoders and decoders, with each layer incorporating multi-head self-attention and feed-forward networks. The lack of recurrence in the architecture allows for higher parallelization during training. The section elaborates on components like layer normalization, residual connections, and dropout, which are critical for model stability and optimization.\n",
+ "\n",
+ "4. **Attention Mechanism in Transformers**: The core of the Transformer model lies in its attention mechanism, which assigns varying importance to different parts of the input. Self-attention enables the model to weigh the significance of words in a sequence relative to each other, while multi-head attention allows for simultaneous focus on diverse contextual relationships. This section delves into the mathematical underpinnings and implementation of attention within the model.\n",
+ "\n",
+ "5. **Positional Encoding**: Positional encoding compensates for the lack of inherent order in the Transformer model by adding unique vectors to input embeddings to encode positional information. This section examines how sine and cosine functions of varying frequencies are used to provide sequential information, enabling the model to differentiate words based on their position without recurrence.\n",
+ "\n",
+ "6. **Training and Optimization Techniques**: The training of Transformer models involves optimization techniques such as the Adam optimizer, learning rate scheduling, and dropout for regularization. The section details these strategies and discusses the necessity of distributed training across GPUs or TPUs due to the large-scale datasets used.\n",
+ "\n",
+ "7. **Applications and Impact**: The versatility of Transformers extends beyond NLP, with applications in computer vision, protein structure prediction, and music generation. This section discusses the model's wide-ranging applications and its transformative impact on various fields.\n",
+ "\n",
+ "8. **Comparative Analysis with Contemporary Models**: This section provides a comparison of the Transformer model with other architectures like RNNs, LSTMs, and CNNs, considering factors such as computational efficiency, handling of long dependencies, and adaptability.\n",
+ "\n",
+ "9. **Future Directions and Research Opportunities**: Despite its success, the Transformer model presents opportunities for further research, such as reducing computational complexity and enhancing interpretability. This section outlines potential research areas and challenges, including integration with technologies like reinforcement learning and quantum computing.\n",
+ "\n",
+ "10. **Conclusion**: The conclusion summarizes the transformative role of the Transformer architecture in AI, highlighting its design centered on self-attention mechanisms. It anticipates continued advancements building on the Transformer's foundation, contributing to further breakthroughs in AI applications.\n",
+ "\n",
+ "The detailed documents have been saved as \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". If you need further elaboration or charts, feel free to ask!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n"
+ ]
+ }
+ ],
+ "source": [
+ "output = run_graph(\n",
+ " authoring_app,\n",
+ " \"Please do an in-depth analysis of the Transformer architecture and create a table of contents.\"\n",
+ " \"Then write at least 5 sentences for each section. \"\n",
+ " \"If charts are needed for detailed explanations, please create them. \"\n",
+ " \"Save the final results. \",\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "MfwGegFcMeg9"
+ },
+ "source": [
+ "## Structuring a Super-Graph\n",
+ "\n",
+ "This design implements a bottom-up planning policy. Although we've already created two graphs, we need to determine how to route tasks between them.\n",
+ "\n",
+ "For this purpose, we'll define a **Super-Graph** to coordinate these two existing graphs and add connecting elements that define how this higher-level state is shared between different graphs. First, let's create the chief supervisor node:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 21,
+ "metadata": {
+ "id": "LIaF15xaMfby"
+ },
+ "outputs": [],
+ "source": [
+ "from langchain_core.messages import BaseMessage\n",
+ "from langchain_openai.chat_models import ChatOpenAI\n",
+ "\n",
+ "# Create ChatOpenAI instance as the base LLM\n",
+ "llm = ChatOpenAI(model=MODEL_NAME)\n",
+ "\n",
+ "# Create team supervisor node\n",
+ "supervisor_node = create_team_supervisor(\n",
+ " MODEL_NAME,\n",
+ " \"You are a supervisor tasked with managing a conversation between the\"\n",
+ " \" following teams: ['ResearchTeam', 'PaperWritingTeam']. Given the following user request,\"\n",
+ " \" respond with the worker to act next. Each worker will perform a\"\n",
+ " \" task and respond with their results and status. When finished,\"\n",
+ " \" respond with FINISH.\",\n",
+ " [\"ResearchTeam\", \"PaperWritingTeam\"],\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "DVvZ62Y1MtbU"
+ },
+ "source": [
+ "Next, we'll define the state and nodes of the Super-Graph.\n",
+ "\n",
+ "The Super-Graph primarily serves to route tasks between teams."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 22,
+ "metadata": {
+ "id": "i6qRY6ReMulX"
+ },
+ "outputs": [],
+ "source": [
+ "from typing import TypedDict, List, Annotated\n",
+ "import operator\n",
+ "\n",
+ "\n",
+ "# Define state\n",
+ "class State(TypedDict):\n",
+ " messages: Annotated[List[BaseMessage], operator.add]\n",
+ " # Routing decision\n",
+ " next: str\n",
+ "\n",
+ "\n",
+ "# Node for returning the last message\n",
+ "def get_last_message(state: State) -> str:\n",
+ " last_message = state[\"messages\"][-1]\n",
+ " if isinstance(last_message, str):\n",
+ " return {\"messages\": [HumanMessage(content=last_message)]}\n",
+ " else:\n",
+ " return {\"messages\": [last_message.content]}\n",
+ "\n",
+ "\n",
+ "# Node for consolidating responses\n",
+ "def join_graph(response: dict):\n",
+ " # Extract the last message and return as a message list\n",
+ " return {\"messages\": [response[\"messages\"][-1]]}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "Bz8asdRsMwEq"
+ },
+ "source": [
+ "### Defining the Super-Graph\n",
+ "\n",
+ "Now, let's define a Super-Graph that connects the two teams."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 23,
+ "metadata": {
+ "id": "fnPVUMA6MyQZ"
+ },
+ "outputs": [],
+ "source": [
+ "# Define graph\n",
+ "super_graph = StateGraph(State)\n",
+ "\n",
+ "# Define nodes\n",
+ "super_graph.add_node(\"ResearchTeam\", get_last_message | web_research_app | join_graph)\n",
+ "super_graph.add_node(\"PaperWritingTeam\", get_last_message | authoring_app | join_graph)\n",
+ "super_graph.add_node(\"Supervisor\", supervisor_node)\n",
+ "\n",
+ "# Define edges\n",
+ "super_graph.add_edge(\"ResearchTeam\", \"Supervisor\")\n",
+ "super_graph.add_edge(\"PaperWritingTeam\", \"Supervisor\")\n",
+ "\n",
+ "# Add conditional edges: Move to next node based on Supervisor's decision\n",
+ "super_graph.add_conditional_edges(\n",
+ " \"Supervisor\",\n",
+ " get_next_node,\n",
+ " {\n",
+ " \"PaperWritingTeam\": \"PaperWritingTeam\",\n",
+ " \"ResearchTeam\": \"ResearchTeam\",\n",
+ " \"FINISH\": END,\n",
+ " },\n",
+ ")\n",
+ "\n",
+ "# Set Supervisor node as the entry point\n",
+ "super_graph.set_entry_point(\"Supervisor\")\n",
+ "\n",
+ "# Compile graph\n",
+ "super_graph = super_graph.compile(checkpointer=MemorySaver())"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "XUlbzzb0Mzpw"
+ },
+ "source": [
+ "Let's visualize the graph:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 24,
+ "metadata": {
+ "id": "GzvNFQ1-M0dR"
+ },
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAB3AAAALmCAIAAAA8GZLoAAAAAXNSR0IArs4c6QAAIABJREFUeJzs3Xd8U/X+x/GTpHvTNmnT3dIBpWUXka2AgAwBUZYoCHjxCl6ueL3q1eu4CCqiiIobuS70iojXASgyCmpBRmkLHUAH3elK07RNM39/nGt+FVAR0p6O1/OPPE4Oyck79SG073z6/cpsNpsAAAAAAAAAAMDvkUsdAAAAAAAAAADQOVAoAwAAAAAAAAAuC4UyAAAAAAAAAOCyUCgDAAAAAAAAAC4LhTIAAAAAAAAA4LJQKAMAAAD4Batgs9ps/38s/Hxs+99ZW6vjK3+w7WofDAAAgPanePzxx6XOAAAAAEBKJ7RVbgonhUx278mDuyvPH9NWvXQuY2APlbvCadGxPTsrisYHRWQ31N6Tnnq+qWGAn3JHef5jp494OjmHuXutzj36yrnMIf7BrnLFbUe//baieFxQeKauZsXJ1HJDU1/fwP+Unnky+yc/Z9cQd88nc45sys8a7q+Wy2TzfvrmYHXZdcqw9Pqqv5w8WN1iSPIN+LAkd3XOUaWrR5Cbx6OnD79ecGpUYIjJal1wdM+PNeWjlaE/aTV/zThksJgTvHv8+3zOwerSnp6+nk7OUn8JAQAAugsZn+4DAAAA3dn6vBM2mW12WLyHwqnBbPLuPOVso8Vc3NQQ7OoZ5Oa+NvfoTSExQ3oESR0KAACgi6NQBgAAALqjQzVlP9VqFkX1bjSb3RQKqeNcrcqW5uN1mjnhcdUthmA3D6njAAAAdFkUygAAAEC3Y7Ja3y46fXNoT7kgkzqLg+3RFHs6OU8JjpI6CAAAQNfEpnwAAABA9/JZWX6jxXRLaGzXa5MFQRinCjdYLPYd/AAAAOBYTCgDAAAA3cj6M+nXKUMiPXykDtK26owtVpst3ttP6iAAAABdDYUyAAAA0F1YbbYao0Eu64KDyRfbU1XsLlNMC4mROggAAECXQqEMAAAAdAu5em2lobGPT4DUQdpPg9mkdvNwlXf6LQcBAAA6DtZQBgAAALq+Jov57cJT3apNFgTBVa4oaNRJnQIAAKBLYUIZAAAA6Pp0JqPeYuqGs7qflZ2L9/QbpQyVOggAAEAXwYQyAAAA0MVZBVtFS1M3bJMFQRivijALzNAAAAA4DIUyAAAA0MXt15SkVpe2z2vZbDZdvfaKn16vrXNoHMHLyfmaHkGOvSYAAEB3RqEMAAAAdHFak7G/r7IdXqhKU37zjUNPHPvxyp7+9mvPr7x7nqNDCXurS76vKXf4ZQEAALonCmUAAACgi5uijor29GmHF8rKOFZaUtS3f8ofepbFYhEPvvvmi95J/RyeysvJ5ZhW4/DLAgAAdE9sygcAAAB0ZUar5euKopGBIY69bE215oVnHk374YBCIR99/aSHH1+/9b3XNzz7mPinCoVib9pZNzd3i8Wy9d3XdnzyXnl5iYenV9/+Qx5fs9Hbx2/BrHEJickKheLbnZ+PnzT9wX8+O35E7wbd/9bKWLT0L8vufchRUS02m9lmDXHzdNQFAQAAujMnqQMAAAAAaENVRsPphjqHF8qrH11ZXlbyzIbNuvra8rISQRCmzZz33e7/BirV9z/8lEKhcHNzFwRh9T9X/nhw76qH18YnJH768b8//fgdVzd3i8VSWHCmsrL0748+s2zFgy0GgyAI/3jiuQf/uuSt979Uh4R7eTtynlohk7nK+cEHAADAMfi+CgAAAOjK/JxdxgSGOvyy587mJPTuOyhlmP2Mh4dX/tm8sROmBSr/twne/u+++vq/n2x6a9uga0aI+/XFxvdxcXEtLDhjNLasWPXPsTdMsz+9tOS8l7dvUt9BMpnMsVGtgm1d3onVfa517GUBAAC6J9ZQBgAAALoyT4VzP98Ah192wZ3LD+7fffeiGdmnT4pnCvLzGhsbEvsMsD/mPx9u7jsgRWyTBUE4nXU8Mbm/IAjnzmQLgnDDpBmtL3g680TvPv0c3iYLgmCyWv1d3R1+WQAAgO6JQhkAAADoypot5kdOpzn8srfMvfPf//nGbDbdddvU01knBEHIzkqXy+XxvZPsj8lMP9q33/826GtqaszLzkrsIxbKOaogtV8P/9YXPH0qPbENduQTBMFVrlgek9wWVwYAAOiGKJQBAACArsxd4WS0WutNRodfOaFX8vqX3zeajCdPHBEEIedUhjok3N3dQ/xTk8lkNLb0CFSKd7/ducNoMvaM6yUIwrkzubFxvVtfSltXW15aHBUd7/CQgiBUGZvTtdVtcWUAAIBuiDWUAQAAgC7uxX6j6k0tDrzgpx9t0Wprh1w7KnXfLicnp0FDhguCYLKYqjQVP6R+5+zinNw/xc3NPSo67sdDe2fMui03O+vlF/4lCILFYhEE4dyZ06Ovn9T6ghaLWRCEHw7uiY6JE+Sy3omOHFX+qVajdHF14AUBAAC6MyaUAQAAgC7OaLU0mE0OvGCVpnz7J+8uX3LLkbSDz27cEp+QJAjCrNkLVUHB9997+xP/+ItYHP/jiefraqonjU5++sm/3TpvsUKhOJuX3dJiKCkujInr1fqCAYGqW+cuPrB31913zjzyQ6oDowqCoHR1G94G2xICAAB0TzKbzSZ1BgAAAABtyCYI92UcfKRXitRBpOGlcHZVKKROAQAA0EVQKAMAAABd3/c15TKZLMHL75J/+vpLT3/84eaLz0dExZwvzL/4/Ief7g0OCWuDmP9zKPXbf/79novPh4aHlxYXX3z+sac2jL7+xkteakdZfkoPVV/fwDaICQAA0B1RKAMAAADdQpPF3GwxS52iXZUZGr8qL3wwYZDUQQAAALoOCmUAAACgW0irrTir104KjpI6SPtxkSu8nZylTgEAANClsCkfAAAA0C0M9Q92VTjl6bVSB2knJ+urCxrrpU4BAADQ1TChDAAAAHQjJqs1V1+ndvOUOkjbOlqnabaYp4fESB0EAACgq3GSOgAAAACA9uMsl2fW12hamvt13X3qyg2N44PC5YJM6iAAAABdEEteAAAAAN3L3PD4WqNBIZMZrRapszhYi9XyZsEpf2dX2mQAAIA2wpIXAAAAQDf1en6Wi0IxVR3dBcrXgsb6EHcvrbHF08k50sNb6jgAAABdFhPKAAAAQDf1p5ikIFcPF5m80WLaVVFU2KQTBMFis5lsVqtgEwTBbLOarVbxwWar1WS1CoJgEwSz1Wqx2QRBsNpsZuulH2y2/eEHWwWb2Wq1Xt6DzTZrZUtzpaFJJsg+Ks7brSkOdHFL9PGnTQYAAGhTFMoAAABA93VjcKSPs4va1dNN4ZSlq/Vycj5aV/HcmRPHaqvcFU6b8jNXZKTqzCZ3hdOKjNQns4+4K5yKmhpWZKRuPZ/rrnD6VlO8IiM1Q1vjrnB66dzJFRmpzWazq1yxIiP1qZyj7gqnc3rt8pMHntr7pbvCaWdl0YqM1NO6WneF0/q8EysyUi02q0wQVmSkPp17zF3hlNtQtyIj9fOyfHeF0+dl+SsyUnMb6twVTk/nHluRkSoTBIvNuiIjdX3eCQ+FU2Z9zedl5+rNRn8X1znh8f9IGOwiV0j95QQAAOj6WPICAAAAQBuqqqq65557/vOf/0gdBAAAAA7AhDIAAAAAAAAA4LJQKAMAAAAAAAAALguFMgAAAIC2FRkZKXUEAAAAOAaFMgAAAIC2VVRUJHUEAAAAOAaFMgAAAAAAAADgslAoAwAAAGhbLi4uUkcAAACAY1AoAwAAAGhbRqNR6ggAAABwDAplAAAAAAAAAMBloVAGAAAA0Lb8/f2ljgAAAADHoFAGAAAA0LZqa2uljgAAAADHoFAGAAAA0Laio6OljgAAAADHoFAGAAAA0LYKCgqkjgAAAADHoFAGAAAAAAAAAFwWCmUAAAAAbcvPz0/qCAAAAHAMCmUAAAAAbUur1UodAQAAAI5BoQwAAACgDclkMqVSKXUKAAAAOAaFMgAAAIA2ZLPZqqqqpE4BAAAAx6BQBgAAAAAAAABcFgplAAAAAG0rJiZG6ggAAABwDAplAAAAAG0rPz9f6ggAAABwDAplAAAAAAAAAMBloVAGAAAA0LbUarXUEQAAAOAYFMoAAAAA2lZ5ebnUEQAAAOAYFMoAAAAA2paXl5fUEQAAAOAYFMoAAAAA2pZer5c6AgAAAByDQhkAAAAAAAAAcFkolAEAAAC0rZiYGKkjAAAAwDEolAEAAAC0rfz8fKkjAAAAwDEolAEAAAC0LTblAwAA6DIolAEAAAC0LTblAwAA6DIolAEAAAAAAAAAl4VCGQAAAEAbkslkKpVK6hQAAABwDAplAAAAAG3IZrNpNBqpUwAAAMAxKJQBAAAAtK2IiAipIwAAAMAxKJQBAAAAtK3z589LHQEAAACOQaEMAAAAoA2xhjIAAEBXQqEMAAAAoA3ZbDaj0Sh1CgAAADgGhTIAAACAtqXVaqWOAAAAAMeQ2Ww2qTMAAAAA6IIGDRp0wRmlUrlr1y6J4gAAAMABmFAGAAAA0CZ69eola0UQhGuvvVbqUAAAALgqFMoAAAAA2sTcuXNdXV3td0NDQ2fPni1pIgAAAFwtCmUAAAAAbWLKlClRUVH2uwMHDuzVq5ekiQAAAHC1KJQBAAAAtBX7kHJISMicOXOkjgMAAICrRaEMAAAAoK1MmTIlOjqa8WQAAIAuw0nqAAAAAAAczCrYihobmixmqYMIgiCMWTC38uOPr5k1/ZSuVuosgiAI/i6uKlcPhUwmdRAAAIBOSWaz2aTOAAAAAMBhthRl76wsCnf3MtusUmfpcJwEWZ3J6OXkMkUdNV4VLnUcAACAzocJZQAAAKDr2HAm3SLY/hrbX+ogHZrZav26ssgi2CaqIqTOAgAA0MmwhjIAAADQRbxdeFqQyYYHqKUO0tE5yeXT1NH7NaXf15RLnQUAAKCToVAGAAAAugKdqeVIbcVQ/yCpg3QaowJDPi8vkDoFAABAJ0OhDAAAAHQF55v1Ps6uUqfoTPxdXEuaGkysNA0AAPBHUCgDAAAAXUGDyegs59v7P8ZF4aQzGaVOAQAA0JnwHScAAAAAAAAA4LJQKAMAAAAAAAAALguFMgAAAAAAAADgslAoAwAAAAAAAAAuC4UyAAAAAAAAAOCyUCgDAAAAAAAAAC4LhTIAAAAAAAAA4LJQKAMAAAAAAAAALguFMgAAAAAAAADgslAoAwAAAAAAAAAuC4UyAAAAAAAAAOCyOEkdAAAAAIAE9PXara88l/7DfkNTozI4dMLs26+76VZpI7325N+zjx3515ZPfXr4S5sEAAAAv4ZCGQAAAOiOXvnnqswj38f0Tu6hDMrLPF5RXCR1IqG+plrfoDWZWqQOAgAAgF9FoQwAAAB0O02N+swj36sjo594+z8ymayluanFYJA6lLBq/eumlmZ3T2+pgwAAAOBXUSgDAAAA3Y6zs7NcLq+vqamuKFOqQ13dPVzdPQRByPrph6fvvXPCrbcv+OvDgiCcyUp/Yumc0VNnLX149XfbP3pn3eOxffpVVZQ2NjSEx8ROX/TnQaPGihc8dzrz41efP5N5wtPbOz554Oy77wsKjzSbzQtHJgUEh8T0Ssr66UfBZrvr0bUvPrQicfDQh1/aIj7x0TtnFeaceu7jXZse/9u50xmCIGza+YOPn3/qV9v/++4bNeVlfkrVpDl33HDLAkEQSgvPbX3p2ewTR2QyWVzygLn3/C0irpcgCE/8ad6ZjOPXz5h9LPW7Bm3dm9/+JL4dAAAAOByb8gEAAADdjrOL64gbZzTpdf+88+bvtn9kNpsv84lN+oZJcxeNv3luxfmiF/5+T+7Jo4IgnDudsXrZ/OJzuaMmT08eMuJkWuqji2+tqSwTn1JTUVZdWXbz0uW33L0yZcz4yPjE00fTNGXFYg1dkJ01aNS4oPDIMdNm9QgMEp9SUXL+jdUPN+q0N962uPeAIc4uroIg1FZV/GvZ/Iy0g9eOn3LtDVOyjx9Zs2Jhg7bOnu3ogT03zlk4ef5i2mQAAIC2w4QyAAAA0B0teuAxf1XQzq3vvLPu8d3/effetRvDomN/91njZs27YdZtgiDEJvV/6R8r92z/KKHf4I83rTeZjH978vmQqJ6CIETG93p/w9r9X3x608K7BUFwdnV74Pk3vf16iFeYOPuO1//199QvPp31p5XfbntfEIQb598pCMJ1N926/7/b6qorBUFobtAJghCoDpt621I3j/+1w7v/876+Xnvrsvum3XGXIAhBYZEfvbzu208/mLl4ufiAP/1jbb9ho9ryawYAAAAmlAEAAIBuydnZZdbSe5/7ePeQ6yaUFeW/+NAKk8l4+U+PTx4oCEJZYb7NZsvLOC4IwpoVC5dPHbl86sj3N6wVBKG+tlp8pCo03N4mC4Jw7fgb/QJVqV/tqK+pPrxnZ1zygPjkARdcPLp30vCJUwuys1bOvH7b6xuaGxsEQSjIzhQEof/PlXHvAUMEQSg+m2d/Vkxi8tV9SQAAAPD7mFAGAAAAOiWDweDm5naVF+mhVN275sUn7pp7JvNE8Zm8y3jG/1gsJkEQBJvNZrNZzGZnV7eVa19q/YCg8EjxQKFQtD7v5Ow8fta8T17b8Mpjq0wm4+T5iy55/bsfWzd03OTP33l1x5bX0n88+MTb/3FxdRMEQSaTiQ+wWi2CIBiN/7+XoFzBuAwAAECb41suAAAAoJO55ZZbUlJS5syZ8/TTT+fk5FzZRc6eOtnUqBePvXz9BEFoMTS5e3oJglBfWyWe15Sc/7WnHz+4VxCE8Ng4uVweGhNnajHI5LJ+147sd+3IXv0H9Um5Njgs4teeO3b6HBc3t9PHDgeFRQwcOe6Sj7FarQOGj3nszY96JvYtzD1VXpQfHpsgCEL2iSPiAzIPHxIEITo+8crePgAAAK4ME8oAAABAJ/PJJ58MGjSopKTkk08++eabbxITE6dOneqSFH/5V7Bara/8c5Veq42I79Wkbyg+mxsWHRuXPMBobHH39D6yd/frLg+ZzaYje3dd8MSvPthcVniuprIi/fv9Ts7OE2bfIQjC9IXLXn70vhf+9uchYye4uLlnHflh+qK7R0+5+dde3cvXb8Skm/Z+9vGkOYvk8kvMuFit1n/eeUvPPsm+/oEVJUUubm4BQcETZ9+x59OtW19aV1aUbzKaDn613dPH9/qZc//IV+53aDSaI0eO7Ny585VXXnHgZQEAALoSCmUAAACgM9Hr9Vqt1mazyWQymUym0+nS0tKysrJ8U/r1WTjnMi9isZiHjpt0ZO/uc6cz/QICx988f8bie5ycnZ2cne958rkPNz5zeO/O0Oie42bO2/2fd1s/UaFQHNr5X6vVktB/8C1/+kt0Qh9BEIaOu1Eml3/x7puHv9vl6u4REtXTL1D12wEmzV54dN83oyZPv+SfNurqeyhV3+/6r8ViDYuJnX33fe6e3u6ewsMvv7P1lfWpX30mk8n6pFw7b8Xf/ZVBl/2V+1UtLS2HDx8+cOBATk5OcXFxY2Pj1V8TAACgq5LZbDapMwAAAAAQBEEwm811dXVarfY3bl1cXPz8/FqvdGGz2SIjI3tNm2gdmDg9LLaNsn23/aN31j1++6pHbph1Wxu9RPt7reBUvxPn0g9+r9FoNBqNuECzzWY7duyY1NEAAAA6KCaUAQAAgHai0+nsvXB9fb392M5gMPTo0cPPz89+26NHj8jISPsZPz+/8vJyg8Ewf/588Zru7u7XXHON1Wo9ffp0THKc1G+x8/niiy90peWtt/uTyWQfffSRUqkMCgoSb6XOCAAA0IFQKAMAAAAO0NLS0nqO+IKxYrE+9vDwaN0X+/n5RUdHDxgwwN/fXzzp5eVlv2BlZWVOTk58fLxard64ceOpU6c2bdpUXV39xBNPDBs2TBAEuVyuUqkWLVo0a9Ysm82WVlvxRUWhpF+DzsdkMs6dO/fg17tyc3MtFot40maz1dfXnzlzRvMzlUqlUqnsFbPqZ35+flK/AwAAgPbGkhcAAADA77tg6Yna2trWI8ZardZsNl8wXHzBrZ+fn5PTL+Y5WlpaFAqFk5PTV199ZTAYbr755pycnNWrVw8dOnT58uWffPJJVlbWokWLoqKijh496u3tnZCQYH/umDFjZsyYcdddd7m7u4tnfqwp/6KicJo6ut2/Np3YS3knYlJPDu/bX6PRvPHGG01NTTKZzMfH55lnnomMjLTPJldWVlZVVWl+qaqqqqmpSSyaVb+kVCo9PT2lfnMAAABtgkIZAAAA3V1TU9PF08Ria2w/7/czsR0WZ4pbn/mNArG2traxsTE8PPzs2bO7du0aPHjw0KFDV69effDgwS1btqjV6nXr1sXGxs6YMUOn0+l0OrVarVAo/ui7oFC+Aq8VnFqfPDzAxU0QhOzs7LVr1549ezYgICApKampqenFF1/MzMz86KOPJk+ePGzYsLKyMg8Pj9ZTyS0tLa2L5qqqKnv1LJPJgoKCWlfM9uMLPlcAAADoXCiUAQAA0JVZrVb7AhS/tiSFXC6/eJrY3hqLCxlfzmtptdqzZ89GRkYqlco333wzJydn/fr1paWlK1asuO6661asWJGZmZmbm3vttdeGhobq9frWC1xcPQrlK9C6UBZt3779888///e//y3eNRgMx44d8/DwGDBgwLZt2z744IPHHnusf//+r776qp+f39y5c41Go1wuv7gj1uv1Yst88XSzr6/vxS2zSqUKDAxs33cPAABwJSiUAQAA0Inp9foLauIL9rtraGjo8TNfX1/x4IL62M3N7TJe6n8sFotCoThw4EBzc/PEiRNzc3NfeOGFlJSUxYsXv/vuuxkZGX/+859jYmIOHDjg5eU1aNCgtnz3v0ChfAUuLpR/l9lsdnJySk1Nzc/PX7hwYUlJycKFC6dMmbJy5cqjR4+WlpYOHz78t6vhmpqa1rPM9unm2tpae8Vsn24Wz/j4+Dji7QIAADgAhTIAAAA6KJPJdMmt7VrPF7u5ubVeesLeF9vP+Pr6XsFLWyyW+vp6f3//wsLCgwcP9u3bt1+/fs8+++x333333nvvBQQEPPnkkwkJCfPmzRPLwcjISPtaxlKhUL4CV1AoX1JdXV2PHj1ycnJ27do1fPjwlJSUlStXNjQ0bNy4UaFQpKamxsfHR0VF/fZFLBbLBbPM9tLZZDK1Xj2j9d6Af+jjEAAAgKtHoQwAAABpNDQ0tJ4pFhcsVigUZ8+eFftig8Fwya3tWk8ZOzs7X00Gg8FQUlISGBjo5+f3wQcfFBQUPPLII6WlpbfffvuUKVP++te/Hj58OCsra+zYsVFRUZWVlX5+fq6uro77GjgShfIVcFShfDGj0VhcXBwREWGxWJ5//nknJ6cHHnhg7969n3/++bx586655prc3FwPD4/w8PDLuVpzc/MlV2rWaDSurq4XVMzicVBQkEwmc/j7AgAAoFAGAACA4xkMBu3PLli/2MvLKzMzs66uzt3dvfVMsXirUqm8vLzEu45dYjgrK6uqquq66647e/bsG2+8MXDgwDlz5rz66qvHjh176KGHevbs+eWXX3p5eY0ZM0Zc08CBL90+KJSvQNsVypfU0tKSk5Pj6ekZGxv74Ycf7ty588knn4yOjn788cfj4+PnzZun1+tNJtNlrtktqq+vv3gBDXG6OTAwsHXRbPeHrg8AAHABCmUAAAD8YfaCWOyIa2trWy9G4ezsLA7ztl59wn7X399fXMvY4aVtU1OTh4dHTU1NamqqUqkcMWLEjh07Pvzww/vuu2/o0KFPPPGEv7//ihUrKioqzp8/HxcX18VqNQrlK9DOhfKvSUtLq6qqmjp1am5u7oMPPjhx4sQ//elPu3fvrqiomDJlSkBAgNFodHFx+aOXvWTRrNFo9Hr9JYtmpVLp2E9xAABAl0ShDAAAgF9oaWm5eJu71msZe3t7a7Xa1mPF/v7+F6xi3KYLCldUVDg5OQUGBn7zzTenTp1asWJFaWnp3Xff3bdv36effvr06dPffvvt8OHDBw8eXFFRIZfLVSpV24XpOCiUr0AHKZQvYDAY3NzccnJyfvzxx5EjR8bGxs6ZM0epVL700kv5+fknT55MSUkJCwu74uubTKZLFs1VVVUymUypVMbExHh4eFywXvNVLi8DAAC6jM73q3wAAAC4GuIocevKuPWIsYeHR1FR0QUrF6tUqoSEBPuZgICA9oman59fV1c3aNCg6urq1157LTg4eMmSJZ9++uknn3yyYsWKwMDA5ubmmJgYuVyuVqu3bdvm4eEhCEJiYmJiYqJ4heDg4PaJCjiQuM9er169evXqJZ756KOPqqurBUFwd3c/f/68q6trWFjYiy++ePLkyVWrVvXp0yctLc3X17d3796Xc31nZ+ewsLBLVtJ6vV6j0dTU1JSXl2s0muPHj9urZ29v74u3BBTPOPoLAAAAOjQmlAEAALoOk8l08UCxuNldXV2d2Ww+f/68t7f3xTvdtR4x9vT0bOfYOp3u+PHjSqWyT58+H3zwwb59+5566qnAwMClS5f27NnzH//4R11d3bFjx2JjY6Oioto5WydyuK5yn6ZkjDJU6iCdyY6y/FVxA5SubThQ33bE/6N9fHwCAwPfeeedI0eOvPrqqwUFBRs2bBg/fvyUKVNKSkpsNttl7vv3u2pray/eElBsn8XPnFxcXOy7AooHLKABAECXRKEMAADQaej1+ksuQ1FXV+fj45Oent7S0nLxNndiWWw/L5fL2z+5xWIxmUxubm6HDh06e/bsnDlzbDbbkiVLfH19N23adPTo0a+++mrKlCmDBg3KyclxcnIS547bP2enVtKsfyL7yOKoRKmDdBotVsvrBac+TLlB6iCOZDabT506ZbFYBg4cuG/fvs2bN99yyy3Tpk3bunWrVqtdsGCBu7u7Tqdz4ALiVqtVo9FUV1dXVlZWVlaK+wGKtzabzd4vi7figLNKpVIoFI4KAAAA2hmFMgAAQEchThO3ronFlSjEk3V1da6urq0ni1sXxyLJ5wGbmppqamr8UzLMAAAgAElEQVTCw8Orqqp27NgRGRl5ww03bNq06dNPP33xxReTkpI2b94sl8vnz58vk8kqKyuDg4PplRzooVM/TggKd5Ozrt1lKWjS1RhblsckSx2kPeTk5Jw8eXLChAkuLi7z58+PjY1dt25dZmZmZmbmyJEjw8PDbTabTCZz7Is2Njba+2XxQBxw1mg0/v7+UVFRXl5e9nFm8cDX19exGQAAgMNRKAMAALQHi8Vi74jFg/r6+urq6tZ9sb0a9vf39/X19ff3v6AydnFxkfp9/I/BYCgsLPT09AwPD9+5c+ehQ4eWLl0aFRU1a9assLCwDRs2FBUVff/994MHD46Pj29oaPD29pY6creQ21C38dzJBRG9pA7SCTRazO8UZm8ZPNZJ1h1n4RsbGz09PUtLS7/++uu4uLgxY8Y888wzWVlZjz32WGxs7J49e0JDQy9zReYrU11dXVVVVVFR0bpu1mg0RqPxgnUzgoKCxGMnJz4pAQCgQ6BQBgAAcACj0WhfhkLc1UqsicX1i2traxsbG8WlJ+w1sXjQukSW+k1c2tmzZwsKCsaNG1dVVfXcc89FREQsX778yy+//OqrrxYsWDBs2LC0tDSLxTJo0CBxJzFIK7+x/uFTPw71Dw5283DpllXpb5PJ5PWmlnqT8ae6ytcHXu+poKP8fyUlJV5eXn5+fps2bSorK1u9enVmZubrr78+derUCRMmFBYWyuXyiIiINs1gMBhar5shHojHfn5+Yr/cs2dPd3d3cW/AoKAgPz+/No0EAAAuQKEMAADw+wwGwwXzxfamWFyhoqWlxV4QR0dHy+Vye00slsg+Pj5Sv4lfZbPZrFarzWb79ttvZTLZxIkTDx069Morr8ydO3fatGnr1683GAwPPfSQXq/PycmJjo5WKpVSR8ZvabSYPis9V2poqjI0SZ3lVxkMhtq6uhC1up1f11mu8HF2TvDuMTOkZzu/dGdkNptzcnIEQUhKStq1a9eHH3541113jRgx4tVXX5XL5YsXL7ZYLEajsX1+BaG6ulqsmHU6XVFRkX1vwObm5tYrNYsHwcHBrNQMAEAboVAGAAAQDAZDbW1t6474gu7YZrP5+fmFh4e7uLhc0BR3kMWLL4fJZKqvrw8MDMzPz9+3b9+gQYP69+//xBNP7N+//7PPPnNycnrllVf69es3ceLEqqoqs9msbveyD93KjTfeuG3bNg8PD6mD4A87efLk6dOnb7755rq6uiVLlowdO3blypWpqalVVVXjx49v58/PDAZD64nmiooK+21AQIA4xWxvmcWDTvE3NgAAHRaFMgAA6PpaWlrEXti+DEXr24CAgDNnzlzQEdubYvG4c3VeZrO5vLw8ICDAw8Njy5Yt1dXV999//7lz55YtWzZt2rQVK1akpaWdOXPm+uuvDw0N1Wq1/MI4JNHY2Oji4uLs7Cx1EFwtvV7v5eWVlZW1b9++iRMnxsXFzZkzJzQ0dP369ZWVldnZ2YmJiSqVqv2DiSPM9n65srJSoVCkp6fbbDaxXLYT7/bo0aP9QwIA0OlQKAMAgE7PZDLV1dXV1NS0nimWy+VnzpwR75pMJrEXtpfFFyxk3Ln6YjubzVZSUmKz2SIiIn766addu3aNHTt22LBhDz/8cHFx8bp164KDgz/++OPg4ODRo0ebzWa2tEKH0tzc3NLSwucZXZLRaCwuLu7Zs2dlZeVrr70WFha2ePHibdu2HT9+/M4774yNjS0oKFAqlVJNCuv1enE/wNYqKiqamprsRbP9QFxJQ5KcAAB0TBTKAACgo7NarWJTXFNTo9frKysrW88X19bWGgyGHj16BAQEiE2xOGWsVCp9fHw60XoUv62qqqqoqCgxMdHZ2fnZZ591dnZ+4IEHDh48+MYbb9x8883Tp0/PyMioqqoaNGgQ9Rw6C71eP2fOnC+//FLqIGgnOp0uKysrNDQ0MjLyueee279///vvv+/j4/PSSy/17t37hhtukPxzr5aWltZFs32uWaPRBAUF9e/fXxAEe9cs3rJMMwCgG6JQBgAA0tPpdDU1NeI0sXgg3op9cX19vThHHBAQEBkZ6eTkJB6LxbG/v3/77AfVPiwWS3p6uslkGjp06PHjx995550JEyZMmTLlueeeq6ysfOSRRzw9PQ8cOBAdHR0TEyN1WOBqvfXWWxMmTAgPD5c6CKRhtVplMtmOHTu0Wu2iRYuys7MfeOCBWbNm3XHHHZmZmY2Njf3793dzc5M6pmCz2cSKuby8XCya7bcBAQGtJ5rtty4uLlKnBgCgrVAoAwCANmff8s4+aFxXV1ddXW2vjP38/Ly9vcWa2L4khV1XHbmtqKjIyMiIj4+Piopav3790aNHN2/e3NTUtGbNmpSUlDlz5pSUlGi12tjY2I7QpwBAO2hoaKivrw8LC0tLS9uxY8ett946cODAhx9+2M/P74EHHtDr9bW1tREREVLH/H/iCLN9otneNXt5ednLZfFArVarVCp3d3epIwMAcLUolAEAgAOINbF9uLi2ttZiseTn54uVsdVqvbgsbn23C6/ta7Va5XJ5ampqXV3dTTfdlJ+f/9hjj/Xv33/VqlX//e9/s7Ky5s2bFxUVlZOT4+vrq1arpc4LtKvy8vKPP/545cqVUgdBh5afn5+fnz9u3LjS0tKHHnqoV69eDz/88IEDBwoLCydOnNgxFziura1tPchcWVnp7Oz8008/ubm5Bbdib5z54BAA0IlQKAMAgN/X0tJSW1tbU1PT0NBQWVkpVsbiGXHW2M/PLyAgQOyIxbWMg4KCfHx8xKUqPD09pX4Hbc5kMjk7O5eXl6elpcXGxiYnJ2/YsGHPnj3iblRPPvlkz54958+fr9PpxOE7mUwmdWSgQ7jllls2bdqkVCqlDoJOw2azyWSy/Pz8AwcOpKSkJCUlLVmyxGw2v/766xaL5fDhw7179w4ODpY65qVptdqKS3F3d1er1a3HmcVjls4AAHRAFMoAAEAQBKG+vl4siC9oisURY6PR6O/vHxgYGBcXZ7VaxcrYvgleQEBAt6pHxe54z549Go1m3rx5JSUlf/nLXwYMGPDII4+kpqaeOHFi8uTJsbGxZWVlPj4+XWA/QKBNmc1mmUzGzma4SoWFheHh4Uaj8dlnn3V1dX3wwQdTU1P37ds3c+bM5OTk6urqwMBAqTP+lrq6uvLycvuiGfZjb29vsVyOiYnx8fEJDg5Wq9VqtVoul0sdGQDQfVEoAwDQXYhlcU1NTXV1tdga2w+am5sFQbB3xOJtjx49AgMDxYnj7tmKtrS0uLq6VlRUpKamxsTEDB48+JVXXvnss8/eeOONmJiYl156KTAwcO7cuQaDobGxMSAgQOq8AID/p9frf/rpp8DAwOTk5Jdffnn79u1btmyJiIjYunVrdHT00KFDpQ54WWpqasRyWafTFRQUlP9MpVKJI8zqnwUHBzPpDwBoHxTKAAB0ERaLRayMq6ura2pqmpubS0tLxWOxRxaniQMDA+2tceuDLryK8eUQu+MDBw7k5eUtXry4trZ2yZIlCQkJzzzzzNGjR48cOTJ+/Pi4uLiamhpfX99u/rUCHKulpWXatGm7d++WOgi6OKvVajKZXF1d33vvvfLy8gceeCArK2vDhg0zZsyYPHny+fPnXVxcOuxCGRezTzGL/bJ4UFdX17pftt/6+vpKnRcA0KVQKAMA0GmYzWb7fHHrA5FOpwtoJTQ01MPDIzAw0N4jSx2/QzCbzU5OTvX19Xv27AkMDBw9evQHH3ywefPmdevWDRw48O2333Zzc5s3b57ZbG5sbPTz85M6L9AtvPzyy5MmTerZs6fUQdDtnD171mg0JiYm7tq1a8uWLcuXLx8xYsTmzZt9fHxmzpxps9k612IsZrPZPsIsrpghHpvN5gEDBshkspCQkJCQkODg4JCQELVazeejAIArQ6EMAEAHYrFYxJlicay4qqqqphW9Xj9kyJD6+npxrFgsi+0Nco8ePaSO37EYDAY3N7fMzMz09PSJEycqlcr58+fL5fL33nvv7NmzO3fuHDlyZP/+/bVarbe3d+eqDAAAbcFisSgUioMHD548eXLZsmW1tbV33XXX5MmTly5dmp+f39zc3KtXr87470VjY2NlZWVpaWlZWZlYMYsHAQEBarW6dcscEhLCh6kAgN9FoQwAQHsTK+Pq6mp7dyyXy7Ozs8UpY3FJClHr44CAAH7G+zViBfD1118bjcbp06cfPXr08ccfnz9//ty5c7dt21ZfXz979mwvL6+OvykT0D1VV1f/+OOPU6dOlToIcCGtVltdXR0bG3v8+PG333576tSpEydOfOuttxQKxYIFCwRB6NRDvuK6GWVlZWVlZRUVFWLL3NLSIs4vty6aQ0ND2QYQAGBHoQwAgOPV19dXV1drtdqKigqxMrZ3x9XV1fay2N4Xq1Qqce87powvR3FxcXp6elJSUnR09OrVq7///vt///vfAQEB69ev79Onz+TJk3U6nVwu754bCQKd1NSpUz/88ENvb2+pgwC/LyMj48SJE7Nnz25qarrlllumTp26cuXK3NxcvV7fp08fNzc3qQNelaamJrFZbj3OXFVV5efnFxoaGvIz8djDw0PqvAAACVAoAwBwJVpaWqovUlNTU1VVVV1dLS5e3KdPH4vF0nrWWCyRpc7eaVgsluPHjxsMhpEjR2ZkZDz//PMjRoxYsmTJ9u3bCwoKZs+eHRYWVlpaGhAQ0Nl/egeQlZWlVCqDgoKkDgL8MRaLpaKiIjQ0ND09/b333hszZszUqVPff/99vV4/d+7crrQbXkVFhbhohkg8dnV1DQ0NVavVoaGh9gM+HQeALo9CGQCAXyX+omt1dbVYE4u3rq6u6enpRqMx8Nc5OztLnb2T0ev1OTk5YWFhwcHBL7zwwokTJ956662Ghoann346JSXl1ltvra6u1uv1ERER/MotAKCDO3v27NGjR8eMGRMcHDx9+vSIiIiNGzfW1tYWFhb26tWrK0311tXVlZaWlpeXl5aW2g+MRqPYLIuDzOHh4cHBwXxcBABdCYUyAKBbM5vNYlNcVVVVV1dXWVkpNsgajaa6utrb2zsgIECpVIo1sXgg3vJ72VespqamrKwsOTm5oqLi5Zdfjo2NXbhw4fvvv5+enn7PPfdER0enp6f7+/tHRERInRRA+6mqqvr3v/99//33Sx0EcLz8/PyYmBiNRvP000+HhoauWrVqz549hYWF06ZNU6lUUqdzvMbGRrFZFmeZTSbTiRMn6urqxCnm8PDwkJCQsLCwkJAQWmYA6KQolAEAXV9zc7O9Nb7gQNwET6lUKpXKkJAQHx8f+93AwMBOvdNOR6DX6+vr60NDQ48ePfr111/PnDkzKSlp6dKlQUFBq1evrqqqys3N7dWrF8uAABAEYfbs2Zs2bQoICJA6CNDmiouLDxw40L9//6SkpLvvvtvV1XXDhg06na6ioiI+Pl7qdG3CaDSKU8wlJSXibVlZWV1dndgsh4eHh/6sS5bsANDFUCgDALqIpqamqkuRyWSlpaWta2KVSmWfOPb395c6eBdhsViam5u9vLy+++67Y8eOLV++XKfTLVmyZMaMGYsXLz5+/LhWq73mmms8PT2lTgoAQAdiMBjOnDmTnJxcWVn56KOPJiQkrFq1avfu3Y2NjZMmTXJ3d5c6YBsyGo1is2xvmUtLS8WPou2zzBEREWFhYXzDBgAdCoUyAKAzMRgMGo1Go9FUVVVptdrS0lJ7cWyxWJRKpVgW2w/EWx8fH6mDd0F6vX7//v3e3t6jR4/+6KOP3nzzzaeeemro0KGff/65s7PzhAkTZDIZ6x0DuHxVVVXNzc0sdwOI21Tu3bt35syZarV64cKFQ4YMWbFihU6nk8lkXX7RrZaWltKflZSUnD9/vqSkxGg0ilPM4T8LCwvr2m07AHRkFMoAgA7HYrHYW2PNL5lMJrEsViqV4eHhPj4+4tyxSqXy8vKSOniXVVNTk52dLa5NsXLlyoaGhrfffjs3N/fLL78cPXr04MGDdTodrT2Aq3T27Nmnn376rbfekjoI0LGUlJSUlJQMHTo0Jyfnb3/7280337xw4cK0tDSFQjFgwIBusjyXXq8Xvw7FPyspKfHy8goLC2tdMYeFhUmdFAC6BQplAIBktFqtWBNXVlbaD6qqqmpraxMTE+VyuVKpDAoKEutjlUqlUqloLduaXq/XaDQxMTElJSWvvfZaQkLCggUL3n777YKCghUrVgQFBeXm5oaHh3elHeoBOJDVar2ap2/btm369OltUZDxCxPoMsQPcb///vsdO3YsWrQoMTFxzZo1/fr1mzx5stls7ib9sqiqqspeMZf8LDw8PCkpydfXNyIiIjw8PCoqipkDAHA4CmUAQNsSx40rKysrKio0Go1Wqy0uLhYbZDc3N7EmFltj8VZcpELq1N2F0Wh0cXE5dOhQenr6woULnZycZs2aNXTo0EceeaS8vPzcuXN9+vTp0aOH1DEBdA4tLS319fVSp7gEmUymVCqlTgG0lUOHDpWWls6ePTstLW3Tpk233377uHHjysrKVCpVt+qXBUGw2WzFxcXl5eUFBQVFRUXFxcWFhYUuLi4RERGRkZERPwsKCpI6KQB0bhTKAADHaGpqEivjyp9pNJqKiora2lqxLA4ODlapVCEhIeLSxiqVys3NTerU3YvNZjt58mRTU9OwYcP27t27cePGe++99/rrr9+6dauzs/P06dMVCoVMJpM6JoDO6uoLZaPRKAiCi4uL40IJFMroVs6fP9/c3JyQkPDJJ5+8/vrrL730Uu/evffs2RMfH99tFyivra09f/58UVHR+Z/pdDpxftneMkdGRvJ7DABw+SiUAQB/TGNjo9gXV/ySyWQKDg4OCgqyjxuL+BleKg0NDTabzcfHZ+vWrWlpaY8++qibm9sjjzwyatSomTNnVlVVOTs7+/n5SR0TQNdx9YWyyWQyGAwO33OMQhndVlNTk4eHx0svvVRUVPTcc8+dPHnyyJEj48ePj4qKkjqalAwGgzi/bG+Zi4qKhg8fLghCVFRUVFRUdHR0dHQ0n7IDwK+hUAYAXJrRaCwvLy8vL6+srBQPFArF0aNHLRaLOGsc/Eu+vr5SR+7uGhsbDx486OXlNWLEiHfeeWf79u1PPfVU37599+/f36NHj759+/JzEYA25ZAlLywWi0KhcFCi/6FQBkRarfbrr78OCgoaO3bsm2++qdFo7rrrLv7vEJWWlubn5xcWFhYWFhYUFBQUFERGRkb9ktQZAaCjoFAGAAiVlZVlZWXV1dVFRUX2+rihoUGtVqvVarEvVqvVISEhQUFBbIvXQZSUlLi4uKhUqg0bNuTl5W3atCkvL2/79u033HDDwIED9Xo9W9AAaGesoQx0Ilqt9tixYwkJCWFhYbNmzUpOTn7sscdqampcXV35FkJU+Evnz5+Pjo62jzBHRUWFhYVJnREApEGhDADdSFNTU1lZWXl5eWlpaUVFhXhcVlamVCrVanXv3r09PDzUanVQUJBarQ4ICJA6L35Bp9Pt27fP29v7+uuvf/fdd7/55pu///3vycnJhw8fDgsLCw0NlToggO7u4kJ53rx5Wq229ZmBAweuXr362WefLSoqeuWVVwRBWLZs2erVq+3bser1+kWLFs2ZM+fmm28uKytbsmTJo48+eu211wqCsGvXrm3btmk0Gi8vrxtvvPG2224TX2LUqFHLli2zv8T999/v5eX1+OOP289QKAO/zWw25+TkJCUlZWZmPvzww/fee+/48eNPnjwp7oEhdbqOwmq1isPL4q1er8/Ozu7ZSkxMjKenp9QxAaA9dK8tXwGg+9BqtWVlZaWlpaWlpeKBWBxbrVa1Wh0aGhodHT1s2LCQkBC1Wu3wXy7G1aurq3N1dXV3d//Xv/5lMBjWrFkj/vQyduxYsUC5/fbbxUdec801UocFgF81ZMgQ8S8ukb+//wUP0Ol069evX7Nmjbgsz6/9k5SWlrZx48ZZs2YlJSUVFRV5eHi0cXCgG3FyckpKShIEITk5+YsvvtDpdIIg5Ofnr127ds2aNTExMTt37hwwYEA3L5flcnlMTExMTIz9TEtLy7mf7d2799y5cz4+Pj179oyNjRUr5pCQEEkjA0BboVAGgE5Pp9MVFxeXlpYWFxc3NzdnZWWVlpYqFIrQ0NDQ0NCQkJCUlJSbbropNDSUHdg6MpPJdOTIEavVOnLkyNdee2337t2vvvqqm5vbiBEjEhMTBUHo27dv3759xQc7OfEvOIDOITg4eOTIkb/9mJMnT3722WczZ84UBMHd3f2Sjzl8+LBarb7zzjvFkrptwgIQBEEQ1zebMWPGjBkzTCaTIAi5ubn79+9/5plnTpw4UVJSMnbsWD7UEQTB1dU1MTFR/D5NVF5efu7cubNnz37zzTfnzp2rq6sTy+XevXtHRkbGxcUxxgGga+DHUQDoTFpaWoqLi0tKSoqLi+0lsiAI4eHhoaGh4eHhCQkJ48aNCwsLY/G7js9qtQqC8O6779bW1t533325ublff/31pEmTBEG47bbb7L++ff3110udFADaVv/+/d99993+/fu3Hv27gJubm1arbWho8Pb2bt90QLfm7OwsCMLKlSvFu0FBQXv37lUqlUOHDn3//ffF/f3kcrnUMTsKcQOSESNGiHebm5vF+eXi4uIdO3bk5eVFRUXFxcXFx8eLt66urlJHBoArQaEMAB1XVVWVWBwXFRWJB3V1deHh4XFxcYGBgYMHD77pppvCw8PZJa+zMJlM1dXVarV669atO3bs2Lhxo7hQtfjL4ElJSU899ZT4SD4PANBl2Gy2lpYW8djZ2fmSxVNycnJMTMy6des2bNjg6up6yV1ebrjhhl27dt1///233XbbiBEjxPUxLn4J8W7bvBUAQkhIyKpVq8Tj+Pj4b7/9NiUlxcnJacuWLTfccEN8fLzUATsWd3f3pKQkcTmRJUuWCIJw7ty5M2fO5OXl7du3z8nJqb6+Pj4+vnfv3gkJCb169ZI6LwBcLgplAOgoysrKioqKCgsLi4qKioqK3NzcSkpKwsPDIyIiIiIirr322oiIiKCgIKlj4o8pKSnJyckZN25cYWHh0qVLly9fftNNN/Xt2/f6668X/2suXLhQ6owA0Ia++OKLL774Qjx+6KGHfm35izvuuOPEiRPvvPOO+PsZF5fC0dHRL7zwwttvv7127dro6Oj77ruvZ8+eF7+EiDUxgHYwZMgQ8f81i8USHByclpYWHx+/Y8cOk8k0depUNzc3qQN2ROLayhMnThTv5ufn5+Xl5eTk7Nq1Kycn59prr1WpVElJSb1792a/ZQAdGYUyAEijoqIiPz+/oKCgtrY2IyOjqKgoMDAwMjIyKiqqf//+06dPj4iIYHG6Tqq4uPjgwYNjx44NCgp67LHHkpOTx40bFxwc/O2334oP6NOnj9QZAaCdjBo1avr06eLxb+xP5ezsfP/99993332DBw+WyWStB5DtoqKi/vWvf50+ffqFF1545JFHNm/eLC643PolBEHYuHFj27wVAJemUChmzZolHg8cOPCLL744f/58fHz85s2bU1JSkpOTpQ7YcYm7/Nn75by8vNOnT6elpb3zzjuNjY2JiYkDBw6MjY1NTk5m8WUAHQqFMgC0B61WW1hYePbs2YKf+fr6xsTEREdHJycnT5w4MTIy0sXFReqYuHIGg+H999/38/ObNWvW/v37m5ubxWUr3n77bfEBzOkA6J78/Pwu8/e4o6Oj77jjjhdeeMFkMomrzF9SYmLiqlWr7rvvvqysrJSUlItfgo9jAQlFRETcc8894rFKpdq+fXtycnJWVlZlZeV1113Hasu/LT4+Pj4+XvyETKvVnj59uqCgYMuWLZmZmb169UpOTk5KSkpOTvb19ZU6KYDujkIZANpEfn5+fn7+uXPnxFuz2XzNNdd4e3snJiZOnjw5Ojr613axR2dhs9lkMtnWrVsPHTq0ceNGnU7n5uY2bNgwQRAWLFggdToA6JRuuummI0eOpKenm83m330w83pABzdlypQpU6YIgqBUKj/77DNx34jvv/9+wIABfPDzu/z8/IYNGzZs2LD58+cLgnDq1KnMzMxdu3Z98sknOp2uX79+/fr169u3r7ghBwC0MwplAHCMvJ+dOXNGLpc3NzfHxMT07Nlz5syZPXv2DAwMlDogHOb48eNbtmxZsGBBSkqKh4fHn//8Z4VCoVKpbrvtNqmjAUDnJpPJ7rvvvj//+c8XjzH++OOPX3311dChQ2Uy2fbt20NCQhITEyWKCeCPCQoKevTRR8Xjc+fOPfXUU19++aVOp/Py8nJyopS4LH369LGvmVZYWJienn7w4MGXX345MTExICBg4MCBAwcOZFdnAO2Gv7sB4AqdO3cuJycnJyfn/PnzGRkZcXFxCQkJcXFxN954Y0JCwiUXf0QnZbFYFArFI488YrFY1q5dK5fLly5dKi4IeNNNN0mdDgC6lMDAwOXLl1dXV19w3tnZWafTiesm9+vXb+HChSwlBHRGt99+++23326z2ZqammbNmvX444+PGDFC6lCdTFRUVFRUlLgyRklJybFjx/bs2bNu3brQ0NARI0YkJyf369dP6owAujjZxRsoAwAuqbKy8tSpU9nZ2bm5udnZ2REREb1+FhcXJ3U6tIn//ve/n3766fPPPx8QEHDo0KGhQ4cyRwMAv6alpaW+vv7qr2Oz2Zqbmx34G/EymUypVDrqagAcqKioKDIy8tFHH1Wr1cuWLWOR5auRk5Nz6tSp1NTUvLy8IUOGDB8+PCUlpUePHlLnAtAFUSgDwG/Jy8vLyMg4derUqVOnZDJZnz59evfuLZbIzs7OUqdDm9DpdBs3boyOjp4/f/6BAweioqIiIyOlDgUAnYCjCmVBEGpqahy4MCiFMtDBNTY2fvXVV5MnT9ZoNDk5OZMmTR7yM44AACAASURBVJI6UedmMBgOHz6cnZ29e/dutVo9fPjwYcOGRUdHS50LQNdBoQwAv2A2mzMzMzN+Nnr0aD8/P3HNMpVKJXU6tKFDhw7t2rVr9erVJSUlOTk5o0eP5jMDAPhDHFgom0wmB/4lTKEMdBbNzc0bN26Mj4+fMWNGXV0d07VX7/Tp0z/88ENOTk5lZeWoUaNGjRqVkJAgdSgAnR6FMgAIgiBkZmYeP3782LFjjY2NPj4+fX9Gpdjl7d+/39fXd8CAAa+//nq/fv2GDh0qdSIA6KwcWCg7FoUy0LlYrVa5XP7Pf/7Tw8PjwQcflDpOF1FSUnLgwIHU1FR/f/+YmJixY8fGxMRIHQpAZ0WhDKD7Ki8vP3LkSH5+/q5du+Lj4wcOHDho0KC+fftKnQvtITs7u3fv3t9///2ePXuWLVsWFBQkdSIA6PQcWCg3Nja6uro6atl6CmWgkzp06NCIESMOHDjg6+vbv39/qeN0ERUVFd999913330nk8kmTpw4ceJEb29vqUMB6GQolAF0OydOnDh+/PiBAwdMJtM111wzbNiwfv36ubq6Sp0L7cFsNjs5Of3lL3/x8PBYu3atzWaTyWRShwKALsKBhbJer3dxcXFxcXHI1SiUgU5No9GsWbNm8eLFffr0Ydc+B8rPz9+zZ88XX3zRr1+/adOmDRkyROpEADoNCmUA3cUPP/ywf//+H374IS4ubsyYMQMGDIiIiJA6FNpPY2Pjxo0b+/fvP2nSpPLycrVaLXUiAOiCHPXDhdlslslkCoXCIVcTO2VHXQqAJLRarZub25o1ax544AEvLy+p43Qphw4dSk1NPXXq1IwZM2bNmiV1HACdAIUygC7uhx9+2Ldv3969eydPnpyUlDRs2DC+Ae1ujh07NmjQoLy8vIKCggkTJkgdBwDw+/R6vUwm8/T0lDoIgI4lNTX1/Pnzt912m9RBuqDS0tLPPvtsz549kyZNmjNnjq+vr9SJAHRcFMoAuqbc3Nxdu3bl5eX5+vqOHTt2zJgxDpxyQify2muvFRUVrV27VuogAIA/4MMPP2xsbFy6dKnUQQB0UH//+9+XLVsWHR0tdZAuaOvWrWlpabGxsXfe+X/s3Xd4FFXfxvEzm91N7z2kkEYvSpEOAgEB6R0URURQimBBxUaxPDyKgoWmIChIEzRUpQpIJxRpQUIIIZ303nZ33j/WJ/JCkABJZjd8PxeX1+zsmZl7Rg1nfzl7zhh+sQegXBSUAdQ0W7ZsOXXqVGJiYo8ePXr27GllZaV0Iijg/PnzR48eHTt2bEJCQq1atZSOAwC4N4cOHbpx48aAAQOUDgLARGVnZ//0009jx45VOkiNtXHjxu+++27w4MHPPfec0lkAmBwKygBqiKysrM2bN//www89e/YcOHAgoxUeWjqdTqVSvfzyy2+99Zavr6/ScQAAAFCF5s+fP2XKFOZJryKrVq1au3bttGnTOnXqpHQWACaEgjIAs5eenv7jjz/u3bv3+eef79mzp1qtVjoRFLNz587MzMyhQ4fyoQIAzFpCQkJ8fHyrVq2UDgLA1CUmJi5evHj27NlKB6mx8vLyFi5cmJ2dPWvWLD5qATBSKR0AAB7I119/PW7cuNDQ0PDw8D59+tDFeZilpqZGREQMGzaMajIAmLu4uLjw8HClUwAwAz4+PlSTq5Sdnd0bb7zRu3fvHj16HD9+XOk4AEwCBWUA5mrLli1Dhw718vLauHFjz549lY4DJRUVFW3fvt3W1vbtt99WOgsAoBIEBgZ27txZ6RQAzMnzzz8fHx+vdIoaq02bNrt37z548ODq1auVzgJAeUx5AcD85Ofnf/jhh25ubq+99prSWaA8WZZ79er1yy+/sAAjAADAw2z37t0dOnSwtLRUOkhN9u233xYXF0+aNEnpIACUREEZgJmJiIhYtGjRlClTmjRponQWmITS0lKNRqN0CgBAZUpJSfn999+HDx+udBAA5qS0tDQyMpKPCVVt1apVHh4e3bt3VzoIAMUw5QUAc7J9+/a1a9cuW7aMbiKMY5OnT59ONRkAap78/Pxdu3YpnQKAmdFoNOfOnfv222+VDlLDPf300/Hx8d99953SQQAohoIyALNx8eLFxMTEuXPnKh0EpmL27NljxoxROgUAoPJ5eXmNHz9e6RQAzM9TTz0VHBycl5endJAabsyYMUlJSXv27FE6CABlMOUFAPOwZ8+enTt3/ve//1U6CAAAAABADBky5LvvvrO3t1c6CIDqxghlAGbg0qVL+/fvp5qMm82fP1+n0ymdAgBQJdLT0999912lUwAwV/369TMYDEqnqPmmTp26dOlSpVMAUAAFZQBmYPny5VOmTFE6BUzI7t278/Ly1Gq10kEAAFVCp9NFRUUpnQKAuerRo8fJkyeVTlHztWvXLjs7OyYmRukgAKobU14AMHXh4eEFBQUjR45UOghMSGpqqoODg6WlpdJBAABVwmAwZGZmurq6Kh0EAPBvduzYERcXN3bsWKWDAKhWjFAGYOo2b97cu3dvpVPAtFhbW1NNBoAaTJIkW1tbpVMAMFdZWVmMUK4ebdq0OXLkiNIpAFQ3CsoATNrRo0c9PDwcHByUDgITkpubO2LECKVTAACqUGZmJl9OAnDf8vLy5s2bp3SKh4KDg4MkSVeuXFE6CIBqRUEZgEk7ffp0q1atlE4B05Kfn9+8eXOlUwAAqpBarQ4JCVE6BQBz5enp2b17d6VTPCwaNWp06dIlpVMAqFbMoQzApM2YMWPkyJF169ZVOggAAAAA4FZbtmxJT08fPXq00kEAVB9GKAMwaXl5eX5+fkqngGlhUjwAqPGKi4t3796tdAoA5kqW5U8//VTpFA8LFxeX3NxcpVMAqFYUlAGYrsLCwoKCAhsbG6WDwLTEx8evWrVK6RQAgCqUn5+/cOFCpVMAMFeyLO/du1fpFA8LW1tbCsrAw4aCMgDTVVBQkJ6ernQKmBxnZ+fHHntM6RQAgCpkY2MzadIkpVMAMFcqlerdd99VOsXDQqvVJiUlKZ0CQLUqfw7l8PBOSoQBgP8nO9uwbl3RuHGMUAYA4N90777WxsZb6RS4u+Tkw0ePTlc6BQBUppQU/Y4dJc88Y610EACVT6Oxe/LJbbfvV9/pgP79V1RxJAC4i4SElLNnV/Tv/6bSQWBaUlMzTp26+MQT7ZUOAgAmYdeu6bKsVzpFJcvLy1u0aNG0adOUDlLJDAadt3eLVq0YfA1ULYPB8Oqr/50/n9/fVIfo6OuxsRv7939F6SAAKl94ePnrbTLlBQDTJUmStbWl0ilgclJS0rdu3a90CgBAFSopKTly5IjSKQCYK1mWz52LUjrFw0OytORTG/BwoaAMwHTJshwTk6h0CpgcPz+vMWMGKJ0CAFCFHBwc5syZo3QKAObKwsJi06avlE7xsJBlQ1wccygDDxcKygAAM+PoaP/oow2UTgEAqEJqtbpOnTpKpwBgxgoLi5WOAAA11h3nUAYA/LsNG3bMmbNUCGFjY+3v7x0W1nrEiCctLbVVdLnFi9etXLl5z57vrKz+/kLZCy+8P3/+W7a2fy9aOHbseyqV6ptvZt1yYG5u/ogRr0+d+kxYWBvjnqKi4rKTCCEmT/4oIMDn9defu6c8M2cu2Lp13+37Q0MD1qyZe2/3do+uXIn95Zc906aNqdKrAAAUlJWV9corryxfvlzpIADMkl6vHz789T17vqvSq2Rl5YSFPW+cqc/Nzblhw5AXXhhct25glV60ghYuXLN9+4GtWxfdvHPLlt9nzVpYbvvff19hb29bXekAmD0KygDwQD744GUhxKlTFxcuXHvmzKWqW/rj8cdbLl264dSpi23bPiqESEvLPH068siRP41l4ry8/HPnoiZPfur2A62tLdu0ecTX19O4PsnQoa8++2y/Pn06lzVo0aKhp6frveYZOvSJ9u2bCSHi4pIXLFg9btyQoCA/IUQ19ESLikquX+dbdQBQw0mSpHQEAOZKkiR/f+/qudbgwd0fe6xxYmLq+vW/TZjwwfr1n7u6OlXPpe9VixaN5sx51bj97rtfPv54y7IRJyxdA+CeUFAGgAfSvn0ze3vbnj07ODrarVgRfv16or+/T1VcqF69IA8P18OHzxgLyvv2HTf+09gLPHLkT71e36FD85sPkWVZkiS1Wv3OO+ONewwGw7VrCbec+dln+99Tkhs30l1dnRo0CGnQIEQIce7cZSFEs2YNWrRo9MB3WSF16gTMnDmxeq4FAFCEk5PTd99V7dBCADWYSqVavvyj6rlWw4YhXbq0FkI0a1b/mWemnzhxrkePDtVzaWNvv+Ltvb3dvb3djdvvvfdlYKBvWUEZAO4JBWUAqByPPFJPCJGWlhUVdX3lys3R0XF2djYdOjSfMuVpW1sbnU7XuvWI9u2bxcQk3LiRHhTkN2HC8HbtmhmP3bBhx8qVW9LSMn19vYYN6zFgQJgkSRs27Pjkk+8+/njqF1+sTE3N3LTpq44dmx86dMo4N8X+/RGOjvaHDp3W6XRqtfrQodMBAT4BAT7Tp8/Lyspp3LjOhg07XVwc58x5Zfjw14UQ8+a91aFD8yeeGCeEmDVr4axZC8eMGThhwoiuXcdkZ+c+/XSfqVOfOXbs7OzZi156adjixeuysnKbNq07Y8YEDw9XIcTOnYeWLFmfnJwWGhpgbD9oUPc7PYpyb2fPnqO3PxYhxPPPv9emTdPz56MiIi7Y2dkMGfJEYWHRpk17S0t1Xbq0mj79BY1Gc8v5LSwsGEMBADWbXq9PTU318vJSOggAc3XgQETHji2q84o2NtZCiJKSUuMUc19/vfq33w6WlupCQ/1feml48+YNdTrd3LnLd+06otfrH3mk3syZE52cHO6183xLb3/DhvlFRcWLF6/bvftIWlqWv7/38OE9Bw7sZoy0cOGan3/eLUmiX78uEyeO/Jfqc0TE+S+/XBUVFevu7tKlS6tx44bY2FhfuRL75Zc/XrhwxVg3f+210QEBPsbTXr+e5O7uvH37H3q9vmnTuunpWQMGTE5NzWzUKHTmzAleXu7V+OABKIBF+QCgcsTEJAghPDxccnPzGzUK/c9/pr7wwuBff/1j0aJ1ZW00GvWSJTPWr/+8Xr3AV1/9xDhYeMWKX776avWoUX2WLJnZt2/nTz/9buPGncb2BoNh48adS5bMXLfuM09Pt06dWsbFJcfHJ+fnF5w4cf7115/Lzc0/fTpSluXDh0+XDU8+ffpSQUHRhg3zvvhiur+/95IlM8sCrFv3mRDitddG//bbN6NH9xdCLFv2gY+PR1mDlJS0PXuOfvvt7JUr58TFJX/99WrjhB5vvz2/V6+OK1Z87ObmnJSU1rnzY3d6Dne6nX95LKtWbRk+vNeWLQtat266aNHatLSsdes+mzPn1e3b/9i4cdftl4iMvPrmm58/8L8xAIDpys7OnjBhgtIpAJgrvV5/p8mCK11xcYlOp4uPT/7669XW1lbGbxO+8cZnR46ceeedcV9+OT042H/ixA+jomJ37TqyYcPOCROGf/75m76+Xg4OdvfXeb65ty/L8tSpc7Zu3f/CC0O+/XbWsGE9yiagS05Ou3EjY/78t559tv+KFeE7dhy80y2cORM5adJHzZs3/OabWZMmjdy168js2YuMA70tLFTTp7/w7rvj4+OT33rrnx747t1HvL3df/75i0mTRh48eCotLfOLL6avXDknOTlt7twVVfzIASiPEcoA8EDS07Py8wsiIi4sW7axUaNQX18vX99/hlP99de1EyfOl73s2LGF8df106e/cPTo2bVrt0+ZMmrp0o1TpowaPPgJIUSjRqEJCSnh4XuNL4UQ06aNKSv4tmjR0NbW5tCh087ODnZ2Nt27t924cef+/RG2ttYZGdmdOrU0NnNysp86dZRa/fdPeOM4grK3hBB2djZubs7GPbVr19Jo/t/fBdOmjTF+FS4srM3evceMw5P9/b2ff36QEOL110fv23c8IuJC9+7tbn8ahYVFd7qd/v273umxPP74Y61bNxVC9O3beevWfaNH93dxcWrd2qlx49Dz56Nuv4pWq3F3d77Hf1EAAHOi1WqbNm2qdAoA5kqlUg0d2qN6rjVnzlLjSt1+fl7z5r3p5uYcEXH+8OHT69Z9FhzsL4Ro2rTen39e2rp1X2hogBAiJMS/adN6zZo1uO/O8829/WPHzkZEnP/88zeNw7EbN65T1szDw9U4TVyjRqG7dx85evTsnebiWLBgzeOPt5wyZZTxDBYWFm+++VleXn5QkN+8eW8Z25SW6t5+e35WVo5xVLWXl9vIkb2FEIMGdZ8zZ6mrq5Nx3r8ePdqHh++pyucNwCRQUAaABzJ48FTjRrNmDWbPniSEKCkp2bBh586dh9PSMg0G2fitt1uo1erQUP9r1xL/+iumqKh40aK13377k/GtoqISGxsr47YkSYGBvmVHaTSaNm2aHj58xs7OpmPH5hYWFt26tV21aoujo52jo32TJn93HwMCfMqqyQ/Cyck+KyvH2H2UZfnmtywsLMo95F9upyKPRaX6f9/Cs7W1Liwsvr1ZnTq133+fYWsAUJPZ2dnNmDFD6RQAzJUkSePHD62ea40fP7RjxxZvvfV5rVqexjVF/vzzLyHESy/NLmuTm5sfEuI/ceKIXbuOjB8/q3fvTmPHDvLycr+/zvPNvf1z5y5LktSmTTm/gbu5a+3q6mTs2N9OluVz56I0GnX37mONe/R6gxAiIyPbzs52377jP/204/r1JLXawjiYxlhQLnPLNBq2ttZFReV04AHUMBSUAZguSZK02lvnzzU1ixa9r9VqfHw83N1djHumTfssOvr61KnP1K1be82a7Tt2HCr3wOLikrLu2jvvjG/atG7ZW8bumvEJ3NJF69Sp5UcfLVGrLT788GUhRNeurefOXb5hw8527R4tK/LeUpa9b2WXfvzxxzZt2rt27fbHH2+5ePF6Ozub5s0blHvIv9xOBR9LReTl5cfHp9SrF3TfZwAAmLiioqIdO3b069dP6SAAzJIsy8uWbRw7dnA1XMvLy61u3cDXXhs9deqc3buPhIW10ev1KpVq5co5Nw/CsLLSarXaL76YfuTImQUL1gwf/vry5R/dX+f55t6+wWBQqVT3tDTfLWRZ1uv1w4b1GDWq7837XVwcf/hh06JFa6dMGdW6ddPIyKvvvfflLaNMborEfKrAw4X/5wGYLlkufxyrSalXL6hp03pl1eSkpNRDh05NmzYmLKyNn5+3h4dLuUfl5eVHRl4NCfEPCfFXq9WXLl11c3Mu+3PLr/1v1q7do6WlOr3e8NhjjYUQbm7OzZs3TE/PquCSI8buZkFB0T3dY4cOzceNGzJ37vLevSecO3d57txpd0p4p9up4GOpoGvXEhcsWPMgZwAAmLiCgoLvv/9e6RQAzJXBYFizZnt1XrF9++YdOjT/7LMVBQWF9esHGwyGmJiEm7vEdnZ/T23cps0jS5fOlmWxbdv+B+88160bqNfrT5+OvO/kKpWqbt3A8+ev3JzBzc1ZpVL99NOOIUOeGD68V+3atXx9Pf/lJAaD4b4DADBHjFAGgMrk5GRvZWV54sT5kBD/v/6KWbv219zc/IKCQuNQ6/37I3x9vfLzC5YvDy8pKX3qqd6OjvYjRz65cuUWa2urxx5rnJqasX9/xLvvjr/TnBUODnaPPlrfycleq9Ua93Tr1ubMmUutWzepSDyVShUS4r958+/16gUWFhYbZy6+q/Pno3buPLx06eyAAB9nZ8d/aXmn27nTYzGuhX2vXFwcO3Zsfh8HAgDMhbW19YgRI5ROAcBcWVhYfPrp69V80ddeGz1kyCtLlqyfMmVU48Z1Zs5cMGnSyNq1a0VFxaakpL/44rC//or58cetfft2zszMKSoq9vR0e/DOc/v2zerVC3rvva8mThwRGOgbGXk1Nzd/zJiB95T8xReHTp06Z8aMr/v371paWvrrr3+MHTu4Vi1Pb2/3yMir164lZGXlfPXVaiFESkp6SEhAZT41AOaJgjIAVCZra6uPPpoyd+7yn3/e1bhxnVmzJs2b9/2aNduffbafcdIx41oWdesGfv31u8a17yZPfsrT0/Wnn3Z8881Pbm5OHTu2KC3V/cskyJ06tXBx+aeq27Vr6/37I8qGPNzVjBkT3nvvq4kTP/Tyclu7dm5FZlu2t7ctKSl98cXZkiQ0Gk1IiP+bbz5ft25guY3LvZ07PRbjQn/3ysfHY8iQalplBQCgCGtr6yFDhiidAoAZM656V518fb1Gjer7/feb+vR5/Ouv31m4cO3ChWszM7P9/X369+9iXIIvNjZx6tQ51taWAwZ0HTCg64N3ni0sLBYsePeLL1bOm/dDSUlprVqexmvdk/btmy9a9P6CBWsmT/7IysqyWbP6xhHHb789bubMBcOHv+7n5/Xii8NcXR3Xrv21XbtmlffMAJgrqdwZcMLDO/Xvv0KJPADwj4SElDfe+OzHHz9ROkgl0Ol0rVuPmDFjQp8+nZXO8kAKCgrffnt+UVHJ4sUKrJU0YsTrGo26qKi4oKDIxcVRp9Pr9fp16z6v/iQAYDp27Zretu1cW1vfCrQ1G/n5+d9///2ECTVtCdbExANxcZtatZqkdBCgxhoyZKpWq7WwUMXGJvr6eqnVFgUFRT/9NE/pXDXZlSuxc+YsXbr0A6WDAKh84eGj+/fff/t+RigDAO7iypXYd9/9sn//rkFBvikp6ZcvxxpncK5+ly9fK1tyJDk5TQhRv375A6UBAGatuLh49+7dNa+gDKCqFRWVxMQkGLf/+ivGuGie0qEAoKahoAwAuAsrK8sGDYK//35TZmaOm5tT166tJkxQZl7Lli0bHT9+rmwVaUtL7fDhTyqSBABQpRwdHefPn690CgDm57HHGm/atLfspSzLbds+qmgiAKiBKCgDQHVQq9URET8pneI++fp6vf++SYwRGzas56VLMXl5BcaXtWvXevLJTkqHAgBUPgsLC39/f6VTADA/Tz/d99ixs8avsgkhvLzcn3qqt9KhAKCmUSkdAACAiurcuVVQ0N+ThFpaavl4AAA1VXZ29uTJk5VOAcD8BAbWat26SdnLdu0eDQjwUTQRANRAFJQBAOZk5MgnbW1tjJ8WevXqqHQcAECVMBgMKSkpSqcAYJZGjeprnDfZ09ON8QcAUBUoKAMAzElYWNvgYF+NRjNiBLMnA0CN5ezsvH79eqVTADBLAQG1HnusCcOTAaDqMIcyANMlSZJGw48p0xKRmRtbUJSl08myrFSGWs8OTDl1Malh6NJriUplsFerXbWaJo52HpYapTIAQI00bNgwrVYrhMjPz7e1tZUkqbCw8KefzHUdAuAhFJlXEJ1XmFmiKzbolcpg1z/Mw87WultbBbuL1ioLJ6062Namnr21UhmqB5/agIcQ/88DMF2yLJeW6pROgb8ZZPndi9esLFSWKslWrRbKFZQ9fb16+noV6RULIIQwyLpLuQW7bmT29HR53N1JwSQAUMMUFRVFR0ffvCcoKEi5OADuzYLohIxSnUoSblpNqUGx3prW1qZXv65CCAV7jHqD4VpB3snMXHdL7UtBNXmgNJ/agIcQBWUAwN3pDPKbF642dbCrU9NHWFRcE0chhPglMU0ny2EezkrHAYAaokmTJvHx8ZIkGV9KkhQWFqZ0KAAVsuhqYo5O341+0T9shRC7b2R+cy1pXG1vpcMAQKVhDmUAwN0ti01qYG9DNfl2A3zcdtzIuF5QrHQQAKghhgwZ4u39T9klMDBw4MCBiiYCUCF7UjNzdPrOfHPrNmEezhnFpb+nZiodBAAqDQVlAMDd/Z6aFWJLNbl87lrN4YxspVMAQA3RpEmTRo0aGbeNw5Pd3NyUDgXg7valZvlZWyqdwkT5WlvuT6O7CKDmoKAMALiLlOKSYFsrSwv+yiifj7VlWnGp0ikAoOYYMWKEu7u7cXjyoEGDlI4DoEJ0BtmLxYrvwMtSW2IwKJ0CACoN1QEAwF3oDIYbJayz8W/SS0qUjgAANUfTpk0bN25sHJ7s6uqqdBwAFRJbUKSiwHAHFirBDGkAahJ+3gMAAAB4UDm6kpiCHCHEmay0n+KvLI+NjC/ME0JsSLiyPDbSIOQCvW55bOS25GtCiOi87OWxkWey04QQf6QlLo+NTC4qEEKsj49aERsphBjyzCj3YX1cw9oLIS7nZi6PjTyTlSqE2JeasDw2MrW4UAixNj7q+9hLQojMkuLlsZG7b8QJISJzM5fHRp7PSRdC7L4R93NidHZpiSzE+ex0nczwQAAAgEqgVjoAAAAAAPN2JCN5ydXzQhKSEJmlxfn6UklIa+IvCyGELIQk/t4WQkjSF9F/GjeNO2VZSP9rIAshZLHa2LhN08VJUUuSooSQhJBXx1+W/ne5NXGXxf9e/Bj/l3FDFuKTqFPG7Zsbr4677KDR5pWUWKrV85t00MuyhyWrAgAAANw/CsoAAAAA7sfRjOSFV89ZWVhcK8i9eb9UVu4V4uZNYawf/3/SLW3LOVS+9TS3nPO2fTdv5+hKcnQlQgihLxlxYkeorZO1hfrNOs3cKSsDAADcF6a8AAAAAHBvskqLL+dlLom5kFxccEs12cRF5WedzUn77+VT0QXZSmcBAAAwS4xQBgAAAHAPrhXkfhX9Z2Ruhu624cbm4mxO2rRzh7y1Np816WBlYaF0HAAAAHPCCGUAJs3Pz0vpCAAA4B96Wf7k8slzOenmW002ytOVRhdkL4k5p3QQADBvKpWqVi0PpVMAqFYUlAGYtLi4ZKUjAACAvxUb9AuunruSX0MmizAIsS0l9rtrF/VmXhwHAAUZDIaEhBtKpwBQrZjyAgAAAMDdyUJM/vPAtYIcpYNUsg2J0Vfysz9u2EbpIAAAAOaBEcoAAAAA7u5IRlJSYb7SKSqfTjbEFOQkF9XAWwMAAKgKFJQBAAAA3N2h7TRlKwAAIABJREFU9KRiWa90iiqRXlK0KekaE18AAABUBFNeAAAAALiLdy4cOZ55Q5KUzlFltiTHWKksng2op3QQAAAAU8cIZQCA6ZJlOTM17b4Pz0hNrdQ4APCQSi8pSiouMLVqcuz6LQf6j6mss5UY9H/lZVbW2QBUmwfsLuZkZZWWllZqIgCo+SgoAwBMVFJs3GuDR/783Yr7O3z2i5P+M/nVyg4FAA8jF41lgU6ndIpb5UbF2AX5V+IJmfACMDsP2F1cs2Dx5D4D87Nr2lqjAFDVKCgDAExUTlbmjcTE+z78yrkLFAcAoFJsSb6WUVqkdIpb5UbF2AUFVOIJz2Sl/pF2/3/vAKh+D9hdjLl0Wa+rmVPDA0CVYg5lAIDy9Hr9uoXfHNqxsyAv38ff76kpk1w93D94cbIQ4vfwLb+Hb2nQovnbX31+8eTp5Z9+npqUbGNnG9qo4chJL3n6+QohZo2fGHX2fJcBfU8eOJiblT11zofL//uZwWBIio17uk0nlUr1w6Hflb5FADBjO1Ouy7IsVfacF/rCoqsr1ifvOVicnmHt5RE8ZrhXt475sfGnXp1VZ9JzCVt2Zv55UeNgX/fl5z07txVCFGdkXVmyKvXQCSEb/If2yY+NCxjapzLzCHl/WkIHN59KPCeASnRg26+bf1iVnpTi5O7Wc/iQpq1b3d5dzMnM/Hza2/ExMZJk4RsYMGjcmEYtWwghflr87abvV7Xp3vVq5KW0pOQew4ZkpqVdjDgphJjUZ6AQ4r3FX9Vt2kTpWwQA80BBGQCgvD+2/7Z99Vr/0JCuA/pGX4x0cHayc3LsMqDv3l82Bzeo37xjB3cfLyGEg7OTh493k9aPpcQnnDxwMPl63H/X/FB2koj9fzw5Ymhebl7TNq36Pvv0D59/Ye/o0GP4MJXKxGb9BACzopdlF61V5VeTi0tOTnlfX1JaZ+Joax/PuF9+vfDfBZ5hHVRWlkUpqVGLvg95cVTQ6GGRny2+vGCFZ+e2pTm5J156S6VW1399vNbJMfKzJXKpzrZSp7wQQjSwd6ncEwKoLMnxCd98OMfeybHX0yMyb6RqtNpyu4t2jo7WtjYdevYoKig8uvf3ua+9+dmGNa4eHsaTHN+7r+eIYRZqdafePeOvXou+EJkSn9Dv2actrW3cvL2UvkUAMBsUlAGYLkmSrK0tlU6B6pCfmyuEqNOk0YAxo8tqFu2e6Lb3l83+dUL6PvuUcY9vUOC0zz8RQhTk5X71zsxzxyNSk5Lcvb2N745/562mbVsbt7sNHrBy3pe2jg5lxwIA7o+FJHlb24jKXq8udvUveTHXW6+YZ+XpURCXWJSSZu3lIUlSSWa2EKLpR2/ZhwYKIdxaN4/75VchxJVvfyzJzG63dpGli5MQImBYn4ufLLKr7Ve5qbQWFpV7QgCVpTA3Twjh5u3V5+mRVjbWxp23dxdVKtWbX3wmhCjML/D0q7Xhm2UXI0526NXT+G6P4UOHTxhv3Pbw8flt3YaU+IRugwc6ubkqdFs1gSRJWq1G6RQAqhUFZQCmS5blwsJipVOgOnTq/eQf23fs3hh+IeJkrxHDHu/bu9yhcNejrvz0zbLzJ06WFv/9H0Z2RmZZQTmoQf3qTQ0AD4V8XempzNRKP23y7j9UGs2hYS8JIYRK5da62SNz3hZC5F2JkdRq28C/K8XF6ZnWXu6ywZC8+49aT4YZq8lCiPzridY+nhZWlfyL540JV3p71a7ccwKoFIH167br0e3Qb7umDhwaNqDfk08Pt7a1u71Zblb2hm+XHdm1tyA317gnK+2fX4gFN6hXjZEfFrIsl5SUKp0CQLWioAwAUJ6dg/0H3y3Zt3nb1lVrls2ZG3/12qhXJt/SpjA/7+PJr+hKSweOGe0d4Lfn5/BzxyP0Ol1ZA5UFK80CQOWzUqvzdCWVftrClNSgZ4bU6ttdl59v6eZiYfl3aTgv+rqtv49K/ffnlLyY67ZBAcVpGbrc/LIqsxAiJzLKrrLnuxBCOGr4ahRgul6a8W7rsK6blv8QvmLlmSPHZi1bfHubBe/POn/iZNigfg1bNL989vyva9br9Td1F1V8CwEAKgGfvQEAJsFCre42eMDHK5epNerDO3cJIdQajRAiP+fv0SXX/orKy85p2y2szzMjW3TqYGVjaxwQcccTajR5Obn/0gAAUBEWQhrhX7fST6t1csyPjdM6OdjU8i6rJgshcqOv2QUFGLdlvT4/Js4+KEBSqYQQhv99PSU/Nj7rbGRZs0o0OZgluQDTZTAYHm3XZsa3C4Mb1L/21+Wk2Ou3dBdlWb4QccrL33f066+2fLyTu4/3v3cXNRq1ECIvN6cabwIAagJGKAMAlJeamDRnymstO3cqLSnRleq8A/yFEJ6+vhqN5tQfB7/9+JPsjIznpr2iUqlOHPjDyd01KzU9Yv8BIURGyo07ndMvKPBq5KXP33jb1t6+2+ABwUyIAQD3q4Fd5S9V59amWcLmXfahQU6N62eeOe/WtqVdoJ8QIu9qrGuLpsY2BfFJhpIS2yB/SzcXGz+f6xu22dcJlg2GyE8XyXp9pY9QtpAkrcToRcBEGQyG98eMD25Y39HFOTk+Xmtl6erprtfLN3cXX587x8PXJyUuYdUXX2u0mr3hW4QQGTfu2F30DQ7688ixxbM+DmpQr3bd0C79+lbvPQGAuWKEMgBAeTlZWbb29r+tXX9g62+PtGvz4vtvG+fBGPPWNEdnlz+2/xYXFe3q6Tn2nTetbWy2rlpz/Ur0C++8ae/keD7i1J3O+dy0V/1DQ84ePXbm8BGVxN93AHD/ovOzbSwqeSRKyLhRXt06xqzaeHLq+zcOHtc6OQghitMySrNybP9XKc6LjhVC2AX6CyGafDBN6+R46pWZkZ8u8ujUWghhF1jJI5SdNVZ5+sqf3ANApcjPyXV2dzv0284tK9d41PJ59ZOPrW3tbukuCiEmzHwvsF7dveGbI/YfHPzC84H16l4686dery/3nL2fGtG0bauEa7GHd+zSlZbfBgBwO6ncb3+Eh3fq33+FEnkA4B8JCSlvvPHZjz9+onSQh11CYdHMS9efD/CqSOPYqKgju/be6yUGPPeMpbX1faWrkFMHD18+e67ct1w83DNulLPYlLu3d9cBFRqlciW/MDqvYEb9wAeOCQD3Y9eu6W3bzrW19a26S+TqSiee2ZdcXFB1l1CcLEQbZ8/ZDVpX6VUSEw/ExW1q1WpSlV4FUMTI4xfHBXlbVWCS4rTk5N0/byr3rTt1zIQQ7Z7o5hcc9MAx7yg+JubgrzvLfcvCwqLckrRKkoa+NK4iJy/U65ddS17VsmZ+Ye7Kldg5c5YuXfqB0kEAVL7w8NH9+++/fT9TXgAAKlNCTOzWlavLfcvN2zMtKaXct3qNHFalBeULJ07uWL+h3Leatmn155Fjt++v+0jTChaUAaDGs1drZtR/7KUz++7UIPXQifMfzL99v9bFqSQj6/b9rb+fb+3p/uDBUg9HnJ897/b91j6ehYnl/I3TbP4sx3oh5Z7K1kL9asijDx4JwF1lpKbdqbt4p46ZECKwXp0qLSgnX4+/Uyq/4KC46Ku371dr1BUsKANADUNBGQBQmdp2D2vbPUzpFLca9crkUa9MVjoFAJgxF41VqJ1TVF451WEhhHu7lp1/+7HaQwn3ti0q67puWmsnrWUFGgJ4UHUaN1p1pJzxbspq0amDCaYCANPEnJIAAAAA7sJZa9nIwUWUM1teTWCpshhSK1jpFAAAAOaBEcoAAAAA7u6lwMZOGsvlsZFKB6lkFpI0q36rZk6VMAUHAADAw4ARygAAAAAqZIB3UAsnD6VTVLLHnD2pJgMAAFQcBWUAAAAAFWJloX63bks3rZXSQSpNgI39rPqtlE4BAABgTpjyAgAAAEBF2ajVix7p/Ed6wlfRZ816RmVLlQUzXQAAANwHCsoAAAAA7oGjRtvbKzCuMP+PtIS0kiKl49wPL0ubPt6BVJMBAADuAwVlAAAAAPfspcBGLZ3ciw2GxTHnUooLlY5TUYE2Dq+EPKKWVCF2jkpnAQAAMEsUlAEAAADcjxbOnkKIOvZOP16/fKO4oMigjyvILdTrSmSDEELIsixJkrGpLAshCalsWwhJkoWQZFkWQpKkmxpIQjaIv4+ThSz+ty2ELP+9Xbbx97YQomy/6u+jxN9tLFUWViqLQFsHJ42lh6XN4FrBThrL6n5SAAAANQgFZQAAAAD3z11rPTWkqXE7Ki8rrjC3tYu3jYV6e/I1W7Wmk1ut9JKiYxnJftb2jR1dL+dmXsnPburoVsva7nB6UlZpcRd3P61K9VtKrJ1a09Gt1o3iwojMlAAbh4YOLpdyM6/mZzdz8vCysjmUnpRdWtzNw18IsevGdUeNZTtX75TigpOZN4JsHevZO1/IyYgtyGnh7Olhaf1HWmKurqSHZ0CxQX8sI8XX2o7xyAAAAJWFgjIAAACAyhFq5xRq52Tc7uVV27jhqrUq265j71zH3tm43dbVu+zAsgYeltZl2/Xsnev9r3G78hp7WtqUbTd0cGno4GLc7uDmY9ywtlA/7l6rau4VAADgIaVSOgAAwNRpVCoHtYXSKUyXLMtullqlUwAAACjJ19pSb1A6hKkqNRhqWTHZDoCag4IyAOAuPCy1KUUlBXxEuIOEohJvCsoAAODhZqu2SCouUTqFiUoqLnXQMD4DQM1BQRkAcHc9vVzPZucpncJEXcotCPNwVjoFAACAknp5ul7MyVc6hYm6lFvYy8tV6RQAUGkoKAMA7u4pPw+dLI5l5CgdxLTIQqyNu/FmqJ+DhjUJAADAQ62li30HN6dNiWlKBzE54YlpXdycHnWyUzoIAFQaPgADACrk9VDfL6MTNiWlS0JYW6gk5ZLIsjDIBguVkr8TtbKwSCwsEkK8EOhd38FWwSQAAAAmoqenS6Fevy05o1ivd7e0LDboFQyj1xssLJTsLlpaqFKLSiwtLFq7OHTz5NtsAGoUCsoAgIp6ObjWtYLi+IKizFK9ELJSMRITbxw8eHLo0J5KBRBCWKktnvB0rm9vo5YULK0DAACYloE+7u3dnK4XFKUVl5YaFOsuyrLhyy9/nDJllFIBjOtau7mra9tYu2kpvACoafi5BgC4B7VtLGvbKLxE9fn0jPNXYvt4Mw8dAACAyfHQajy0GmUz6PX6L06cpbsIAFWEOZQBAAAAAAAAABVCQRkAYH7s7Zm2GAAAAOWTJMnTk+HJAFBVKCgDAMxPbm6+0hEAAABgomRZTklJVzoFANRYFJQBAGZGpZJ8fNyVTgEAAAATJUlSw4YhSqcAgBqLgjIAwMwYDHJiYqrSKQAAAGCiZFm+cOGK0ikAoMaioAwAMDOSJHl7M0IZAAAA5ZMkqUGDYKVTAECNRUEZAGBmZFlOSmKEMgAAAMony/LFi9FKpwCAGouCMgDAzEiS8PJyUzoFAAAATJQkSfXq1VY6BQDUWBSUAQBmRpZFcnKa0ikAAABgomRZvnTpmtIpAKDGoqAMAAAAAAAAAKgQCsoAAPOj1aqVjgAAAAATJUmSlZVW6RQAUGNRUAYAmJ+SEp3SEQAAAGCiZFkuKipROgUA1FgUlAEAZoZF+QAAAPAvJEmqXz9I6RQAUGNRUAYAmBkW5QMAAMC/kGU5MvKq0ikAoMaioAwAAAAAAAAAqBAKygAAMyNJwt3dSekUAAAAMFGSJAUH+ymdAgBqLArKAAAzI8siNTVL6RQAAAAwUbIsR0fHKZ0CAGosCsoAAAAAAAAAgAqhoAwAAAAAAAAAqBAKygAAAAAAAACACqGgDAAwM5IkubmxKB8AAADKx6J8AFClKCgDMGl+fl5KR4DJkWU5LY1F+QAAAFA+FuWrTiqVqlYtD6VTAKhWFJQBmLS4uGSlIwAAAAAAymcwGBISbiidAkC1oqAMADAzkiS8vNyUTgEAAAATJUlSgwZBSqcAgBqLgjIAwMzIskhOTlM6BQAAAEyULMsXL15VOgUA1FgUlAEAZkaWDRqNWukUAAAAMFGSJFlbWyqdAgBqLArKAAAzI0mq0lKd0ikAAABgomRZLiwsVjoFANRYFJQBAAAAAAAAABVCQRkAYGZYlA8AAAD/gkX5AKBKUVAGAJgZFuUDAADAv2BRPgCoUhSUAQBmRpKEp6er0ikAAABgoiRJqlMnQOkUAFBjUVAGAJgZWRYpKelKpwAAAICJkmX58uVYpVMAQI1FQRmA6ZIkSaNRK50CpsjZ2V7pCAAAADBRkiT5+3spneJhwac24CFEQRmA6ZJlubRUp3QKmKLMzFylIwAAAMBEybJ8/Xqy0ikeFnxqAx5CFJQBAGZGkoSHh4vSKQAAAGCiJEkKDfVXOgUA1FgUlAEAZkaWxY0bGUqnAAAAgImSZTkq6rrSKQCgxqKgDAAwM5IkvLxclU4BAAAAEyVJUt26tZVOAQA1FvOmAwDMRq9eL2o0FkVFpQUFhcePnzMYZEmSNm9eoHQuAAAAmIQePcZpNGpJktLSMvv0mSBJUnFxyY4d3yqdCwBqFArKAACzkZKSJkmScTshoUiW5bZtH1U6FAAAAEyFg4Pd1atxxu2kpFQhRGCgr9KhAKCmYcoLAIDZ6NCh+c0vnZ0dnnrqSeXiAAAAwLSEhbVWqf4pdEiS9MQT7RVNBAA1EAVlAIDZePrpPo6ODmUv69ULbN36EUUTAQAAwIQMGvREQIB32cvatX0GD+6uaCIAqIEoKAMAzEaLFo0aNQo2bjs62o8cyfBkAAAA/MPV1bFbt7bGSdIkSerRo72Tk73SoQCgpqGgDAAwJ6NG9XVwsBNC1K8f2LZtM6XjAAAAwLQMHtyjdm0fIUTt2rUGDmR4MgBUPgrKAABz0qJFo8aNQ+3tbUeM6KV0FgAAAJgcFxeHbt3aSpLo0aOds7NDBY4AANwbtdIBAODfeHq6KB0BIqm4ZGdyZlpJyY2iEqWzCCFEybDeDh1ubHZw2nwuWukswk5tYaO2aOhg24P/VgEAwENsQ/yNK/lFmSWlSgcRQghdy0e8HJ0i6wa+aQLdRSGEs1YTbGs9xNdd6SBVQpIkNzdnpVMAqFYUlAGYtJSUDKUjPOxOZ+UuiUmqY2fta2Ppb2OtdBwhhBAujiLAR+kQf1MJOaNUF5lbsDUp/YumIRaSpHQiAACAapVaXPLi6aj2ro7e1tpgOxul4/yttYcJ/bK/QK+PLyoecuzC4kfruGo1SsepZLIsp6VlKp0CQLWioAwAuKOjGTnr4m+MDvBSOohJq2VtKYTwstJOPnNl4aOhSscBAACoPmnFpdMvXHs11Jdfq9+NdQsn+zfOXZ3bONhZSykGgHljDmUAQPkK9foNiWnDfD2UDmIeAm2sHnW2++F6itJBAAAAqs+imMT+3i5UkytCo5L6ebsuvJqgdBAAeFAUlAEA5buUWyhkWekU5sTTUnM8M0fpFAAAANWkQG+4mFPgZqlVOojZ8LDS/pmdX6Q3KB0EAB4IBWUAQPlSiktqWfHx4B54WGq1kkQNHgAAPCSuFxTVdzCNNTbMRwMHm+sFRUqnAIAHQkEZAFC+EoNcaGD0xL25WlDMFz4BAMDDI71Ep3QEM5NRqhP0FwGYOQrKAAAAAAAAAIAKoaAMAAAAAAAAAKgQCsoAAAAAAAAAgAqhoAwAAAAAAAAAqBAKygAAAAAAAACACqGgDMB0SZJkbW2pdAoAAAAAQPkkSdJqNUqnAFCtKCgDMF2yLBcWFiudAgAAAABQPlmWS0pKlU4BoFpRUAYAAAAAAAAAVAgFZQAAAAAAAABAhVBQBgAAAAAAAABUCAVlAAAAAAAAAECFqJUOAACoIeKir67+auGV85EqC6lWYO3hL42v07SxspEWz/448uTpD1Z84+DsrGwSAAAA5GXnrFmw+MzhI0UFBe5eXk8MG9y5Xx9lI/2+acvqLxdNnfNBw5bNlU0CAGaEgjIAoBLk5+bOmfJaTkZm0zat9Dpd5JmzBfl5SocS2enpebk5paWsOg0AAKC8Be/POnc8Iqh+PWd3t8vnzifHxSudSORkZRcW5JtCxxUAzAgFZQBAJbhy7kJ2ekbnfr2ff2uaECIjNdUUBgW/9tl/S4uLrG3tlA4CAADwsCvMLzh3PMI7wG/WssWSJBUXFhYXFSkdSvR79umu/fvaOTooHQQAzAkFZQBAJdBYWgoh4q5cLS0p0Wi1Lu7uxv3nT0TMefm1J4YOHvXKZCFE1PkLs16Y0KnPky+8/cbX7808uvv3Bi2aX70YKQmpziONRkx8yTco0Hjg4R27Nv3w4424eGcP9+Yd2w98/llrWzudTje6Q1dXL6+genXPnzgpZLlz/97bV68b9MKYAWOeNX6PclLfQc5urnPX/zh73MToi5FCiIW/bnJwcjqw7dfNP6xKT0pxcnfrOXxI9yGDhBCnDh7+ZdmK+OirVrY2j7ZrO3LyBDtHh9uv8u2e7Yo+XQAAALOn1mpUKlV2emZacrK7t7eltbWltbUQwtj18g8N+fiHZcaWz3UMc/Zw/3zDmj0/b1r+6echDRukJifn5+b5BQX2f+6Z5h3bG5tFX4xct+jbqHPnbe3t6jRuPOylFzz9fIUQs8ZPjDp7vsuAvicPHMzNyh449rkNS5Y2aNHs7a/mGQ98b8y4a5cuz1236vct27auXC2EmPzRzFZdOpfbXUy4dm3NV4siT5+RJCm0caMRE1/0Dw25/SoLt4VTlQbw8GBRPgBAJQht3LBW7YArFy6+N2b8ueMnKn6gRqMeOHZ0s45tzx45/tGkqfm5uUKI39b9tHDmh0KILgP6+QUH/bpm/ZyXXzcYDMZD0pOT01KSB70weshLY598aoRGozmw7VdZloUQ+zZv1ZWU9Bw+xMLC4vG+Tzq7uRkPSY5P+ObDOfk5ub2eHlH/0Uc0Wq0Q4s/DRz+fNj0tOTls8ICQRg0PbPv1i7ffKwt281Uq+2kBAAA8dDQaTftePQry8t4fM37Pz5t0Ol0FDyzIy+85Ymi3Qf2Tr8fPe/Odv/48a6wmf/ji5Ljo6I5P9mj8WMs/jx597/kX01NSyo6K2P9Hr+FDnnxqeJ9RIwPqhF6MOHUjMdF4YEzkX807dvD0823Usnlok0bG9uV2FzNSUz94cfLZo8fbdAtr0z0s8tTpjye/kpuVfftVbB3sK/uBAYDpYoQyAKASaLTa6V/N+/GrBUd27vnvlNcfbd92wsx3KjLXxPCJL/oFBwkhrG1td28MP7b397bdum34ZpmTm+vrn/7H2JVXa9TH9uy7dPpP4yp/GkvLNz7/1N7J0XiGNk90O7B1+4WIkw2aN9sTvtnG3r7jk72EEJ379dm3eVtmWpoQojA3Twjh5u3V5+mRVjbWxgM3Ll0uhHj10/+ENmoohJj72ptnDh+NPHU6tEk5VwEAAMADeu6NV1083H9ds275p5/vWL/x5f/M8g0MvOtRYYP7dx88UAgR0qjBV+/M3P1zeN2mTdYtXFJaWjpt9gyf2gFCiIA6Iavmf71vy/ZBY58zHjX+nbeatm1t3O4xbMiSDz4+sGX74PFjd234RQjR66lhQohGLVucP3Ey6uz5O3UXd6zfmJedM/TFcX2ffUoI4elba+3Xi3dt/Hng8+VcBQAeHoxQBgBUDic314mz3p+9bIl/aMjpg4d/+Pyrezq8TpPGQojEa9evR10pKijMSkt/ZdDwSX0GTuoz8NiefUKInMxMY0uPWj4313l7jRgqhNi3ZfuZQ0dSE5PCBvQt+wxQJrB+3XY9usVE/jV14NANS5YW5ufJsnztr8s29vbGarIQov6jTYUQcVdjyr0KAAAAHpBGoxn8wpi56358rPPjibGxX0x/754WT67T+O/uoizLl8+eE0J8PPkVY3dx1fyvhRDZGRlljYMa1C/bbtOti5Ob64Ftv2WnZxzbvTe0ccM6jRvdcvLbu4tCiJjIS0KIR9q2Mrap/+gjQoi4KzHlXgUAHh6MUAZg0gICvJWOgHsT1KDe21/Nmzpw6Il9B8a9+1bFDzR+7VGWZeNGcIP6A/83wMQouGED44aFhcXN+32DAhu1bH5y34H0pGSNRmOc7e52L814t3VY103LfwhfsfLMkWOzli3WWllK0j8N9HpZCFH6v8VhbrkKAAAAKoWzu9vLH8+aNW5C1LkLcVHR/nVCKnigXq8Txv6iLOt1eo2l5dT/fHBzA+McykYqi3/Gz6k1mm6DB/y0eOmCGbNLS0uffGp4uecvp7toaSmEkP7XZTTOwFZS8s9agjdf5aGlUql8fb2UTgGgWvGzD4BJi41NUjoCKurCiZPGTrbWylKr1ZaWFOt1OmtbWyFEdka6sc2N+MQ7HX76j8NCCL/gIN+g2iqVKikuvnad0KZtWjVt0yq0cYOmbVrZ3Xlmup4jhpWWlkadv9DmiW5Obq7ltjEYDI+2azPj24XBDepf++tyUux1v6Cg/JzcuCvRxgbnT5wQQgTUrfPATwIAAADluHLhYkF+vnHbztFRCFFcVKhWq7VWljmZmcaeZFpysl6vL/fwU38cEkL4hQSpVKpaQYGlxcWSSmXsLtZ7pEnDls29fGvd6dJd+/fTWllePHna07dWsw7ty21TTncxJFgIEXn6jLHBuWPHhRCBdegu/j8GgyE+PlnpFACqFSOUAQCVIDUp6T8vv+rk5upTu3bK9bicrOzO/XqrNRrvAH9rW7vje/cv0f5Hpys9vnffLQcu/fiTwPp1rl7862rkJTdvr9Zdu1jZWHfu32fPz5vefvb55h3blxQVnTp45INli28ecnKLJq0fq1U7IOFabK+RQ8ttYDAY3h8zPrhhfUcX5+T4eK2Vpaune//nR3/6yrRPXnmj7RNh169EXzx5OrRRw4Ytmt/pMwwAAADumyzLC96fnZeV418npCAvL+5KtG9g7dDGjYQQIQ0bXDx5+tNXpnn41jq+d1/ZUsxG235cl3jtWnpK6plDR9Qa9RNaiu1jAAAgAElEQVTDhggh+o8e9fV7s+ZNm/5Y18e1Vlbnj0f0f+6ZTr173enqdo4O7Xs+sfeXzT2HD1WpyhlaV253scewIbs3blrz1aLE2NjSEt0f2361dbDvMrBf1TwhADAbFJQBAJVAZWHRrke3CxGnos6d9/D2GjH5pSeGDBJC2NjaTpz97uovFx3b+3utwNphAwfsWL/h5gOLCgr2b96utbZqHdZlxKQXjdMfj3rlZWc3t/1bt+/fss3eyan+o00NBvlfri5J0hPDB588cPBO67rk5+Q6u7sd+m2nXm/wDao97KVx1rZ2TVs/NnH2+5u+X/Xbup9s7Ow69XlyxKSXpJtnwQAAAEAl0el0rcM6H9+7P/pipJOrS7dBAwY8/6xaoxFCjHnjtaVz5l4+d/5GUlLX/n23/rjm5gMtLFQHf91lMOjrPtJ0yPjnA+vWEUK0DusiqaQtP6w+tud3S2trn9oBTm5u/x6g57ChEb/v7/hkj3LfLbe7aG0r3v768zULlhzY9pskSQ1bNh85eYKLu3ulPhgAMD+SLJfzET08vFP//iuUyAMA/0hISHnjjc9+/PETpYM8jC5diglPSNX5eIR5ulTRJb5+b+bR3b//Z9Vyv+CgKrpE9ZtzOS68dcOyl+fOXT558sKhQ6e//Xa2orkA1GS7dk1v23aure0dv8YB05GYeCAublOrVpOUDgJUjku5BYtiEkf4elTR+ff8vGn5p58/89qU7oMHVtElqt/q+BuTgnzq2NkIIU6fvvDHH6cPHjyZnp5la2uzefMCpdPdjytXYufMWbp06QcVaAvAzISHj+7ff//t+xmhDAAPKYPBoFKp8vIK9u078eWXK0NDAxwd7S5ejH7llWc7dWq5bdu+eDdXL5+q+nhQgxkMhlOnIg8ePBkRcT47Oy85Oc24zCAAAAAghLhwIXr7kdMREedTUzPT0jKNO/fsWa50LgCoKArKAFDDFReXFBWVODranTx54fTpyN69H/fychs16s2iopKffpqXnJx26VK0Xm84duysEMLV1WnRojVHjpzp2LGFn7fnlZJSpeObnyFDXsnOzs3MzClbE7zcefoAAADwcFq4cE3G+cvGjqLxn+V+dxwATBYFZQCoIYqLS5KSUv38vEtKSn74YZOVleWzz/bfs+fIJ598N3ny0717d0pISLG2trSzsxFCzJ8/3dXVSQgREuL/+utjjhz5Mzs7VwiRnp6VlpYZFXV969b9to+37jSqT9UFnvTBzEkfzKy68yvCoNdbazXZ2Xm3zMW8fv1v/v7eAQE+3t5MugcAAFAhXQf261oTV8AbNKjblrSMpKTUm4cd9Os30c/P28/Py9fXy9fX07it0VC0AWCK+NkEAGZGp9PHxiYKIYKD/Y4e/XP79gP9+nVp3rzhq6/+t6CgcPHimXq9wdHRvm7dQCFE+/bNu3ZtYzywb98uZScxVpOzsnJjYuJjYhLatn0kJibe2KOVJMlgMPj4uDfp0Ey5uzRXKguL1as/nT174b59J3Jz84077exs0tIyT5++eO1aYlpapr+/t7+/T0CAt5+fd0CAt7+/j1arUTo4AAAAqkmXLq16N2vw0UdLzpyJLC3VCSG0Wu0PP8yJi0uOi0uOj0/Zu/eocdvV1cnPz8vX19PX19vf39vX15PRCQBMAQVlADBdyclpUVGxjRuHOjk5zJq1MD+/4JNPXo+MjJ4//4cnn+wUHOxna2sdFtbGWDtesOA941GWltrhw3uVbZedrbCwKCYmISYm/n9/EgwGQ1CQX+3aPnXqBJaNqFWrLRo3rvfGG2Mj7ezO5+Qpcd/mTZKkGTMmtmy575tvNsTFJUuS5OzsMGHCCOO7paW669eTrl9PjI1NOnz49Jo1265fT3JxcQwI8CmrL/v7ezs7Oyh9HwAAAKh8eXkFn3763XNPtFu06P0FC1Zv2/ZHSkqqg4Oto6O9o6N9o0ahNzdOTk6Li0uOj0+Oj08+duzPuLjk1NQM4+Bl41hm4z+dnOyVuyEADyMKygCgvNJS3cGDJ2VZdOnS6sSJcwsXru3Zs8PQoT02bdoTH59Sv36QEKJbtzY+Ph5CiMaN6yxb9qHxwMaN69zpnAaDwVg+vno1/tq1+JiYhKysnMBA38DAWoGBfm3aPBIY6Ovi4ljWfvbshbIsu7o6hYW1mTZtjBBi/5+XDVrtnc6PcpWWlE6fPm/ixBG9ej3+yCMNZs1acObMX+HhX5c10GjUwcF+wcF+Nx+VnJwWG5sYF5d0+XLs7t1Hrl9P0un0tWv7+Pl5+/sb//j4+noqcUMAAACoTHa2Nj0HdrOThHEkhxCGwEDfoqKi48fP1a8fZG9ve3NjLy83Ly+3li0ble3R6fRxcUnGgcxnz/61ffv+uLhkvd5gLDH7+Xkbp8vw9fW8eWQJAFQuCsoA/o+9+w5r6vweAP5mEnbYEJKQQSBskKmiiLj3wl211l31i1tbpa2jjlaruKtWa53VVpzFvVDBxZAN2QHC3jskvz9em/JTa1GBm8D7eXx8LtebmwMq3Hvuec9BOkh1da2RkUFOjjQ+Ptnf393JiRUZufvp0+RLl/bV1tY9ePDC25sPAGAyaZGR8xkMOwDA3LkTNC/v0cPn/eeXyRRica5IlCsSycTiXIkkj82ms9l0DocxfHgom023tbV8z8sNDSmWlmaDBvUKCfEHAEREbFEwaO5DerfdF6BLIJGII0f2VSqbAQCrVm23sKBOmDCoqqqmubmZSv3XomN4qxAY6KnZU1VVI5HkSaX5Mln+lStZUmlefn5Ri0YZNAbD1sGBRqHoddRnhiAIgiAIgrQFHOByGE5GBvBqf8KEwUQi0cjIYO3aqMOHz//883fZ2ZKkpIxevfxsbCzefjWRSIAX+S13VlXVwBYZMln+/fvPpNJ8ubyASjX+uyMzzDXbwvIUBEGQT4cSygiCIG2soKDE1NSIQtE7depKVVXN3LkT0tIEixd/P3Zs//nzJ0okebW1dbD0YP78iV99NZdMJpHJpG++WQBf/s4Lx7cVFZVq+lcIhXKRSG5rawkvLnv39p8+fRSLZd/KgBMS0hWKYk9PpwEDgmNiYnv29AEAbNu2LKakErW8+GA4XFCQF9z87bet6ekCFxeuTJa/ePH3Q4eGzJo1LjU1x8jIwMGB9v7TGBsburvzWq55bG5WwUYZMll+XFzSuXPXpdI8ExMjJvN1ipnJtGMwbC0tzdr5M0QQBEEQBEHajKbmYOPGxXDDzMxEoSiJi0saObLv4cPnKyqqZ88eZ2Ji9J6TGBsburpyXV25LXcWFJTIZIrcXIVMpnj27JVMpigoKGYw7P5ulGHDZNrR6bao0xqCIB8BJZQRBNFqxsYGWIfwrwoKSoqKSt3deXl5hcePX3J3dxw2rM933+1LTxf89NNqOzurxkYll8uE0/NiYg4SiUQAQFhYUFhYEDxD60dqVFRUCYVysVjTATnXwIDCYtmz2fbe3i5jxvRns+kfNAM6J0dy9+7TMWP6GxhQjh+/2L9/j6iotSqVatiwPvAANCauTbi4cAEADIbdhQu7y8oq4T+bbduOLF/+uYeH07lz111cOG90yvs3BAL+7WqUgoISmSxfKs0XieT37j2VyfLr6xs1XTL+3rBrt88PQRAEQRAEaWOWlmYLF06G28OHhz59mlxX12BiYjR8+ILAQM+1a+cVF5cplc3vX30I61RsbCz8/Nw0e5qbVTJZvlyukMkUaWmC69cfyWT5SqWSToeNMmw1rZk/dBmcNt+1IQjSHlBCGUEQrVZVVYt1CKCysjo7W2Jtbc5g2P3xx80HD54tXDiFx3NYty7Kysps06YIHA7n7e3s6ekMAIiMnK+Zbjdjxii48aH9y+D0vJbpY6VSyeEw2Gx7Ho81YEBPFsveyOiDL9qKi8vOnv3LwYE2bFifR48SDA0NjI0NyWTSTz+thgfg8fgPPSfSerD6o2/fwL59A2FPjMbGpt9/j3F352VkCG/fjhs8uBeHw2jFmf7x933CP231qqtrpdJ82CsjJuahVJovkyk0mWUmk8Zk2jKZNEND/Xb4FBEEQRAEQZC2ZGNjMXx4KNw+f36nQCADABQXl3/77Z4RI0InTx4WG/uCQtHr1s21NVfyBAKexbJ/YyFjdXWtTJYvlxfIZIoHD17IZPm5uQVGRoaa5DKbbW9ra/X+eR7acNeGIEhHQgllBEEQAABQq9U4HC49XZCTIx00qFdNTd26dVEMhu3KlV/cuPHoyZOkGTNGMRh2XC7d0ZEJL8J+/vk7+Fo7OytNrbEmm9x6KpVKLM4VCuWw/XHL6XksFj0w0IvNtrewoH7EJ6VUNhOJhF27fsvLK9y6dZlCUWxtbQEbMkyfPuojToi0FSKRAACYMmUY/JBOt7W2tsjOlnI4jIMHfy8tLZ8/f9LHTes2MjJ4e8EjbMcslea/fJkaHX1bKs2zt7cxMjJgMmE7Zjsm09baulW9VhAEQRAEQVpSqdRYh9BV6OmR4TUen88+c2Z7XV09HO598eIdc3Mqh0P//vuf/f3d+/fv8UGnNTIycHHhwnV1GkVFpbAps1yukMvzX7xIy80tZDLtYKMMe/vXtcwtR3wjCNKloIQygiBdS2NjE5lMKiwsiYtLsrW1Cgjw+OOPm2fOXF20aGrv3n7Xrz9qblYNHtxLT480Z044TByPGzdw3LiB8OXe3i6fHoNcrtC0PxaJcqXSfA8PnpWVOZtNb830vPcrL6+iUo1v3Hh8/Hh0ZOQCJyeWoyNz7NgBAIA3evIi2sPIyCA8/PW/sQkTBsXHv6qvbwDAeNSohSEh/kuWTC8vrzI2NiQQPrKE3MGB9kbX5uLicqk0TyrNl0jyHj16KZUqqqtrNO2YORw6jWbt4ED7iAckCIIgCIJ0ERkZwh9P/4UbhGY4Y0NfnwIACA0NDA0NhHsCAz0ViiIAwP37z86cuTZ9+qigIK+iolIrK/MPPbmVlbmVlXm3bq6aPWq1Gs76k8nyMzIEN28+kskUjY1NsIq5oKDkzp14mGVGU6MRpCtACWUEQTqn+vrGxsZGExOj+PjkpKTMMWP6mZmZhodH2NlZ7d27TizOy8mRMpk0AEDPnj4hIX5wlFlExDT4ciKR6OHh1CaRFBeXCYVyoVBWVlaZkJAOp+exWPZsNj0k5MOm5/0buVxBp9tKpfnLlm0dOrTPjBmjaDSrLVuWwYVpQ4eGfNxpjQn4pmZUcvIB6ppVVuRP/cFKpZoMHNgTbp84sS0nRwoAEItzly3bumTJjGHDQjIzRTY2Fpr5LR/H0pJqaUlteZNQW1snlSpksnyJJO/Fi7RDh86JxXmwCMXBwZ7JtIW55o/otYIgCIIgSGcSE/NQKlXMmRNeXl41eXS/PwHqmfZhlCq1MbFdUjGaSS0hIf6aQd/Xrj04e/avs2d34HC4xMQMT0+n98/3+zc4HO7vGgUfzc7q6lq5XJGYmPHkSWJs7AtY0dyyXQb8/f3tMhAE0UUooYwgiG5rbGwqKiq1t7cpL688e/YvU1PjiROHXLv2ICrqt4iIaYMG9SooKDYy0tfXpxAI+MOHN8BlWQEBHgEBHvAMn1IO/Lbq6lqxOFcolAmFrwuQ9fRIbDadw6E7O7PDwoI+dHrev8nOlgAAeDyHOXO+IRAI+/dHmpgY7d//DcyMt0klMtOAUtVc9unn6TqKGpscDChteEIjIwNvbz4AwNubf/v20aKiUgBAerpg9eodFy7szsmRZmQIe/XyMzX9mLuCNxgY6PP5bD6f3XKnTJYvlSokktzExMzLl+9JpXlkMpnJtHNyYtnYWDCZNAcHu4+oeUEQBEEQROfcuRPv6so1NzdNTMwcM6YfACAoyKtZrT6VkKVUq4loVVPrNKpUzSq1tV67j7/m8zlwY/r0UeHhA0kkUmNjU0zMw5s3H3/33cKkpMzU1Jx+/YI+semZkZEBn88hEgm3bj2JjFwAd7Zsl/HyZapMpsjLKwoO7qZSqWB+mcm0o9Nt4XwRBEF0FEooIwiiG9RqdW5ugUqlZjLtHj9OuH07Ljx8EJ/Pnjp1la2tZVTUVw0NTaamxjCR2rdv0JAhrxffjRjRV3OSNm/ypVar/04cy4VCOQ4HkpOz2Gx7NpvO4TBCQwPZbPuPKwF4J4FA9uzZq4kThyQnZ+7efXLBgkkAgO3bVxobGwIAPq7l7ntwjfTNSIT0yloXE1SU2ioPi8sXsGmtOPAjwdTtqFH9Ro3qBwAwMTHMzBThcLihQ0MOHz6vr08JDx9IJrfl/QmDYcdg2PXs+U8dSklJuVSan5tbKBTKHj9OkEjya2vrYGYZ1jI7ONjR6bZtGAOCIAiCIBjKzBQ5O7Ojok6Ul1cGBHiQyaTVq2dp/pSAww2yMb9fXBFm9TEDP7qg+8UVg2zNCR2bfzcw0AcAkEjEjRv/B/fY2Vk+fZqcni60trbYtOkggUBYsmQaiURskxndb7fLUKlUcnmBVJovlyvS0gTXrz+SyfKbmpRMph2DYctg2P2dZbZp20tZBEHaD0ooIwiidYqKSiWSfHd3XnNzs0JRtH79vsjIBc+fp0RFnRg7dgCTaUcmk3r08IErp37/fQd8lY2NxcSJQ+A2hUJup9jy8gph+2OhUCYSycXiPA6HDtPHI0f2ZbPpmsVlbUUuV9y5E+/n5+7qyj19+io8v7s779Ch9fAAmE1uJ1/zHVa8EqiA2s2kHd+lE1Cq1GfkhVOZNi4d+IWytrZYtuxzuN2zp09s7MuqqhoLC+rixd97ePBmzw5XqVRtclfQkoUF1cKC6uPzTzPxmpo6qTRPIsmXSvOvXbsvlebn5RXCzLKjo4OtrSVcHdkmhfkIgiAIgnSMhobGxsam8eOXTJ8+ytmZvXjx1H87cpy91SFx/o2Csv42ZqhK+T3UANwoKHUwoIymteX6yI9jbW0xe3Y43P7ii7GJiekqlaqurmHkyIXDhoVEREwTieQUip5m8PgnwuPxTKYdk2nXcmd1da1Umi+TKWSy/Hv34mFRs5+fe1OTksGwhTUKDAYaHI0gWgrd3SEIghm1Wp2cnFVf3xAY6PniReqvv0aPGdO/T5+AHTt+bWpq+vbbhTgcztDQAKaJ/f09fvttK3yhn597x0RYUVEtEslgB2ShUK6nRy4oKIEFyMHBvtOmjfz09sfvVFJSHh1929GRGRLiHxv7Uqlstre3AQCsXTsPHtDmWcL3+MGDuzNH/mO2zNXYoEapao+3UCqbm5uVenpaN75DKs2jUPTIZBKZTCKRiCTSOyomDIl4eV2DOZk0hWkTZI7Zwr2Ws7nnzZuQlpYDACgqKlu+fNtnn40YMKBnSUm5hUW7lA4ZGuq/MRm8uVklkeRKpfkKRdHjxwmnT1+VSPKsrMwdHGAhM43JtGOx7FE7ZgRBEATRQjExsQcOnPnll00GBpQLF3a3ZsDabJbd77lFR8T5FDxe79MuUxsaGysrqju4oZYaqOtq62EZbzupUzU3qdQDrM3C6dbt9y4fx9bWctCgXnA7JuagVJoPAJBI8g8ePLt69WwvL+czZ661x3BvIyMDV1euqyu35c6CghKpNE8mU4hEuQ8evJDJFFVVNQyGraaWmcm0pdPt2q9+CEGQVkIJZQRBOkJ5edWLF6m2tpZubo7HjkXfvv1ky5alFhbUX375IyDAIzDQ08rKfNascU5OLADA5s1L4KsqKqqMjQ3hzg6gVDbDzhUikUwgkItEsqYmJZvN4HDofD53yJAQDofeTjOLYSXp+fPXX7xI3bx5qUymIJNJLi4cAICm7BpDEY70BRyaqKauvh3yyfX1Dd99t3fz5qVtf+pPU1NTu/HASVlpJZFIMDTUJxKJOBygUk1MTY3mzJmgOQwH1NYUsq2eFl3Uai7NbWwsNm9eUl5eBQA4f/56TEzs4cMbTE2NsrMlLfO/bY5AwHM4DA6H0XJnXl6hVJovFuempGRfu/ZALM7V0yMzmXYODjQul2lvb+3gQGunlDeCIAiCIO9XX9944sQlPB4/c+YYc3PTY8c2f2gvtfH2VqNplnm1DeXK5o8IoKioJCUlJyEhvbi4rKKiqvuUYT16dPuI83y0x49f5qZkhocPaqfzm5Lw9voUktZ3miYSifASrk8f/z59/JVKJSxXv3z5rrs77+XLtJs3Hw8fHvpGFrgN2dhY2NhY+Pt7aPbU1dXLZAqpNF8my4+NfQH7ZlCpJt26uVAoFFjIzGSiqR4I0tFQQhlBkDajVDYTiYTExPTsbOnw4aFKpXLhwo1UqvHOnWsyMoSPHyfAdsbBwd369QuCJbe7dn0FX/v2GqgOUFBQIhTKhEKZQCCrqqpJSEiH0/PYbPr48QPZbDoccNdOGhubyGRSYmL6wYPnxo0bEBYWpKdHnj59FJzABkexaQ8yHu/cPr010vMUv29bpo3FqqaGU4N99uw5Xd/YVP/3vny1Wq1W710xE9vQWo9Ot4UdjefOnTBhwmA9PbJaDaKiTpJIhKiorwUCWUVFlbc3vwNq3mk0axrNOijIS7MHtmOWSvMKCorv3o2XSPIaG5tYLHsHB9iLmebgQGvzHjIIgiAIgmiUl1ddvHi7d28/PT2ykZHh0KG94fDqjzsbCYdzMKQ4fOCrjh79MzY2ITdXUVRUhsPhAAD29jZjfV2tTTu035rX4F5JSZkO6mYqFU2K+weRSIRj/eCHfD67oKCkqKgUAO7evaeysyUREdNYLPvq6tr2i0Ffn+LkxHqjxqigoCQ3t0AszoWFzFJpfk1NLZz15+BAg4XMTCaNSCS0X2AI0sWhhDKCIB8MJo7LyiofPHhuZWXWo4fP2bN/nTx5ZeXKL4KDu8XHJyuVzQQCnkgkf/vtlzSaNZwBrUkkOToyMQm7vr5BIHidPoYdkA0NDdhsOpfLCA72dXRkdkBGu66uXl+fUlxctnbtLnt7m3Xr5uvpkRcvngLLRYcPD23vALRNWVmllZW5NmaTAQAATJ48/MqVBxkZQk2+lUAgPH16Fuu4PpLmBmn//kilshkAgMPhTpy4nJSU8fnnY2JjX+Bw+MBAzw678n67HXNVVY1YnCuR5Eskuc+evZJI8qqqamA7ZgcHGptNd3CgtVUvPwRBEATpsoqLy7KyxD16+Fy/HqtSqWg0az098sSJgzs+kqlTV6anC2EeGf6uVqv5fDYmbXM9PHilpRUd/746xMBAf/Dg180x5s6dkJKSDTPOR49eSE8X1tTUqVQqsTjX3Z2Ha+dabFjI3HLuX21tnVSqkMsVEknevXvxUqlCKs2ztraAUz1sbCxgornNx5gjSJeFEsoIgryPUqkkEom3b8fl5hZMmzaypKR83rzv7Owso6K+lssVQqGMwbAFAISGBgwe3MvExAheW2he3k4thlspP79IIJAKBLLKypr4+KSSknK4DJ/LZfTtG8jhMDomj1lXV19X12Bubrp8+Ta5vODMme1EIiEiYjqfz4atbzsgBu0kFMo3bTpw5MhGrAN5n5kzx2zadLCqqgZ+aGeH/RCVNgGzxhwOfceOVXCPvj7lzz9vGhrqe3vzo6JOBAR4tKwm7hjGxoYeHk4eHk6aPbW1dRJJnkSSL5Hk3bjxKD1dWFpawWLRWCx7FosON1CKGUEQBEFao7q6Vl+fkptbsGzZVjiQbcIEDJLILZ04sW3q1JUZGSLNHjKZ3LdvICbB4PH4u3efFhaWfPnlZEwC0C1EIkGzpHLRoinr1+8nk0k1NbU//3yORrNas2bOw4cvGhoae/Xy1euQ1nAGBvp8PhveYWnAlmv5+UWZmaIbNx5LpXkAACbz9VQPBsOWxbK3te0kl/cI0sFQQhlBEAD7plEo5LKyylu3ntjZWQUHdzty5I+zZ//as2etkxPr1assMzMTmO7Zt28dbFDVMu+jDbN3GxoaYelxTo5UKJQLBFITEyMul8nl0n19XUeP7ken23RYMEqlMidHxuezr19/tHPn8e3bV5qbm86bNxFWZ1OpJmgxHcz4790biXUU/6Ffv+7R0beePEnC4XDm5qYeHk6bN/+8Zs0crONqe76+br6+bnCbw2G8epUVFOR19258fHzyxIlDsHo4ZGDw5sS/hoZGsThXLM4Ti+WXLt0Ri/PKyir/TjHbw0JmdGOAIAiCIG84derq6dNXLlzYY2lJPXduJ9bh/OPEiW0TJy7LyZHCD2k0Kz+/j+y58enCwwfevv2k/UYZd2JwhDWVarJ799dwj4mJ0eXLd2k0a1dX7vff/8zns8eM6d/BUcGWay33lJdXyWT5EkmeRJL34kWqWJxbWloBp3r8/TuNybRDvTIQ5D+hhDKCdDk1NXWGhvoJCekvXqSOGBFqbW0xduz/TEwMjx79vri4rLCwBD7XHTUq7LPPRpDJJABARMQ0+FoymdTB4w7s7f91DnJhYYlAIBcIpAKBNCdHWlhYyuUyuFwGl8sMCwvicBiGhu04qfmdEhMzHB2ZeDxuxIiFEyYM4vPZvr6uf/11EP4pVr0+tFNhYYm7O08nBjTPnj1eKMwtLi67ceMwAOD69UdDhsxdv36Rn5871qG1l2HDQuBGYKBnQ0NTWVkli2W/bl0UiURctWoWkUgkENq95/K/0dMjOzuznZ3/KT+pr28Qi/PE4lyxWJ6UlAFTzBwOncmkwUQzm01HvZgRBEGQLqimpm7nzuMGBpQlS6b7+rpOnjwUAEAkdvQV8vuVlJQ3NDSy2XSRSK5Wq3k8poWFKYbxhIYG1tU1YBiALsLjcW+vGPPycvbycobbYWFBmZkiAMCrV1n79p0eO3ZAv37dq6trO77xHZVqTKUat1wP19jYJJXmSST5Umne/fvPqqpqExPT7eysmEzNVA87JtPOwEC7/uMgCOZQQhlBOq36+kY9PVJDQ9PFi7eJROLYsf3v3Xu2ceOBRYumjBzZVyrNo1D04M/FX37ZZGpqBADg8Rx4vNeDNLTksXxubqFmWyCQwdwx/EWh6Lm5OaTbFx0AACAASURBVNrbW4eE+M+cOYbB6OiZflBKSjaBgHdx4a5evaO+vmHTpgg9Pb1bt47AP23XsX66Ky4u6eLFO5s3L8E6kFbx8nL283Nbv34R/HDgwJ69enX75ps9Dx++WLJkOtbRtS8DA/1Bg4Lh9sqVXzx7lqJSqerq6qdOXTl0aMjs2eFVVTXG7TOtsfUoFL03VjjW1zdIJHkikVwszo2OzhCLc6uqamAJM4tlz2LR2nvkJoIgCIJgSCiUX7t2f+HCKRUVVQEBHv379wAAtHwWqz3KyirnzPnmwoXdAIAxYxbn5xeFhXXHNiQ8Hh8dfbu2tg52BUFaQ6VS5+cXveeAwEDPwEBPuMj1f//7rLq6DgBw9er98+evr1s339PTWSSSs9n0Dgz5H2QyydHRwdHx/42TlMsVcKrHy5ep0dG3JJI8Y2ND2CsDDo52cLCHd9AI0mXh1Gr123ujo0NGjTqGRTwIgnwM+HQ3I0MUF5fYo4ePkxMrImJLRobg8uX9dXX1J09ecXfn9erlW11dS6GQ4eQE7VddXRsXl/jTT8d79fKDeWQOh87lMh0dX/+CLZsxkZMjzcsr7N3b79KlOzduPJ43b4K7Ow+rYHRRTo60E9RrX7x45/Tpq5GRC1xdu1wj7PLyKrFY7u3tEheXtHHjgRUrZoaE+OfmFtjbd1xjmQ9SU1MnFuf+/StPLJbX1TWw2XQWy97FhWNnZ8Vm26NGNIjuunlzTY8ePxoaYnMrjnyQvLwHMtnFwMCFWAeCdEJ5eYWFhSXe3i6bN//s6sodOTIM64j+Q2Nj04IF6w8f3qDZM23a6uPHt2Aa1GtxcUnu7jytHRytbXJyJFu2HG75V9lKxcVlzc0qGxuLjRsPPHjw/MaNw0VFpVlZEl9fVwpFr32C/UgFBSWwkFkiyZNIciWSPDwez2TaOTmx7OysYJYZ1SsgnVJ09IxRo+6/vV838koIgkBqtbq5WYXH4y5fvltbWz9p0tDExIxVq7aHhw+cNWtcbq4CAJypqTEA4LvvFsJHpiSS0fz5E+HLtfySqKioNCdHmp0tgb9qamrt7W2ampTe3s5jx/Z3dGS297Dg9ysvr4yNTQgK8jQ01N+y5dDw4aEAgKFD+4wY0RfDqHRRVpZYG5puf7qRI/v27u379de7fHxculoNC5Vq7O3tAgAICvI6c2Z7eXkVAODs2b/u3o0/d25nc7MqP79Iq54ZGBrqu7k5urk5avZUV9fCEuaiorKbNx+Lxbk4HI7NprPZdDbbnsOhs9l0bbuTQRAEQZC3NTY2qVSqsrLK5ct/WL78cwCATgx7KCoqXbBgw7lzP7XcqSXZ5L+7fjViHUXnp8nArl07T6VSAQAIBMK1a/cTEtIWLpxy5058Q0NjaGiANlyS2dhY2NhY+Pv/0+C7vLxSIsmXyxUCgezRowSJJLehoQkuhvu7ipnWOe56EOSdUIUygmippiYliUSUyfIfP050c3N0d+d9992+e/ee/vHHLhMTwz17Tjk7swcP7lVbW0cgEDpmcm6bk8sVWVkSmETOyZEQiURHRyZsu8HjOdjaWubmFqxcuf3kyW1YRdjcrHr48AUOB0JC/PfuPVVXVz9//qSOb83cmVy5ci8zU7Rs2edYB9KWTp26mpYmmD9/gtbW53YYuAyitrZ+6dKtLJb92rXz0tIEenpkLpeBdWj/rbS0QiSSi0RykShXKJSLxXIzM1M2275FllkHPguka0IVyjoEVSgjbevixdt7957+888oIpGoE6MpoOLissWLN5069SPWgbzPoUPnDAz0p0wZhnUgOuCjK5TfLy1NcPny3aFDQ9zdeVu3Hubz2Vped19VVSMW50okeWJxHpz7V1NTC6dGw19str2NDZodjegYVKGMINqruVlFIODv339WUFAyfvwgsTh3+fJtgYFeK1bMFAhkpaUVsNb4yy8nffPNAvgSzZQ83RoOkJMjFQikmZlimES2tKTyeA6Ojszx4wfyeA7as948JSU7ISH9s89GvHqVde9efHj4IADAl19OxjoundfQ0IjH4ztZNhkAMHnyUIkkb+XKH0eP7j9u3ACsw8ESXAZhYmJ0+PCGurp6eGG9ffvRL74Y26OHz61bT/h8Dp2upWl3c3NTc3NTX183zZ68vEKRKFckkj9+nHDy5BWRSM7jOTCZdhwOncNhcLmMDh5SiiAIgiAAAKVSefjw+cZG5eLFU52dOXBosA4pLi6bN+/b8+d3YR3If5g9O3zTpoPl5VVUqjHWsXRRrq5cTWe5Pn0C0tMFAICkpMyjR/8MDx/Us6cPrMHCOsx/GBsbeng4tZz4B7uuweRyQkI6DodLSsqAI6NZLHs2257FskeNMhAdpUX/9xCkiygvr3r27JWtraWHh9OhQ+cuXbq7a9caDofx5EkSk2kHALCyMj9w4Fv4c6VPn4A+fQLgC3XuJ01zsyo7W5yVJc7OlmZni7OyJBwO3cWFw2bTQ0MDHB2Z+voUrGP8R0lJ+c2bj729Xfh89p9/3nRz4wEAvL353t58rEPrPJqbmwcODMY6inbh4EA7efKHPXtOLl++bfPmpVp1aYsV+B8czmBpalICAGSy/NOnrx45sjE/vygxMaN7d28tv0Oj0axpNOuePX00e4RCmVAoFwrl0dG3BQJZQ0Mjh8P4O79M53AYOrpeBEEQBNF+TU3Ka9fu9+kT2NTURKWajBgRCgBoOZNWJ5SXV86eHQmn8Gm/r7+ei3UIyGuasX6enk4zZoyGDUkOHTqXmJgRGbmATrfRztT/213XamvrxOLXs6OfPk0Wi3Pr6ho4HAacGg2zzNpTaIUg74HueBGkHQmF8qwscWhoQFOTcunSraamxj/8sDwjQ/j06St4CThkSO+JE4cYGxsCAFavngVfZWior6NNFerrG7OzxdnZkqwsSXa2WCzOdXJi8XgOTk4OQ4eGODk54PF4rGP8f5qbVXfvxtfW1o0Y0ffu3afl5ZV2dlYAgMjIBViH1gn9+edNubxg8eKpWAfSjhYunPL8eer8+d/NmjUuKMgL63C0CMywf/75mM8/HwO/yyUnZ0okufPmTbx375larerVy1cn5oVyOIyWjS8qK6thijkjQ3D16n2hUGZlZQaPcXZmM5m2qAsKgiAI8unk8gI63Wbz5p9NTIyGDtUnEo0mThyCdVAfo6qq5rvv9ulKNhk6duyCq6tjQIBHK45FOgIOh9NU/CxYMCkrS6ynRwIALFu2VU+PvG9fZGlpRX19A41mjXWk72ZgoN+y8loz2AN2XXvw4LlYnAsAYLHsWazXUz3YbLqO5geQzk0Hbt4QRPs1N6vS0wVKZbO3N//x44Tjxy+Ghw8MC+t+4cKtpqamvn0DSSTi8uWfM5k0OMNKk2nS9VxDU5MyM1OUmSnOzBTJ5QqhUMbjsXg8B29v5/DwgVrbNTUrS3z//rOZM8fK5Yq4uKSRI/sCALp4p4L21tDQSCDgO3c2GfLzczt8eMOaNTueP09duBC1SXk3ExOjVateP0KztKT+8cdNc3Oql5fzsWMXfHxcvbycsQ6wtUxMjLy9XeB8QkguLxAKZUKh7OnT5D17TpSWVjo6MrhcJodD53KZXC5DG6bKIAiCILpCoSiOiNj82Wcj6HQbXa94UKlUI0cuvHPnKNaBfJipU4cvW7YNJZS1lpMTC24cObIxN7cAPrdYs+ansLCgL74Ym5ycaWpq7OBAwzrM9zEyMnijUUZ5eaVIlCsW5+bkSG/ceCQUyo2NDdlsOodD53DoLBadw6FjO68eQVBCGUE+WH19o0gks7Iyt7Q0O3To3PPnqfv2RebnFx08eDYkJMDbm0+jWUdETHd0ZAAAli2boXmh5kedTlOpVJmZoqwscUaGKCtLLJXm8/lsJyeWn5/b1KnDYMZcO9XU1N2+/cTOzsrf3+PatQfW1hYEAt7BgbZ27TysQ+sS9PTIWj5Do21t3rz0/PnrM2eu/eGH5RYWVKzD0Wru7jx3dx7cNjMzuXz5rpeXc2pqTmJixoABPXSuSTGdbkOn2/Tu7Qc/rKurFwhkAoEsJ0d6/fojoVBmYUGFmWX4u60tGsyCIAiCvOn589SYmIdr185TqVQ7d67pHD8shg9fcPnyPqyj+GBEInHXrq+wjgJpFVit5eBAO3Xqh+rqWvhIZvv2Y199NcfZmX3lyr1u3Vy1tnK5JSrVxMfHxMfnn5IFhaJYLM4VCuVxcUmnTl0VCuUt65c5HDpcaIsgHQkllBHkX9XXN5aVVdjZWeXkSM6fv+HlxR88uNehQ7+npQlXrJhpaWnm5cXv3duPQMDT6Ta7d6+Fr2Kx7LEOvI1lZ0syMoQwiSwQyJyd2c7OLE9P5/DwQRyOtg+UT0hIFwik48YNjItLzMmRBgR4thxpiHSMW7eeJCamL18+E+tAOtS4cQO7dXPduvXw8OGhvXr5Yh2Obhg5Mgw+eKDTbR8/Tnj8OHHkyL5nzlwzNjYcNKgXgaBdPXNaQ1+f0jJjDkuYBQKpQCC7du2+QCCrrq7lcOh8PpfNtnd0ZPJ4DpjGiyAIgmCpoKCkqqra0dEhNvbFoEG9YDd/rINqG0uXbjl27HvdXbZ/7Fj08OF9UJWADoGTogcM6DlgQE+lshkAIBLJHz1K2Lx5yatXWdnZkrCw7qamRliH2Vq2tpa2tpYtW+qJRHKhUC4Sya9duy8UysvLK7t39zY0NOByX4/3gF8BBGk/KKGMIK81NjaRyaQrV+6LxfKFC6fk5hZ88cXaQYN6RURMa25WeXu7wCeEixb9s2a/sy59kssVGRmijAwh/J3NpvP5HGdn9ogRfXUi2VFdXXv3bry3N5/BsDt16uqgQT0BAGFh3cPCumMdWhd1//6zDRsWYx0FBjgcxrZty9eti0pNzZ43byLW4egSU1Oj2bPD4ba7O++vvx4GBXkRCITffrvYv39PnRtA1BIsYQ4J8YcfVlfXwhYZycmZf/55MydHyuUyHB0deDwHHo/J5TJ1994bQRAE+SBxcUk7dx7fvHlJ56t+mD9//eLFU3Ru1VFLnp5OmzYd3LFjFdaBIB+DSCS0vJG3tbW8e/fpkyeJgwYFHzt2gUo1GTasDzxGh8DaZM2H9fUNYnFedrZYIJDduRMnEskNDQ3g7Ggul8HhMLS87weii1BCGemiKiurHz58YWFBDQryOnz4/Jkz1/buXefszC4sLIHfam1sLGNiDsGDnZ3Zzs46nL/4T8XFZTB3nJkpysgQUqkmzs5sPp/Tp08An8/RlR+uKSnZ5eVVwcHdzp+/UVJS3qdPAADghx+WYx0XArpmNlljw4bFZ85cW7Roo2YdA/JBNBW+zc0qGs0mLi6Jz2ffu/esoKB4yJDecKip7jIyMvD0dPb0/KdtdE6OJCdHmp0tuX//qUAgo1KNeTwHHs+By2U6OjItLc0wjRdBEARpY+fP33jyJHH79pVsNv3Mme1Yh9P2IiN3T5o0xMWF24pjtVe3bq5cLrOhoVFPj4x1LMinsrIy10x28fZ2uXfvaWVltYEB5ccfj4aGBvbs6YN1gB+DQtHj89ktqy4UimI4O/rBgxfHjkXn5RXCzDKHw3B0ZHA4dBMTY0xDRnQeSigjnV9xcVlGhtDNjWdmZrJ48fdKpXLfvkiJJC8tTTBgQE8AwKhRYTNmjIZp05kzx8BX6UoW9eM0NDSmpwvS0gSFhaWPHiUAoHZ2Zru4cCZMGMznc7Rqdcz7H6Uqlcq4uKTgYN+MDNGBA2enTh0OAJgxY1QHBoi8j1xe8NdfDzSlpl3WxIlD3N15w4bN37lzjaMjE+twdBWBgB87tj/cdnHhpKcLEhLSe/f2O3nyMoNhp2lbrOscHR0cHR3gSmcAQG5uQXa2NCdHcvHi7exsiaWluZmZMY/HcnJiOTk5WFtbYB0vgiAI8jFSUrIJBLyLC7e+vgGO9LCx6YTf0nfvPgGbBGIdSBvQ19fLzpa4uTliHYg2wuPxOjpt3tub7+3NBwCo1eqgIC+hUNazp09s7IvHjxMnThzCZNphHeDHg10yevR4nR9valIKBDK4MC429qVarVIoimG9Auy6hrowIx8KJZSRTqW6ulYmy3dx4eblFe7de8rVlTtlyvALF25JJPl8PgcAMG/eBPhToeUc1S5S8CWR5MEkclqaQCZTuLhwXV25gYGeU6YM0+YFaBJJ3ts7CwtLGhoaGQy7uXO/5XKZwcG+PJ7Dnj2o/FPr7Np1fP581OoBwDLbK1f2z5q1btKkoWFhQViHo/NsbCw0/7QcHGg3bjwOCvIqKSm/f/9Zv37dO9N3dXt7G3t7mz59XrfIKCuryMwUZ2WJY2IeRkWdaGpqcnZm83gOzs4sHo/VKZMRCIIgnc/9+8/On7++du18AACsh+iUoqNvk8kkzcNgXUcmk27efJydLRk1qgsNmm4llUqVm1uAdRSfBIfD9ev3ukGin597ZWVNXl4hk2m3fftRCkVv7twJul5wRiIR3yhhLigoEQikOTnSW7ee7N9/pqqqhstlODoyHR0dYJYZh8NhGjKi7VBCGdFh1dW1RkYGCQnp9+497d+/h7s7b82aHWQyefv2lQQCYciQEDc3LgCgZXWkq6tuL7b6IJWV1enpQphETk8XmJtTXVy4rq6cESP66miNpEyWz2DY3b0bv3//2fXrFwIAjhzZCP9IF6d1dQU//LAC6xC0y+HDGzZs2C+R5GkWQyCfLjjYNzjYFwBgZmZSUVEVHX171qxxt249MTIyaDm6pHMwMzMNCvLSfF5lZZWZmaLsbElMzKPdu082NDQ6ObGcnFh8PhvllxEEQbTNxYu3L1++d/jwBi8vvqaZfmeVmJh+796znTtXYx1IW1q0aOrdu/FYR4G0OwpFb8iQ3nB7ypThDx++aGpqKigo/uGHX8aNGwAvOzsBGxsLGxsLTQlzdXWtQCDLyZFqBntwOHRHR6aLC5fFsufxHNBgD+QNKKGM6JKHD19UVFQNG9bnxYvUtWt3TZkyfOrU4RUVVRwOA9Yda1qUwm+OWMeLAak0PyUlOzU1JyUlW6VS02hWLi7cceMGurpytaqRxQepqKg2NTX6/POvWSz7b75Z4OPj8vvvO7AOCvlv585dHz06jEhEP2j+n3Xr5h8/fvHbb/d8++1CrGPpbGD9CNy2sKCePfsXjWZtbm56//6zkBB/3f0e+B5mZiYt88vl5VVZWeLsbHFs7Ms9e041NDTCkaqwIIVKNcE6XgRBkK5IIslTKIoDAz1VKvXOnWsAAFRqJ29dWllZHRm559KlvVgH0sYIBLymiBXpImxtLcPDBwIA7O0pM2aMrqioAgBERZ0oK6tYunSGrg/zaMnIyMDLy9nL65/BHjC/nJ9feOzY0+xsibm5qZMTi8dzgOULnfLSGvkg6D4f0VIikVytBhwO/fffY/766+F33y20t7e5ceORt7cLnJJ34cJuCkUPAABnr3VZzc2q1NTslJSclJTslJRsKtXY3Z3n5uY4fvygTjDI9fnzFIFAKpXmeXg4bdmyFD4kQDkRnXDkyB9kMgllk99p2rSR9+49i4zcvX79Iqxj6bR8fFx8fFwAAI2NTWlpgsTEjK+/npuRIdLTI7WciN3JUKnGAQEeAQEe8ENYv5yRIbx48fbWrSI9PTKfz4YzV52dWQYGqMwEQRCkfdXXNxYXl0VG7o6MXAAAGD26H9YRdZC1a6OOHt2IdRTtZeDA2devH8I6CgQDsNsyAGDOnPFxcYkNDY3Gxobz56/393fvlKsPuVwGl8vQfCiXKzIzxTk5khMnLmdliU1MjJycHJyc2Dwe08mJbWpqhGmwCAbQrT6iFdRqdWNj06lTV5qbVbNmjbt//9mxY9FTpgzjcOhubo7+/u6wAHnDhsXw+C71NEylUlVWUlruaWxsqqysKS+vqKiorqqqMTU14vP9AgL6mJoak0j//KcuL2/32ExMGvD4Nu6sVF1du3v3CSKRuGLFTAsLKotFh92uu2bJuY5qblb169e9EzzSaD99+vizWHbjxv3v/PldWMfSyZHJpBUrZsLtpqamrVsPT5o0ZMCAnkKhjMNh/Nerddsb9csKRXFGhigzU3T8+MXMTJGVlbm7O4/Hc3B15To5sbAOFkEQpFMRi3O3bDm0ceP/qFTjX3/djHU4HWrDhv0jR/a1sNCNeQaNjbjaWtIHvSQqakdGRqWtrWW7BQUAAHi82sSkqV3fAvloFApZU9b29ddzXrxIAwC8epV16tSV8eMHw5qGzodOt6XTbTXDYPLyCrOyJNnZ4t9/v56VJdLXp3Tv7m1paQbLF7pUxqbLQgllBAOVldUJCelMph2bTY+M3P3w4YsbNw43NSnxeDwsQA4J8de0FUODdAHANTbiVSqVUqlsampualKq1SoyWd/a2pBGI7YcDqBWg8bGjo6trU706lXWpUt3V6+eXVVV062bW2hoAACAzaaj5si6qKqq2sKCinUU2o7Foh848G3//l9ER+9BLck6hoeH09Gjm6qrawEAly7dvXfv6e+//6RUKrtIrS4c9q2Z7yeR5GVliVNTc65deyAUyt3cuG5uPFdXjquro7m5KdbBIgiC6Kr4+OTAQE+RSB4RMb0zDYltpStX7hkYUHRo/rBKhWts/LDbDTMzKwDa/baLQAAAoISyDoBpVnidOXJkWElJOQDg2LELtbX106eP6sQX+TSaNY1mrbmwVCiKRaLclJSsU6euZmaKzMxM+HwOXBXn4sJFN/WdEkooI+2uuLiMSjVpbm7+6afj+vp6//vfZ3fvPn35Mm369JEAgHnzJsBF3yQScfr0UVgHq13k8oLk5My0NNGIEZPUakAiEUkkIoVCJhB0e8JsS8+fpzQ1Kbt3946LSwoI8CAQ8HZ2VnZ2VljHhXy84uKyefO+RYW3rWFpaXbt2sFRoxaePPlDp2+nqD1gxURExLQZM0bj8fjy8qpx4yIiIqYNGNAT69A6lIMDzcGB1r9/DwCAUqlMTRWkpmbHxDzaseNXMpnk7s5zdeW6unL5fA7WkSIIguiM8eOXjh4dFhjoGRoaiHUsGJDLC86cuXbixDasA2l3TU1NAOBarg1FEACAZlnYiBF9b916UlpaTiIRN2/+eeTIME27jM4KFi507/76KyCT5WdkiDIyhHfvxqenCzgcBp/P4fPZfD6nZRsNRKeh74BI2ysoKHn+PMXNzZHFsl+4cGNBQfHx41sAAN7ezrB3wciRfUeO7AsPptGssY5Xu+TlFb58mSaR5N29G0+h6Hl6Ont5uZqYGOHxneqZXlqagMNhZGeLz579Cw7Rmj07HOugkLaRmpqzfv1irKPQGSQS8erVA4sXf79x42ITE9R6rEPBJD6NZn3+/E6pNB8A8M03e/T0yF99NQfr0DoakUhsOYZFoShOSclOSxPcvPk4K0scHOxLp9t4eDi5u/PQAkYEQZC3Xbx4x9ravHt37wMHvunKizx27frtp59WYx1FRyASieXllWZmXffvGnk/c3PT8eMHwe2QEP/MTJG3N//8+RtqtXr06H4tFxl3VgyGHYNhBwsXAADZ2ZKMDGFamuDPP29SKHokEtHNzdHNzdHV1RFV1egulFBGPlVxcZmlpVlBQcnBg2fZbPpnn424ezdeJlP4+bkDALZsWaq5+Rw0qBfWwWqp0tKKhIS0Fy/SXr5Mw+Pxvr6uAQEeEyYMhqvkVCp1cXEnySZXV9caGRns2XMqLS1n166vXFy4P/ywAuugkDam6VeDtF5U1Fdjxiw+fHhDV74LxZCBgT6sw127dt7du/EAgOTkzJiY2PDwgZ14gt97wBoTOMi+uVmVkSFMSsq8ePHOxo0HbGwsPDycPDx47u5OqLU9giBIY2NTQkK6UCgbMqQ3zCJhHRFm9u8/4+fnZmVljnUgHQGHw5maGqvVahyujcfJIJ2Pptty9+5e0dG3y8oqqFSTX3+NHjCgJ5wU1RXweA48nsPw4aEAgKYmZWpqTlqa4MqV+9u2HTEyMoCZZXd3Hipe1i0ooYx8sJKS8oSEdBbL3tGR+eWXG2pr644e/b65uTk42Be2n584cYjmYFTK9G9qauoTEtJevkxLSEivqKjy9XX18XGdNm1ka8Y7bNmy7cGDhy33kEikixf/jIm5ERW1+/ffTxsZGW3dui00NDQg4J/U3vr1G1Uq1bffRgIAtm37USKR7t0bBQBITU09fPioSCQikUhubq6RkWvxePyWLdtkMjk8APr119+ioy9euHD+4z5fuVyxdeuRQYOChw4NmTRpCGqw21ldvHjH1NRIc9mEtN6ff0YNHjznwoXdFIoe1rF0XSQSETa+cHfnyeUFT54kstn0uLgkU1MjFxcu1tFhg0DAwxISAIYCAIRC+atXWXFxSYcOnVer1ZrkMroBQBCkqxEK5d9+u+fAgW98fFwCAz2xDgdjmZmixMSMgwe/xTqQNjB58mfl/3+4ebduPhs3rv/ssxk9e/aYN28OAGDevAUbN643NzfXJJTDwydOnDh+7NgxeXl5s2bNXbfu6+7dgwAAMTE3zp//o7Cw0MjIaMiQwVOnToZv0bt3L3gqaPnylUZGRvBODems7O1tvvxyMgBArVZTKHqnTl1ZvXp2UlImkUjoUoOjSCSitzdf0wNELi9ITc1JS8u5dOlOXl6hq6ujm5ujp6eTq6ujgQEF62CR90EJZaRVFIriM2eusVj2o0aFxcQ8FIly4UT4rVuXwZQx7MiOdZg6ICND+PTpq2fPXimVzRYW1G7dXIcP7/MRFXA2NtYzZ36u+fDtrspqNdi5M2rfvj1U6vsKJYqLiyMjv/PwcP/qq9XFxcVyeW7b9tZITc2Ji0v64oux5eVVc+aEw54nKJvcie3ffzom5hDWUeiqq1cP9Ov3xZ07R7EOBAF4PB7WmgEATE2NoqJOzp0b7u3tUlRU2kVqr/4Nh0PncOiwb1VBQcmrV1kpKdkJCRkvX6b5+Lh06+bi5cVnYbVm5gAAIABJREFUseyxDhNBEKQdyeUKOt02NTV748b/dZGxrv9p48YDmzcvwTqKNhMQ4B8W1lfzobn5mz/6Kyurtm/fuXr1KkNDfSLxX5MqcXFxUVG7x40b4+7uLpFIDAxQrRUCYIX71KnD4baeHnn37hOzZ4/z9nbJyhLDNEuXQqfb0Ok2Awf2BADU1zekpuakpubcvh0XGbnb3t7G09PJ09PZy4uPOmNoIZRQRt4hJ0dCpZpYWpotWrSpqqrm2LHvKyqqWCx72GN+ypThmiNRAXJrVFRUP32aHB+f/OzZKxrN2t/fY+7cCe7uvE85p4GBYa9ewe8/pry8fOfOXe9/0J2c/Kqurm7hwgWWlv9dGd16dXX1CkUxm00/efIKTMp87OeLY7NRYkKXoGzyp8Dj8Zcv7122bOv27auwjgX5h4sLd//+yLq6egDAqlU7vL35ixdPxToorWBjY2Fj0x12xqisrE5ISE9ISL9w4XZVVY2Pj4u3N9/Hx8Xe3gbrMBEEQdrS9u1H9fUpCxZMgmu3EQDAmTPXhg8PpdNtsQ6kzdja2v7nrVZSUtLNmzeHDx/27/lkEB//1M7ODpYBtVw52vngcDgGo/P8A+hIfD577951jY1NAIDDh89XV9fu2xcJG0ViHRoGKBQ9X183X183AMCqVbOysyXJyVl37sTt3Hnc2NjQ09PJy4vv7c3v4uUd2gMllBEAHwTFxr7E4XBhYUHHj1988OD5N98sAAAsWjQFPiJzdmY7O7OxDlPHpKcL4uKS4uOT8/OLAgI8g4K8Fi2aamracUO32Gx2Vlb21avXhg4d8m/HUCgUAIBUKmurhHJTk1ImU3z55YadO1cDAL7/PuJTzqZWq8Xi3DYJDOkAz5+nurs7oo4Nn8LQ0GDq1BFLlmzpIjNtdIi+PgUA8MsvGzMyhACAI0f+qK6u/fLLyV1hrEprmJgYhYT4wxbqsDVWUlLG6dNXlcpmf38PLy9nPz93MzMTrMNEEAT5eBkZQj6f4+3tEhYWhHUsWqSysvrUqauXLu3FOpCO5u3tdeLEyW7dvDkczr8dQ6FQysvLq6qqjI07eXGlWq2WyRRYR6HDyGQSAGDbtuXFxWVwgv2BA2fXrp3H4XTFYR4asPPy2LH9AQBSaX5ycubTp8mHDp0jEgm9e/tzOPRu3Vzh3CkEEyih3EU1N6sIBPzx4xcLC0uWL5+Zni6Mi0uClaRTpw6fNm0kPKwLLrj4dM+fp8TGvoyNfWFlZdGzp09ExHQ+v+1z8Wq1uqGhAW4TCIR3rrQyMTGZPv2zLVu2enp6MBjv7m7p69uNwaBv2bJt4sTxgwcPhhmTt98CAKBWq94fUnJy5h9/3Fi0aKqJieFffx382M/sDWjShc54+TLt1KkrO3ag0tpP5ePjkpEhPHjw97lzx2MdC/IOcHzfjBmjL126U1NTV1NTm5YmgFW6CGRhQe3X73XlckFBSVJS+tOnr/btO21hQfX39wgM9PD2dsE6RgRBkA/Q3KyaMWPNihUzAQAom/yG/fvPzJ8/Eeso2phardLcB5FIpHe2BPTwcOdwOFu3/rh9+w9GRobvPM+AAQNiYm4sX75y6tQpwcE9W97XtHwL2K4QQQAAMD0aEOBhaUlVKpUAgE2bDvbq5du7tx/WoWGMybRjMu2GDesDAMjLK0xPF8bHJx86dE5fn9Ktm6u/v7u3t4uhIepB1KFQQrkLKSwssba2uHs3/tix6DVr5vD5bLVaPXBgMMxfwHl6cM011pHqHqWyOTb25aNHL2JjX7q784KDfadOHd6uz8rEYvHo0ePg9rBhQxcsmPfOwwIC/Pv27fvDD9t37PjxnQfo6elt27blxIlTx44dP3fuj3nz5oSE9H77LTQHv/Mkqak5bm6OGRmiyZOHte1nrUbXVrqjubl56dLpWEfRSUyaNPS77/beu/cUjTfUWgQCfvTofrDzXVxcUnx80tdfz6uvb6RQyFiHpl1sbCwGDAgeMCAYNtR69izl+vVHK1b8GBjoFRTkGRTkhepKEATRchkZQibT7vvvIxgMO6xj0ToikTwjQ7Rq1SysA2ljly9fvXz5Ktxes2bVv7W/mD79s5cvE3755ejixQvfeQCbzfrpp+1HjvyyefNWNpu9dOn/uFzu228Bde6eGMiH4nBeF4SNGBF648bj3r39cnIkOBwejUHWTPCCj/ek0vyEhLRnz1K+//5nNpvu5+fu7+/epYYcYggllDu57GwJj+cgleYvXLgxNDRgyZLpdnZW69cvcnCgAQCmTx+FdYA6786d+Li4xDt34oODu/Xq5bts2ecds96fRqMtX/568AWV+r678Vmzvli06H8nTpz8twNMTU2//HJ+ePi4vXv3bdv2o42NNZ/Pf+MtAADXrsU8fBj7xmsViuKpU1f8+OMKAMD48YM++dNCdJi/vwfWIXQq33zz5fLl25ycWGjeqZajUMhr186rr28AAPzxx3WZTBERMR2lld/J0dHB0dEBALB06Yz4+KT4+ORff422s7Pu1s01ONgH/hGCIIj2qK6u/fzzr3bsWGVgoI+G773TiROXFy6cjHUUba93716jRo2A2zQa7d8OI5FIK1YsW7p0WY8e/7pQicVy2LDhu7S0tJ9+ilq7NvKXX47AJaEt3wIAEBXV5XqGIK3k4eEEJ9sbGOivWxc1f/4kPz83rIPSIrByGQAQETEtJSX7+fOUfftOk0hES0uzwEDPoCAvY+N3LyBAPh1KKHdCycmZFhZUe3ub8PAIR0fm5s1Lzc1NT5zYamJipFmri3yiuLik27fjbt+O69PHf8CAnl99NbeDA9DT04Np3/9EoegtX7505crVZmZUFutfe5hYW1t9/fWaKVOmPX78BJ75jbeIj3/W8vja2jp9fQoej79160j7VbWr1QC1vNAJz569SkrKnDVrXCuORVrriy/GbthwYP/+983VRLQEfJQ4ZcrwmJhYhaLIyMggP78IXv0jbyORiMHBvsHBvgCAnBxpbOyLLVsOV1fXDhjQw9/fA33dEATRBsXFZcXFZfv3f4MWUvwbgUAmFufC8VmdDJVKbeWtFpvNmjbts59+2qVUNr3nMFdX12XLlixdujwlJcXf3+/tt0BPLJD/RKNZHzmysaKiGgAwffqaIUN6T5gwGOugtIu7O8/dnTdjxuja2rrHjxPj45Ojok506+bq7Mzu0cObxbLHOsDOBiWUO4mMDKFEkj9wYM/Tp68+eZK4cuUsAMDp0z/C1rpdc0Joe0hLy7lx4/Ht20/c3HhhYUFr1szWiQ4hzs5OEyeO/+23k+9JKGu05jOqq6vH4fA4HM7Kyry9vwBUKhripAOuXXvYu7cv1lF0Ni4uXGdn1sWLd0aO7It1LEhrDRoUDACor288dGj/mDH9e/f21YkfExhydGQ6OjJnzBitUBQ/fZp86NB5uVzRu7dfnz4B3t6tuplHEARpcytW/Lhq1ReoEOf9oqNvwe5PXdzQoUPi4p6+evWqNQcTCJ1zlq9aDTpy+HxXBr/OBw58c/PmYwDAkyeJ1tYWqA/GGwwM9DXzPFJTcx4/Tvj2273Nzc2hoYGBgZ6oIUZbQQllHVZcXPbo0cuhQ0NKSip27fpt0KBesPPmpElD4QHvHNSGfISGhsYrV+5fuXKPy6V7eDifPr1d53L048eHP3v24u395eUVa9dGhoT0MjMzu337blNTk6aH8juVlVWYmZm2nN3XrlQqVVVVdce8F/Ip1q6dRyCgrFnbi4iYNnDg7EGDgvX0UAsFXUKhkKOivioqKlWpVCtW/Lho0RRUE/GfbG0tR4zoO2JE39LSigcPnp8+fWXTpgOhoYH9+/fg8VA3DARBOkhNTZ1Mlj9+/EBUmPx+TU3KmJjYZcs+xzoQ7JHJ5Hnz5q1a9Y7B1E+exF29ei0oKBCHw/35ZzSNRnN1dcUixnanUqmqq+uwjqIL0denjBjRF14+bdx4YM2a2ah12L9xc3N0c3OcPTtcLlfAMdGFhaW9e/uGhga4u6NVcZ8EJRx1z5MniXg8PjDQc+/e06amRng83trafP/+b7COq3NKTs68cuX+/fvPhg0LiYycz2bTsY7oIxEIhBUrlh4//tsb+2tra6hU03Pn/lCr1Y6O3E2b1rPZ7Ldf3tysghtmZqYdEu9rarUaANTyQtvV1taVlVXa29tgHUjntHr17O3bj3Z8Xx3k01lZmQMAJk8e+vvvMStXflFdXatzDyMxYW5uOmpU2KhRYcXFZXfuxJ88eUUszh0woMfAgcEWFlSso0MQpDN7+TKturo2OLgbWlzyny5cQOXJ/2CzmQsXLiguLn5jP4lEqqys/OWXo/r6Bl5enjNmTO+YcTsdT61W4/Horg0DbDb9yJGNlZXVAIDIyN1Tpgxzdn7H7TwCAKDTbel02zFj+hcVlT548OLSpbvr1x8ICwsKCwtydGRiHZ1OwqnV6rf3RkeHjBp1DIt4kHcrLi5LSEjv379HSkr2yZNXZswYhb5NtLdr1x7ExSWVl1cOG9ZnwICeGEaiUqmLi7FsqlVVVUMmk95ZIGlpWd+u19tCoXzTpgNHjmxsx/dAPtnhw+f19fWmTBmOdSCd1qxZ61avnoXqDnTd+fPXS0oq5s4dj3Ugukcszr1x4/H167E8nsOwYSGw+TLS0s2ba3r0+NHQUFcfe3cpeXkPZLKLgYELsQ4EeVN9feNXX/20Y8c76kyRt82atW716tmdIAtTX4+vrPzUdWBKZTMA6rZdH0wgAAuL+jY8YbtKSxPs2XNy3z40+QNLKSnZV67cX716VklJOXoG3xqFhSVwMpaZmYmfn/vgwb3g4DHkDdHRM0aNuv/2fvToVauJRPLc3AIAwJIlW2QyBewyvnnzEpRNbj9qtfrUqSsjR36ZkpI9e/a4qKivsc0mY6u5uVmtVuvpkbFabv/OJ16IttHTI/fpE4B1FJ3ZxIlDzpz5C+sokE81btxAExPDgoISheLNCibk/Vgs+zlzwv/4Y9fo0f1iYh6NHr3ot98uwWIcBEGQtqJUKlE2uZXy84tqauo6QTa57ahra3Um+dseVCoVquvHnLs7b/XqWQCAe/eeff/9z1iHowOsrS0mTRp6+PCGBQsmlZZWTJ26aufO43FxSVjHpTPQ/3ltlJ9fBAC4devJ+vX7YT7tt9+2zpw5Buu4Orn6+sYjR/7o2/fz6uq648e3rlz5BYNhh3VQWKqqqlGr1TgcjkwmYRhGF/9b0AmffTYC9btoV/36dU9OziosLME6EORTTZo01NravKio9Ouvd2Idi04KDPTcuHHx0aObcDjc1Kmrtmw5lJ0twTooBEF03u3bT3buPI66ErXew4cvevVCi0X+QSQSKZQuPu4CZ29vjXUMyGtjx/b383NrbGxKTMzAOhbdwGbT58+feOnS3h49fKKjb0+cuCw6+lZjYxPWcWk7lFDWIk1NSqVS+fnnXx87dgEAEBTkdfToJjodpWk6wrFjF0aN+tLc3PTu3WNz5oR38Rm1KpUKAECh6GE+17GpqamgAJXyaTWFovjsWVQ82+4mTRpy+vQ1rKNA2gAOh/PwcBoyJKS6ura0tALrcHQSlWoyderwS5f2+vq6bdt2ZNmybc+fp2IdFIIguqqgoIRGs46ImIZ1ILokNvYlSii/gUzu0gnlxsbGoqIyrKNA/jFgQE8ymQSAeuLEZTU1aF5iawUEeGzZsnTnzjW1tQ3Dhs3/6adf5fICrIPSXiihrBVEInlExGaZTKFWg8jI+WvWzAEAoIfkHePPP28OHDgbAFxMzCE0WQIAUFdXD0fwkUjYD+1UKpuJRALWUSDv8+RJIlq/3wFGj+6XkJCGmsB0Gj17+hgZGZSXV27fjkZWfLz+/XscOrR+8uSh9+49XbhwI0orIwjyoWpq6urq6l1cuFgHokuqq2uUSqWHhxPWgWiXurr6rlzPiO7atJO3t8vOnWtKS8tLSsqxjkWX2NpaTp489MaNwxwOY8+ek5GRu9PThVgHpY1QQvn/2LvLuCiztw/gZwJm6O7uEAQFJCRMLOzCWrtFXTvXjnXt2l271g6MVWwUm5BSREBAAWlmyGHyeXH7zPLHAUfqTFzfjy+Y/oFznznnmnOfg1NmZk54+EOEUGFh6YQJg62tTRUUqFZWsKFKG0lMTF26dEdOTsGNGwcmTBiEO45EEAgEAoFElJIJXC6PQoGuiUTr0MFpzJgQ3CnkgoWFydOnMbhTgJZkbW1mb2+RmJhKfI0HmsbDo92iRRNnzBh5/vy/v/669ePHLNyJAADSgcWqHTZsnqWlCe4gUiYl5ZOKCsx8qo9MJnE4XNwpsOFyudjPbQUiGRrqmpkZ0WiKI0YsqKysxh1Hygwc2G3r1gU9eviePBm+fPmujIwvuBNJFjjm8SguLtPQUFu//s/Jk4cSawLiTiR3duw4np2dt2rVTH19bdxZfkigptYWX3fHxaW4utqpq1MREvflSCQBQqTWi8TjQUFZ0sEwrM0EBno+fRoTFOSFOwhoSf37d2WxamNj3+npacE3ys3h4mK3ffuSt29Ttm8/7uJiO23aCDqdhjsUAECiZWXlXru2D3cK6fP+fYazszXuFC1GQYHfIkMtVVUSn0+mUFps1EYiSdN5aRwOV3LmJIHvqaoq79+/8t27dCsrE319HdxxpExgoGdgoOerVwmbN//t6mo3YcIQTU013KEkAsxQbmtZWbmjRy8uKiqlUinHj2/y9++IO5HciY5O7tVrqoOD1d69K6WhmozIZLKSEq+1/x08eNTOTkddnfRTjyKRWrGaTMyYNjLSbdWXAM00Zcrq6mpYlqstBAV5PXkSjTsFaHl0Oq1TJ9dNm/7GHUQWdOjgdOjQOhcXu8GDw27efIw7DgBAcn3+/FVTUx2+eWqC9+8znJ1tcadoMRQKapHBlLIyn8UqbcHRGZ0uTWcvkUhIT08KRtbyTF9fx9u7fVVVzf79/+DOIpV8fNyOHt3o5uY0YcLy8+dhbxsEBeU2FRERhRCqqKjavn2xk5NNa1figEjHj1+9efPxjRsHQkK64M4iWUaP7ieB31VWVdVUVMCJOZKLwSgvKWEoKyvhDiIXqFTKgAHdXr5MwB0EtIojRzbgjiA7unXzuXPnUH5+8dy5m+HsTgDA9zIzc9as2W9oCLMWmoLL5bq4yE5BuQXNm7c5N1dO9+9iMCrYbDbuFODHrKxMDQ31YAucJuvatVN4+H4KhRwaujAp6SPuOJhBQbmNbN58KC+vCCHk6mpvbKyPO46cWrRoG5lMXr8+jEaT601467ly5X56eraRkR7uICKw2RxFRQXcKUCDNDXV4VzRtqSrq5WcLO8dF9m2adNfaWnZuFPIiKlTh0+dOmzs2CVRUbG4swAAJEtJCXPPnuW4U0ilsrLyT59yYA1lkXr27FxaysSdAo+amlolJTruFEAsw4YF02gKeXmFuINIseHDe+/evfzSpbtHjlzGnQUnKCi3ruzsvLNn/0UIzZ07dtKkIbjjyLWZM9eFhvYbPx423/sf168/pFIptrYWuIOIVlvLptOh+i+5Kiuri4vLcKeQI46OVrDFsGxbuXLGq1cJ5eWVuIPICFdX+/Dw/QkJHw4duoQ7CwBAgnh6tlNXV8WdQirl5OSbmhriTiGhJkwY5OpqjzsFHjU1LCgoSxEtLY28vKJFi7bhDiLFDA11168P09BQmzr1NxZLTqfnQ0G5dW3ZcqhLFy9iEXTcWeRa//4zt2xZ4OnZDncQiTNwYPeBA7vhTtGg2loOrG0nyf7998mlS3dxp5Ajjo5WHz5AQVnGjRs3QCBN2/BIgTlzxqiqKm3eDKtUAwAQQmjatDUfP2bhTiGtcnMLTEwMcKeQUNnZea9eyenSZCxWrYoKLIInTTw9223YMLekhIE7iHQbPrzXkiWTly/f+elTDu4sGEBBuVWUlZU/fPgKIfTXX2thgQu8BAJBWNimc+d2wEac39u69bCEn5ZVXV0DBWVJpqWl7ubmgDuFHFFWVjI01MvOzsMdBLSu+PiUNWv2404hU0aPDunQod2pU9dxBwEAYJaVlRsQ4GFvb4k7iLTKzS00N4cZyqJxubzDh+X0hBgGo1JNDabQSRklJfqXL1+Tk9NwB5FudnYWu3Yt27//HzlcUhkKyi2vsrJ60qSV3t7tcQcBCCE0adLKefPGwQzx712+fE9VVVlbWwN3kMaUl1fCCYmSLDi4s59fB9wp5Iu5uWFmZi7uFKB1BQV5BQV5pad/xh1EpvTp429qarhx41+4gwAAcLK0NBk3bgDuFFIsP79YT08bdwoJZW1tOmhQd9wp8GAyyzU11XGnAD/N3d1p9+5T6emwgUdz7dy5dP/+f+Rt2TooKLe8ysrqa9f2QQVTEmza9Nfo0SG2tua4g0iiYcOC58wZgzvFD1RUVMN33ZLs1auEgoIS3Cnki5GRXn5+Ee4UoNV16+YNH14trls3b3d3xx07juMOAgDAQyAQ/P33RdwppBuDUQHnfTaERCL1798Vdwo8ysqgoCytDh1aB/93LeLvv9eFhW3KzJSjtS+goNySnj6NOXTokqGhLu4gACGE7t9/QSKRevb0wx1EEuXnF6emZuJO8WMVFZVqajBDWXKdPn0DNuVrY4aGevn5xbhTgLZw7ty/d+5E4U4ha0JCuiBEunnzMe4gAAAMXryI//wZlo1qlrIyppYW1J4adOrUdflcQIDBqIA3hpQik8nl5VU5Ofm4g8iCkye3TJq0CneKtgMF5RZTUFCira0xbdpw3EHAN9u3H1+4cCLuFBJq794zUrEAv4qKsqYmFJQlV58+AWZmsI5emzIw0IGCspzo2zfo4cOXuFPIoIULJ0RERBUVwZdhAMgdGxuzxYsn404h3ZjMCpjM2Ag+nx8T8w53Cgz09LSgoCy9tLU15szZiDuFjNi9e9natQdwp2gjUFBuGXw+X0NDzcXFDncQ8M3Jk+Fz5oyh0RRxB5FEAoHA3d1RKpa+TUr6qK2tiTsFaFBISBdY5LqNmZgYKCoq4E4B2oKGhur27Utwp5BNEyYMOXUqHHcKAEBbMzTUheUamsnYWB/qho0IDe07aFA33CnaWlFRaX5+CXRQpZemptrGjfNycgpwB5EFbm6OpqYGd+8+xx2kLUBBuWX8+utWqVhAQH6cPHm9X79A3CkkFIlEGjGiN+4UYikuLtPV1cKdAjRo69Yj1dU1uFPIFxUVJfk8lVI+5eQUJCSk4k4hg7y8XLhcXmIi/G0BkC/9+8/CHUG6cTjc9+8zKBQK7iCSi0ZTpNNpuFO0tby8ImNjPdwpQLO4uNiZmhrgTiEjxozpf/fuM9wp2gIUlFtAZmbOwIHd3dwccAcB39y8+Xjs2BAyGd7eosXHf7hw4Q7uFD/GYFSoqCgpKFBxBwENiox8TSbDoKJNqaoqV1ZW404B2giVStmy5RDuFLIpKMhLTiaPAAAIubkF9vaWuFNIt5oalpISHXcKiUYikYYOnVdVJV/zLfLyCo2N9XGnAM01YcIKLpeHO4UsUFKiubjYysOOHVBxawFWVqbdunnjTgH+8/Dh644dnXGnkFzp6Z+Likpxp/ix0lKGu7sT7hSgMUePbqTTYWGZNqWoqEAikWpr2biDgLZgaKg7Y0ZoTQ0LdxAZ5OPjFhOTXFFRhTsIAKCNmJgY7NgB6wg1S01NrZKS3E2//Vn9+gXl5RXiTtGmKioqbW0tcKcAzWVkpPv+fTruFDKiWzeff/99gjtFq4OCcnOVljKXL9+FOwX4T20tOyUlAwqRjejRw3fcuIG4U/xYVlYemUzCnQI0xsQETozCACYpy5UuXbxgOlgr6dTJ9cmTaNwpAABthMmsLCwswZ1CurFYtXK4nsPPmjVrlJ2dfFVXY2LewTbdMmDLlgXt28Np9y3D0tKktpaTmZmDO0jrgoJyc0VHJ6urq+BOAf6TmJjaubMUbDeHkaammoaGFGyk9vnzV3NzI9wpQIPYbM6IEQtwp5BHZmaGMGVVfly//igu7j3uFLLJ09MlJSUDdwoAQBu5c+fp1av3caeQbmw2x9BQF3cKSVdUVBoT8w53ijaVmprp4GCFOwVoLharFs7cakH+/h3j4z/gTtG6oKDcXN27+yxZMgV3CvCfjIwvBgY6uFNItAcPXu7ffxZ3ih/78uWrmRkUlCUXny+AHfmwYDIrORwu7hSgjRQWlnz4ALv+tgoHB8uSEgbuFACANqKjoylv80ZbHI/HKy+vxJ1C0mlqqi9fvhN3iraTn1+spESXiulKoHHR0cnbth3FnUJ2ODvbZGfn4U7RumC3q+ZiMMpVVZUpFDj3R1KUlZU7OsIXpI3R0dH8+lUKFvbi8fhWVia4U4AG0emKt279iTuFPKJSKbBjhvwYMqQn/He3EkNDPRaLzWKxYS14AORBz55+uCNIPS6XR6HAbsw/oKBAXblyenFxma6uFu4sbSE9Pbtz5464U4AWYGioq6enjTuF7DA3N759+ynuFK0LZig31+bNh96/h/MlJcinTzmqqsq4U0i0Dh2cNm2ajzvFD7BY7Fev4q2tzXAHAQ0SCARZWbm4U8gjBQUqzFCWHzo6mnDaTetRVFQoLJSCXWoBAM2XkPAhLS0bdwrpxufzYYMTcXTp0klOqskIoefP39ramuNOAVqAnZ3F3LljcaeQHYaGurm5UjCNrzmgoNxc1tam2toauFOA/zCZFdramrhTSLqkpI88Hh93isakpGQ4OdngTgEaw2LVzpy5HncKeaSoqMjjwZRVeXHrVuTBg+dwp5BZHA63rIyJOwUAoC08exYXH5+CO4V04/P5CgoKuFNIgeLislWr9uBO0Uaio5O8vFxwpwAtgMEof/z4Ne4UsoNCITMY5UymLC8TBAXl5pozZ4ylJZyVL0HKyytVVJRwp5B0ERHPEhMleoX49+8znJ2hoCzRqFTKgAFdcaeQU1BQlh/KynQdHfiWtLUoK9MrK6v+YxFvAAAgAElEQVRxpwAAtIUePXw7dWqPO4V0k/D5KJJDV1dLSYn+8aPsb4GQkfFFQ0NNfqZjy7aCgpKoqFjcKWSKiopSeXkF7hStCNZQbq6YmGQ7OwsNDTXcQcA3bDZHURG+Of+BgACPmppa3Ckak56e3adPIO4UoDEKCgozZ4biTiGPSCQkEOAOAdpKt24+uCPIMn19bS4XFpABQC44OMAmK6DtrFw5HXeEtvDqVULXrp1wpwAtQ09Py9fXHXcKmWJmZshiSXTVpZlasqBcXV305cuLFnxCqbB//4ORI92trXVxBwHf6OlRsrPvFxVJUE3Z1ra3pG3b6OPjhjtCY/h8/pMnMWvWzMYdpD4+n5uW9i/uFJKCy+W9epXt72+NO4jcqaoqyM6OUlb+iDvIf0xNvVVUDH/2Uenpt3k8Tuskkh2VlbUcDk9LC/YGaBVVVVmfPzNTUyVoD24DA1dNTWhXgQgFBQkMRhbuFFLs2bNPamo0Nzc4tbTpPn/Or6z8mpp6HXeQ/ygpaZubB+BOIdqbN0mdOrniTtG6bt58vHv3ctwpRODxatPTI3CnkD7m5kiiDnBpx+UWZmQ84vMlaDcUMzM/ZWW9lnq2liwoV1Z+zcp6Ym7u34LPKfk6djRRVVXk8+HsY0mRlVUqEPD4fElZzuXjx1uWll0kraCMEDp8+NK4cQMlc2v7168Tvb0l8ZxEPp+dknLVwWEg7iASgc3mnD8f5+dngTuI3BEIBAIBX3I+d3JyXqqrmzahoPzu3SV7+xCEYHufxrx+nZWfXzFqVAfcQWSTQMDn8biSczQVFCRSKDQoKAORcnPf1NZWaGpa4g4irXJyGNraSnz+T39aASGBgC8QCCSnzWSxmHl5MRJbUCb2gRwzJgR3kNYSH//B0FDX0FASp9ZxuTUfPly3t5fZP35rKC6uevkyq3//driDyA4Oh8fn8ySnzfz8+ZmWlrWEFpQRQmpqxk5OQ1r2OSWckxPuBOB/0ekPnZwGKinRcQf5JivrMe4IopWUMCIj3/TuLYnfAL1/nx4Y6Ik7hWhUqpK8tXIN4XK5kyapOjn1wx1E7mhpfbKwCHRycsYd5Jvy8pwmP9bRcTCJJCnf/0kmBiO+qKjUyakb7iCyiU5n6ulZOzkF4Q7yDY8ny+dFguYzMfGS2NqZ5Js5M0hBgaqlBbupN1119TtNzSrJ6QkzmdlxcZ9wp2jQxImDIyKe4U7Ril6/TuzfX3L3U1FUVJGc96pUSE3NzMjIgD9aCxIIUs3Nuzg52eEO8k1ZWQs3mDCKa65z5/79+rUIdwrwHw6HJ4DlRcUwe/bodu0kdNe78PCHAQEeuFOAH6BSqaNGQTUZAz5fwOfDrjjywtfXfcAAqCa3Fj4fOgwAyAt9fR2oJjcf7MsnPiqVGhLSBXeK1sJkVt6+/aR7d9jpQXaYmxstWzYFdwqZwuPxZbs2BQXl5nr2LK6sjIk7BQA/TU1NRVVVRQIbuBcv3rq6OqiqwoKhko7NZq9evRd3CgBk3Pv36S9fxuNOAQAAUu/gwXNXrtzDnQLIncGDw7hcSTnhvQWdO3cLZpbIGCUlupWVKe4UQJpAQbm5FiwYb2FhjDsFAE1x5syNq1cf4E5R3/37L3v29MWdAvwYj8ePiXmHOwUAMi41NevNmyTcKQAAQOopKdFoNEncOwTItlWrZty+/QR3ihbG4/EvXbobGtoXdxDQkj5+zJo1az3uFECaQEG5uWxszB88eDVq1CJf31Hdu0+aOvW3tLRs4jvwkJCZzXxyLpfL5XLrXsPn869dezBhwgp//7Hduk2cMWNdfHxKM1+lmVgsWO9PWg0dGpydnYs7xf+orKyOjk7q2tUbdxDwYzSa4t69Kxq6dcWK3c+exda9ZuHCbfPnb/nh0/6wScnLK/T0HC7yX2Tkm+/v3yKtMQC4tG9vHxjoERw8hXiTd+06Yd68LU376I+JSfb0HB4b+9/3QPv2/fPqVYLw4l9/XejceYzIYzAsbNP27ceFF+GjHwAgIaZO/W369LXCi5WVVdOmrRFeZLFq/fxG//33RYTQxIlDvl9/YPLk1WvXHvj+aWtr2bt2nezde5qPz6hBg8JEdjAAEIeHRzvZW7rq2LErkyYNxZ0CtDCBQMDj8cPDHw4eHObjMyo4eArReLaehlpgIC1aeFM+OTR69OLU1MxevfzHju3PYtW+fZuio9Myi3OlpWXPmrX+5Mktxsb6xDUCgWDFit0PHrwMDu48ZEgPFov96NHrpKQ0d3c8OwPy+fwRIxaMHz9QkhfjB40wNtZfsGAC7hT/49q1B4MH98CdAoiFTCbb2Vk0dKtAIFi//s8LF3b81HqFmzb9zeVy16yZ3ch9tLU1tm5dQPx87NhVDoc7ffoI4mK7drbivxYAUsHGxpz4wc+vw7BhwXl5hffuvZg2be2GDWG9ev3crqru7o5qaiovXsR7eHzbvzsy8k1VVbWPjxtx8dmzOC8vFzqd9v1jPT3bGRjoED+Lc5wCAEDbCAry2r//bGVllaqqCkLo+fO3cXHvS0oYOjqaCKHo6GQ2mxMY6EHsR02lUjQ01MR52gMHzl69+mD+/HG6uloJCaliPgoAkcrKyo8cubx48STcQVpGeXnlzZuRN25AHVDWODhYjR7db+HCbb/8MrBjR6eMjC8qKkq4QwGJBgXlZklK+piamtm3b+CGDXOJa4YODW6pJy8tZZaVlde95uLFiAcPXi5fPlX4KiNG9G7tNXAFAgGJRBJ5E5/Pz8qSrPmt4Gd9+vQlMzOne3dJWWLi6tX7x45twp0CiIXFqp02bc2pU1sbukNpKXPduoO7dy8X/zk/fcoxMzNs/D50Oq1Hj2/v2PDwh2w2R3ixxTXSAALQNqKjk4qKyoidUgIDPYmP/sWLt2/adMjHx+2nahxUKrVz5w7Pn8eFhY1BCGVl5WZn59XU1C5bNpUotXz48GnFimkiHzt+/CDhz+IcpwAA0Da6dPHavftUdHQycX5bZGQ0QigqKnbQoO4IoefP43R1tRwdrRFC58/f1tfXHj68tzhP+/RpbO/e/sOG9UIIdenSSfw8rddzgD6J9NLSUm/f3uHs2X9Hj5aFRYf//PP8zJmhuFOAlsdmsyMiokxNDefOHYsQ8vf3aJGnhbZLhkFBuVmuXLlPp9MWLZrY0B0OHjx39eoDEgkNHNht9uzRJBIpPT17795/3r1LJybTLVw4gViC+fLlu9u2Hdu8ef6ePaeLisr27Fk2e/ZGhNCAAbMRQgcOrPb2bn/u3G0HB6shQ3rWfYm6B+fly3dPn75ZXFxmamo4cmTvwYN7kEikgwfPffmSb2ioe/36I4FA0LOn35Ilk6hUKnHm+I4dJ16/TlRRUXJ3d5w7d6yJiUFWVu6wYfOXLZty/fqj1NSsjRvn+vq679hxIiYmmcmstLAwnjUr1M+vA0KoV69pCKF16w6uW3dw0qQhs2aNQgjduRN15Mjl3NxCfX3t0NA+o0eHfP/bPXlyEpYwkxDW1mYrV+5xdrY1MtLDnQU9evTKz6+DlpY67iBAXBQKpZFb7ews3r/PuHz5LjEeq+v9+/SdO0++e5euoqLco4fP/Pm/0Om0JUu2JyR8SEj4cOtWpJub49GjGxpqpkS+XEVFlciWqq59+/65cuXeyZNbLCyMWaza/fvPRkQ843C4dnbmM2eGEtM2ly/fxWCUu7raX758T1tb4/Ll3c3+OwHQdDk5BZ8/f617DZlMnjJl6JMn0Y8fvxk0qHtJCWPHjhPPn7/l8Xienu0WLZpoavqt2nvlyr2LF+9mZ+dpaan36OEzb964Ll06RUQ8Kyws0dfXiYx8o6GhVlhYkpKS4eRk8+LFW4RQYKBnvY/s69f3jR69hMmsGDu2//z5v4g8TmNikvfuPZOWlq2np92tm/e0acOVlZXqdSfCw/cJz7gCAICWYmpqaGVl+vz5265dvblc7osX8RoaapGRb/6/oPw2MNCTRCKxWLWpqVkXL97dv/9c3Q99YnbFxIkrP37MMjLSmzJlaO/eAQghOl0xM1PEvJnc3IL9+8++eZNUU8NycbGbMWNkx47O3/ccDh++dPfu869fi/T1tYcODR47tj9C6PXrxNmzN/Tu7f/qVSKLVduhg9OiRROFm/GIHMdBn0Rm9OrVGXeElpGenp2fX7x06RTcQUDLy8zMjY9PraqqKS+vVFdXrXtTQ+Omhw9fnT59IyPji6qqckCAx7x5Y1VUlBFCwcFT+vYNzMsrfPYsrn//rsuXTxXZeCKEmMyKZct2PnsWp6OjGRY2RjhPSGTfsl4fNSLib01NKB3gBAXlZklOTrOzM1dWpou8NT+/uLCwdPfuZfHxH3bvPmVra967dwCZTKZQyMuXT6VQyPv2/bNs2c5z57YT9+fz+Veu3Pv777UcDtfERH/r1gXLlu08eXKLgYGOhoZqWRkzJyd/woRBwgoyl8vl8fgIIQUFKplMPnHi2vHj4WFhox0drRMSUv/44xifzyfqOPfvv5gyZdjVq3tSUj7Nm7elXTvbgQO7MRjlU6b8ZmdnsWfPci6Xd/jwpVmzNpw//y3MiRPhmzfP19JSJ+qMxcVl06eP0NHRvHgxYsmSHdeu7dXT075wYUfv3tMWLpzQs6cf8Ud4/jxu9eq906eP6Nq1U2zs+507T6qrqxKrldX97aCaLFGOH99cb6luXE6durF2LZxDLTXodNrx441NJ9fUVJs9e9SyZbs8PV0sLU2E15eUMGbO3ODt7Xr8+KbCwtLNmw/V1rLXrJm9evXMgoISIyO9RYsmKihQEUINNVNKSiJaXSUlmsiWSniH+/dfnDlzc8+e5cTgbcmSHbm5BStXTtPW1rh9O2r27I2nT28lFvF4+/aDjY355cu7ampgoViAWd++gXw+/99//2c/H2trM2KKsUAgWLhwW1VVzc6dSxQUqHv2nJk7d/PFizupVOrhw5eOHLkyceJgPz/3wsLSnJx8KpXq6+tGpVJfvIgfNKh7ZGT0yJG979yJevIkxsnJ5vnzt05O1rq6WvU+sg0MdI8e3TB37mbipb8/TuPjU+bM2TRqVN/Fiyd9/Vq0Z8+Z/Pxi4bo0wu6EoaEujr8fAED2BQV53r4dRSxwwePxfv31l23bjtbUsL5+Lfr6tYhY74L40F+7dtb3H/rl5VXz54+j02lXr95ftWqvqqqyv7/HmDEh69YdDAvbNGPGSOGCWiUljEmTVunoaK5ZM0tNTSU6OklT89tpIvV6DsXFjAEDutrZWcTEvNu9+5SpqYFwmrOVlemCBePz84v/+OP4vHlbLl3aqaCg0Mg4DvoksmTNmv2rVs0gPj2l1O+/H12yZDLuFKBVqKmpBAZ6/Pvv00mTVs2YMbJ7dx9h6amhcVNFRZWLi92UKUMLC0t37TqlqKggnG156dLd5cunEt89NNJ4vngRP3v2qLFj+1+8GLFq1V5nZxtjY/1G+pZ1+6hQTcZOitsySVBRUVVeXpmWlu3sLGLhTn19HaI65uJi9+DBy1evEnv3DrC2Ntu1axlxBw6Hu2LFbgajXHgkLF48STh/R01NhThBhhjdMZmVCKG647EVK3Y/evQaIbR374oOHZyOHLkyb944oufh4mKXm1sQHv6IuGhoqDtjxkiEkK+vu5OTdWJi6sCB3c6fvyMQCLZvX6SgoIAQcnCw7NFjcnR0srm5EXFya/v2DsLX2r9/FfGDg4NVr15TExM/du/uQ7QCqqrKREKE0OHDl3183KZOHY4QsrW1SEhIPXXqhnD7i7q/HZAcdLpibm4ZnU6jUhubbdranj+PMzc3qlt2BBJOIBB8/VrU+EHt7+/Rr1/QqlV7T5z4r/R87ty/ZDJp/fowOp3m6GhdVFS6deuROXPG6OhoUqlUOp0mbFIaaqaIE//roVKpIlsq4pqPH7PWrTs4f/44YrnYmJjkFy/eXriwg1ig1s3NkZhx+euv44lS+Pz544gzOQDAS+RXsIqKClQqlcvlvXqVkJycdvz4JldXe4TQ6tUzhg2bHxkZ7evrdvz4tdDQPsSnv5CKirKnZ7sXL976+3dMTk5buXI6j8ePjHwzZcrQV68SiDl0hLof2ZaWJsLRr5qaSr3j9MCBc126eM2bNw4h5OpqT6FQli7dUVlZRdxarzsBAAAtLijI68SJ8PT07MjIaB8ft+Bgvz/+OPbqVUJOTgGdTvPyciU+9Nevn2NqamhnZ1HvQ9/d3ZGo9np7t584ceVff1309/fo37+rhobavn3/jB+/vEuXTqtWTdfUVL9w4U5VVc0//2wjGsAOHf7bxqZez2H58qnED76+7k+eRL95kyQsKPfpE6Ctramtrbl27eyhQ+dFRkb7+3dsZBwHfRJZMn36iHXrDmzcOA93kCa6fv2hq6tdI3uoAKlmbKy/fPm0ESN67959etmynXZ2FmvXznZwsGpk3EScC0JITc2Kjk4WXgwI8BAWgg4ePNdQ4xkc7PfLLwMRQjY2ZnfuRMXFvTc21m+8bwllJckBn0zNoqxMr6mpFbl9DUKITP5vMQodHU0G49uCyJGRby5duvv581eifldSwiAKyiQSycrKtKHXIhZELylhCq+ZM2d0UJDXmjX7EUKpqZksVu2ff54/fPgScSuLxRY5dVpTU52oTScmppaVlffrN7PurSUlDKKgbG39P0k+fsw6cSL8w4dPxBnuJSUMkSHT0z+PHz9QeNHR0erRo9c8Hu+Hvx3A68OHzGPHrq5ePVOM+7aWY8eurlgxHWMA8LNYrNrJk1ffufN343f79ddfRo9e/NdfF4TXpKV9dnCwEracjo7WAoEgOzuP2D+nroaaqYZeq6GWisViL1y4jcWqtbExI65JSEhFCM2cuV742IqKKlvbb7ufWVgYw8gNSIjbt5/m5OTXu7K0lMnlcjU11dLSsikUinD2nKWlCZ1Oy8zM0dXVZLM5nTvXX/WFqLwcOHD28eM3RGGFREJHj16JiHhWWVkdEPBtsTzxP7IFAkFSUpqCAjU4+NvZr8S5U6Wl37or9boTAADQ4lxc7LS1NZ4/f/vkSXRY2BhlZaXOnTs8eRKTn1/s7d2eRlMkPvQ3bz5EIpGIk5zqfugLkUgkf/+OR49e4fF4FAolMNAzIMDj9u2nW7YcXrPmwJ49y5OSPjo6Wgm/TqurXs+hpIRx8mR4dHQycfK4yK6LhYUxnU7Lzs7T09NqZBwHfRJZYmysL73V5PLyyqNHr8JefDKMyax4/z7D19d9376VCQkf1q07OHv2xps3DzQybmKz2Zcv37t370VxcRmfL2CzOcL71O0BNtJ4CldQVFKiq6goMRgVjfctoawkUeDDqVns7S2jo5N/avHZU6eu//nn+Xnzxvn4uKWkfFq9eq9wVz0SidTIauV6etra2hrR0UnTp48grjE3Nxae+kQcYytXTndz+28ekMgJpyQSIl6Rx+M7Olpt37647q1qaipfvxYhhCgUsvDK+PiUGTPWDxzYbfPm+TSa4qhRixvZCbDur8Dn8+te3zZrsTs4WMKa7z+re3cfFqtWuCN223vw4KWVlamw2AekAoVCFpafGkGn0zZsmDtlymodHU2i51GvASEairrfwAk11EyJfKFGWioGozwwsCuZTN669cj589sVFRV5PB6ZTD59emvdZaDp9G9TQUWGkRwGBjp4zycAbYlKpXx/bmx0dBJCyNnZJi0t+/tPXhKJRPQKRK5yHhjo8fvvR44evdK7tz9xOpGVlem+ff8YGOg6OFgR9xH/I1sgEPB4vJEje48bN6Du9draGsTSz3W7ExJIS0uNRlPAnQIA0CwkEikw0PPcudsMRgVxDlNwcOfNmw9VV7OImcLEh/6mTfOUlZWElQjhh349PB5f2ACSSKR+/YKKikr37z/LYtXy+YKGdo+o23NgMivGjVtmZma4aNFEfX3t1av3iRw68fl8Lpf3w3GcRPVJKBQyrF/UfA8evOTz+cHBUraq8qpVezZunIs7BWhF+fnFJ06E+/q6E9OQ168PmzBhRVzc+0bGTYsX78jI+Dx//i8ODpbnzt2+e/e58A5k8n89wEYaz3oEAkEjfcs3b5LarKzUIoyN9aQobRNIdC9f8vXpE8BkVtSdeVdVVS2ciSzSpUt3hw/vFRra19LSxNRU9NZSBGIMVl1dI7xmwICu8fEf7t377ygV9k5sbc2pVOqHD590dbWE/xpfU8bJyTozM5dKpdR9iMhTa8PDH9nYmC1fPtXR0drKylR4RJDJZBKJVF3NEt7T2tr0w4dM4cX4+A+WlsZith0tJTU1q+FyN2hQv35BuKrJxHL+U6cOw/XqoGkUFRVXrJgmzj3btbOdPHlofn4xcdHGxiwj47Nw5e6EhFQymUysa0yhkOs2euI3U420VAghXV2tVatmhIWNKS+vOn48HCHk5GTD5/MzM3PrPrOqquhStaQpKCghhqBAHgQHd544cUjda1is2mPHrunr63Tq5GpjY8bj8YiyMkIoJSWDzebY2JjZ2JhRKJS6Jx4KGRjoOjpaFxeXde3a6f9fwq+khBEQ0FHMSHWPUzKZ7OBglZycXvdQ0tXVqjuKkGRlZRW1tRwx7ggAkGiBgZ7FxWUeHs7Et87+/h3ZbA6Xy/X37yj80KfRaF5ero1/6Ccnpzk6Wn3fghHXODhYpqZmlpdXNh4mKiq2sLBk+/bFHh7tzMyMVFWVRd7t7dsULpdra2vehHEcLjweX9idA03Wo4cvny+Ij0/BHeQn3LwZ6eXlCmtYyTZDQ90JEwbVu5JCoTQ0bvr6tej587jFiyf16OFrZmakr6/dwBOL23gSpL1vWVdeXlEjczFlAMxQbpYuXTppa2ucOXOztJTp6+teXl55+fI9Z2eb9evDGnqIkZFeSsqnrKxcBqN8376zRGnA1lbEOkTW1qZUKvXMmVsjRvTi8fiurvaTJw99/vzt6tX7Xr1K8PBoV1FRdeXKfeKQ09BQGz263+nTN5WU6J06uRYVlT55ErNq1fRGzpAKDe1z82bk/Plbp0wZqqam8upVoo2NmXBXzXqZnz2LS0r6qKREu3AhgsPhFhSUEK9ra2t+48ZjR0ermppaHx+38eMHLV++659/bnl7u0ZFxT17Frd58/ym/31B29q580SHDk5du3q38etevBhhZmZoYADzHaQMl8t98OAlsRn6D02cOPj587fEz6GhfS5ejNi06e/Q0L6fP389cuTy4ME9tLQ0iNM+bt2KfPYslk6n2dtbit9MNdJSEdN8iHZy3ryxmzcf6tPHv3PnDq6u9mvXHpgzZ7SlpUlaWnZBQUm91WYBkATl5ZXE+YOfPn2JiIhiMCquXn2Qm1uwe/cyKpXq59fBzs5i06a/FywYz+cLfv/9iLW1WWCgJ4VCGT6818mT4VQqpVMn16KisqioWOGup0FBnkVFpcSyywihnj39/v77YlCQl5iR6h2nM2aMmD9/65o1+wcN6s7hcO7ciZoyZZiJSWNfmQMAQMvy9m5Pp9OE35PR6bTAQM+8vEJitgTxob9s2c6BA7v17OlX70M/Pv5DZOQbNTWVqKjYZ8/iduxYghDau/cMg1Hu7u5YVFR24kR4cLAfnU4LDe1748bjuXM3T5o0RENDNSoq1t3d0d+//tlaxBze58/j2rWzffo0Njb2nb29pfDW69cf+ft3zM7OO3jwvI2NeUCAB4VC+dlxHJB2xElC0iIj40t4+IOjRzfiDgJal4aGWm0te86cjUFBXiQSOnPmlpmZobu7o6Kigshxk6amGp1Oi45OtrU1T03NPH/+TkVFVXV1jbKyUr1nFrPxFIK+pbSAT6nmmj59RE5OQWRk9P37L9XVVfz9OwqXpBBpxYppa9ceCA1dZGZmOGPGSB0djfPn73TuLGJakLa25qpV0/fuPRMZ+SYw0NPV1V5JiX748Lpjx67dv//i9u0oFRUlJyfryZOHeHu3RwiFhY0xMNC5dOnuoUOXdHU1AwM9ORxuIx0RQ0O9kyc379p1avXqfVwuz9HRqu5pVnX98suAzMycGTPWqampDB3aMyxszOnTN2fNCqVSqWvWzFq9et/s2RsNDXXPn9/evbvPkiWTzpy5tW/fPwYGOkuXTpG6c3nk2dy5Y5cv3yUsKIeEzLx168/WflEul3f16v3z53e09guBFsfhcPfsOSNmQZlCoWzYEHbw4HliguTevSv27j0zfvwKVVXlfv2C5s4dS9xtypShWVm5S5fu1NJS//33he3a2YrZTDXSUtW9T//+XW/ceLx165GDB3/bv3/lwYPnDx48X1bGNDc3HjSoW7P/JAC0vIcPXxFrR7x5kxQb+15HR9PTs93Wrb9aW5sR3+zu2rV0+/YTc+ZsEggEnTq5Ll48iTgxaMGC8draGlev3j9y5Iqenpa/f0cW69uuD0FBXkVFZcKJHpaWJu7ujh4ezmJGqnec+vt7/PnnbwcOnAsL20Sn0zp2dKq74BUAALQBGk3R19et7hdjwcF+nz7lED+TyeT9+1fOmLHu+vVHFy9G1P3QJ5FQjx4+Bw6cy8nJt7Aw/v33hcSTaGtrPHr0OiLima6u1siRvadMGUYsgHvkyPrdu0+vWLGbSqXY2pp7erp8H8bT02XatOG//35UIBAEBHhs3jz/99+Pvn37bTpqUtLHixcjeDx+p06uixZNIFrsnx3HAdmwevXeoUN7urs7iXFfnDZsOLh370rcKUCry88vevIkmsGo2LPntIqKkpeXy5w5o4muo8hxk5ISfdOmedu3H7969b6rq/26dXN27Tp57tztyZOH1ntmMRtPIehbSguSyAnY4eFBgwad+NnnKixMTE+/5+e3qIWySbSePSeXlDCIBVyI9QpJJFJMzCXcuQDq23fG5cu7vv9aDJc7d8K6ddtIo2ngDvIThg2b/+nTF319nR07lgj3emol27YddXGx69s3sFVfpaVwudUREQtCQv7CHQSzjh2HEsMzgUBALP0Fop4AACAASURBVAvF5/Pj4q7gziUvFi36IzS0T+P9sLb0+vVec/PORkY/XlC7nuvXJw4YcJREkr7z19rApEkr3r79QBxlwkWNTUz0r1+HzXBa0h9/HHN2tunXLwh3kG+Sk8/R6Vq2tr1/9oH37y/389uuogLb1EiBvLynX75c9/ae87MPjIs7rKvraG4u1ve4QGjIkLmfP38V9liIBTrd3R2PHdvUxklev06cPXvDjRsHjI312/ilW0ps7LuzZ/8lJnFLAiYzOy7uSNeuG3AHabqoqFgnJ2tis7KgoPFubvaSVrpdtWpP794BxOoxUqS2lvHo0eo+ffbhDiIdPDyGCVtI4gctLfX794/iziX1wsI2TZs2XHhSIHYvXmy3te2lr+/6sw8MD58waNCT76+HUVwTdenSiVhBWLiUsK0t7CcGZIGv7+isrFwymcxi1TIYFa36WikpGSUlDGmpJgMhZ2cbYm6jsOfh4+OGOxQAMmX69FAdHU0SiUQmk4luBoVCGTMmBHcuAACQMhMnDqbRFOvus6etrTF9+nDcuQBACKGAAI/09M9lZeVBQeOrqqq/fMkvKJCgVaovXYqwtbWQumoy+FnExul1B3cBAZ64QwEpAAXlJho9up+hoY7wIplMDgwUd/FBACRTaOhCD49hHM637Ymqqmpau6C8deuRWbNGtepLgNYwYkRvGk1BeFFLSwPqXAC0LG/v9k5O1nWvsbQ0Hjy4J75EAAAglfr2DbS0NK57jZOTtbe3O75EAPwPHx+3Pn2mV1VVI4Ty80tiYkRsqItFTMy7168Tv9+lDcierl071d3yzthYPzS0L9ZEQDpAQbmJrKxMfX071LloMmhQD6yJAGiueuVjPp9fVsZsvZcLD38QHNzZwsJYjPsCyTJgQDdLy/9OrLa3txC5EDwAoDnGjOmvqalG/EyhUIYM6amgAOtpAgDAz6FQKCNG9KbTFYmLGhpquL4F9/ZuHxNzSXrXuwCtpHv3iVwul/iZzWY/eRKDOxFCCDGZlefP396+XVKWNwGtavDgHlZWJsKLnTq52ttbYE0EpAMUlJtu9OgQYg9fYsN0Y2M93IkAaJatWxe4uTkIZ56SSKT8/KJWeq3i4rJ//rkF01ql18iR3yYpa2qqw/8jAK3Bx8dNOEnZ0tJ4yBCYngwAAE0REtLFwuJbrcTZ2cbHB6YnA0kREjKDyawUXiSRSJmZucXFpVhDIYTQlCmrli2bgjsFaCMGBrpdungR610YGemNHAnTk4FYoKDcdFZWJr6+7gghMzNDmJ4MZACxP8mcOWNNTAyIa4qKylrptbZsObxs2dRWenLQBgYM6GZlZQbTkwFoVePGDdDUVKNQKEOHBsP0ZAAAaBoKhTJiRC86XVFDQ23UqD644wDwnxkzQgMDPc3MjCgUCp/PRwgVFpbExLzDm2rp0h2LFk0itgoEcmLYsF5WVqYIIS+vdjA9GYhJ6gcnJWwOEghwvXq/kX2exr3z7upN09EsqWVjySAgkXQUFUhYXhtIHgFCJWwOqRkHRfCQHp5dOh07duX587iSWnZrvLEfP45WN9a3dLFrgScnkbQUFeT2m7GiWg4ZYWsAQ0L75B26NGB0CK7WDyHERyS9Oqs5A9Diqnn8mv8/EbXt2bg52nVwzs8v6hLSBeOBRiWTNaCcDQBoHiaHy+Xzcb16514B/1x/pK2t4eDhgrE5VaJSlSly22+VU42/33x7+vn29MvNLUhISH36NDo3t6iwsOROVKxXV+82zPg/zpy5ad/R2dbdsWlHComEtBUVWyGX7Gv+QL45yOqqvj19mRG8kNB+GBtJRCLpKMLgTmpI8fDgr8y8Z8VMS2WlnBoWxhhGy2d9QGhh0idcAShkUkktx1VdZZipnpuGKq4YksPR0Yokl/X1RGbVpdyiJGalDk2Bx2/251CvIOMgHy6d1hrv7SplZZUgnxZ5ZhIiMTicdhoqw4z13DXl5f2fW1N7+nNBDKPCTIlWxsZW6kKGhua/hZ1H6Dy+BtCArphVzfLRUp9iZaRGpeCK0cYMDLQpMBxtfREFJTe/lvIFqIbHw5ljZIgGQkveZ2GMoE9XLGFzeuhpjTKTtaU/NTXVFGHcAkArO5dT+KCwTFtBoQhjkQIh5VnjWFhHbQghJQqFTCb1N9DqXWd3dylCoZANpTM5Fn9/yosqYVoq03Nqan98bzNTNMZUm8tTqmEx1VRwvVG5XB7P0Y5GU4xqagATZVpOdW2grsZkS6OWTiezksqrLucUJTArdRUVuPimS6IOrkYdXPeWsxC+dtJUiZZZzfLX0ZhpLQs7LRkb65PJslycksqCMpvPD0tID9DVnGxppASDaoQQQiUc7rHs/NFmBt5aarizYPbhQybGRhiXGEbFqez8XoY6/Qy1cWfBoJTDPf45fwRPr7OOBu4srS67mrUmJWuwsV6ArgZFPr88+V9sPr+wljMlNvVQR3s5mURZUFDK42Gb5CUnznwuyKiqCTHS0ZKPN9UPMTncj5U1v3/8vNTeHHeWlsRgVLDZHNwpAJBl2z5+ViSTR5jqa8jN976NK+NwX5aWF7O5Y80NcGf5aTwePz+/BHcKKcDhC+Ylpvtoq0+yNJLDOemVXF52de3ipIzfXazJMFr5keiyilOf83sbaPeRy4H892p4/Jwa1qTY1APudtJe7svLK+Q3f6qfBJPK/54FSRm9DLRsVejS/vZqQToK1FGm+mc/F7xlVIpxdyBTEpmVJ7PyR5sZ6Mhr4UNbgTrKVP9SblFsWQXuLK2rhM1Z9S5zhpWxAU0BqskERTLZVIn2q53p+NhU3FmAjLiUW5RRVdPLQBuqyUIaClQvLTVlCmX7xy+4swAApMbOtBwameyjrQ7VZCEtBWofA+1PVTUXc1tr72uA3cKk9G56mvaqSnJYTUYIqVIp7dSVXTVUV7zLxJ1F0iWVV53M/jrGzADWeRBSopDtVJWHGOv8mpiOOwv4Aelr4O4XlFoq043pNNxBJFE/I+3L0DWRP5dyi/oawdlnqK+h9rW8YtwpWtc/nwt6w3fXDehnoH0yOx93CiD1Kji858XMXgZwoIngpaVWzuV9qKjGHQQAIAXSKmvKOBxvbXXcQSRRsIH2qxImE+PCZaDVPCosM1eimyrJe73CXlVJQ4H6srQcdxCJdjm3qA8sIyOKLk3RTk35bmEp7iCgMdJXUE6vYimTpS9221CnUtMqa1hwKrQ84fAFKeXVMI2OeP9/lPX3fwKzCr6+boimIvU91LlAs+XUsAQw+79hZBL6Is5akAAAuZddzSIjaE8bxBMIcljQnMqgtMoaulxOTP6eIpmUUVmDO4Xk4gsEicxKGNw1RIVMTquA949Ek76WjsHlqEHtrGH6NIViNs79LkAbK2azDeT+C3Ah2X7/CxBiCwTqcNJoAzSolEq8m6cBmVDK5mgqwFHWIDUKpQwWHQYAiKGMzVWD5rRh2orUklpoTmUQk8vVUIR6BSLWy2Jw4E3eoGI2xxDOvG+YhgIV+pwSTvoKyhyeLK9p3XxsgQDBXAB5QkIkFhTR/p9sv/9JCFXzoAVskAAhFleW56eDNkIiwfuoEXyEZHpzEQBAi+EjgRxulC0+ngDBCTEyiSMQyPY2XOLjCwRc+Es0jASDl0YJEIJVgSSc9BWUAQAAAAAAAAAAAAAAAGABBWUAAAAAAAAAAAAAAAAAYoGCMgAAAAAAAAAAAAAAAACxQEEZAAAAAAAAAAAAAAAAgFigoAxkjZ6eFokEO1wAAGSWuroKmQwf3wC0AGVlOoUCfQYAABALmUxWV1fBnQIAAKSDhoYq7gitC0akQNYUFZXBltIAABlWXl7F58OW0AC0gOpqFo8HfQYAABALn88vL6/CnQIAAKQDk1mJO0LrgoIyAAAAAAAAAAAAAAAAALFAQRkAAAAAAAAAAAAAAACAWKCgDAAAAAAAAAAAAAAAAEAsUFAGALQRNotVXVmBOwUAAAAAAAAAyDUej8csLcOdAgAgxaCg/BNKi4qa/FgOm13JLG/ROABIk4fXbkzt2Tfh5RvcQUBTCASCsqLiJj+8OY0nAPKjOV2FZh6kAAAgS8oZDA6H07THwqgNyIOsj2kzgvtfPnwUdxCABwzuQIuAgrK41s+YsyVsQdMe+/rR49khQ5LeRLd0KACkRgWDwePycKcATfE1+8vCYaOvHjvRtIc3p/EEQH40p6vQzIMUAABkybkDf4X1H1LVpKIwjNqAnKitqamprsKdAuABgzvQUqCgLK70pHcICZr22K/ZX6or4Ex/AIBUKmeUFeblNfnhzWk8AZAfzekqNPMgBQAAWZL54WOTJzHAqA0AIPNgcAdaChV3AEmU/u79P7v3Z6Wl05WVOvj5Tlu1LKz/UD6f/zX7y1jfIDKZfOr5Y4TQwbUbEl6+Ztey9QwNewwbFDxsCEIoOTpm69yFzh4d2KzarLR0C1ubLgP6XT50FCF04Lf1B35bP2za5EETf8H9KwLwc7gczrXjp579G1HOZBqYmvQYPLDb4AFkMvnSX4cjLl7uM2pk1O27lUyGpb39LwvnWtjZEY/Kycw8u/fgh/gEBUVFFTU13L8E+DEej3fh4KHnd+9VV1YZm5uNmTdHR19vw4wwhNDj8JuPw286e3qs2LezvKxs5+IVOZmZJBLF1Mpi6LRJLl6eCKFLfx2+fvKMb3D3Tykfir/m9x45/MXdB983ngDIOWZJ6bFtO9/HvkUIWTk5zPhtecKLV993Fd7Hvj3+x86ir/nKqip2Lu1Gz5lpYGbK5XInBHTXMTS0dnRIjo5FAsHSPdu/P0hx/4oAANAWvh+1HVy74X1MLEJoTv8hCKHVf+1zcGsfceHSv2fOVzKZGro6Hf07D5s+WVlF5fvmdPDk8TBqA9Ii433KhT8PpyUlq6ip2ru6jpw5VdhJ8A3uXl7KzHj/XllFJSCkz5DJEygUCkKoqqLi3P6/Yp48ZdfW6hsZ4f4NQFsQc3Anss+JEFo3fXZaYnK3wQNinz6rYDDnb914/PcdMLgDQlBQFmHfijUlhYW9Rg6jkClsNgshNGD82FM796hpqPcOHUkmk4i7qaprdOoaRKUqRD95emrHHnUtLZ/uXYmb3se+9enR1atbkImlhbaenmsnz6Q3MT49u5vb2Dh2aI/1lwOgKfatWhf7NMq1k6enpWVaUvKJ7bsK876ODpuJEKqtYT28Gt5z2GAOmxNx7uLORSu2nT9FU1Iqyvu6fnpYdUWFm5+3jp7+m8gnuH8J8GNRtyNunz1vbmfbffCAjPcp6lqaqpoa3QYPeHTtho2zk0dggJ6xIUJIVUNDSUU5oE9vVnXNq0ePty9cuuPyOR19feJJ3jyK7DNqJIVKDQrpo2to+H3jCYCcO3fgz9inUR0DOpvZ2nx6915dS8vG2en7roK6lqa+sVF7n04FObmxT5/lf/7y+7lTxDOU5Oera2kMnTqBTKEYWZh9f5ACAIA8+H7U5tOje8a7lIKc3IHjx9KUlHWNDBFCSioqDh3cNLS1UuMT7126wudzJyz6dr523ebU0a09jNqAVMh4n7JxRhhdVSWwX282q/bN48jkmNgtp49q6OgghF7ee+je2Xfo1ElvHj6+fvyUhpZm8PChXC5367yFmSmplg721s6OH+IScP8SoC2IObhrpM+JEIp5EtVv1IjKiko3X2+RlTEgt6CgXJ9AIKiqrFSk03qNGKpvbExc2XPY4NO79qpoqA8YP0Z4z18WzCX2bfAMCtgyd0HCi1fCgrK1k+OcDWuF93Rwd0t6E+MR0Nm3Z/c2/4UAaK6Ut/GxT6Pc/LynLl+KEOIL+Ounz7l/6cqwaZOIO4xf+KtPj67EtLun/97JTP3o6O5288zZ6oqKUbOn9xs7GiGkbaBPTPoAkqyqogIhZN/eZfCkCSTSty5C5149H127YW5vK2wAyWTy0j07EEI1VdUGZiaXDx19HxMb0LcPcWvv0BGhs6YTP4tsPAGQc5XlFQghv+CeRMuJEDK3s/2+q2BqbbV45zaEUHVlxb6Va5PexBR9/aqlp4cQUqDRluz8Q01Tg7jn9wcpAADIPJGjto7+fhEXLhfk5PYcNkRTV4e4Miikb1BIXz6fX1HGWBw6Lv7Fa+GT1GtOYdQGpMKFg39zOJzF69cYW1oghCzsbc/s3h958/bACeMQQsYWFou2b0UIeXUJmD94ZMzTZ8HDh8ZEPs1MSXXv7Ltg22YymZyakEhMUwWyTczBncg+p97/T2OfvnKZm58P8TMM7kBdUFCuj0QijQ6bdXzbjiWh43x6dB80abyhqcn3d+PxeHfOXnhw7Xrx13ziGmZpqfBWG2enNowM/oeDgwWs6dOyUhOSEEIJL14TJw8K1VR+28mBqvCtJdEzNkIIMYpLEEIfE5MRQv59euOIDJooKKRf1O27D66Ev4uJ7TtqZJcBIcKeR10VDOblw0df3n8kXGeQUVwmvNXG2bENI8sjfX0t4tRFIKUGTxr/MTF5/+q1j8KvD5rwi7NnR5F3+5yWfunQ0eToWE5tLXENs7SMKCjrmxgLyx+gOTQ11RQVFXCnAAA0hZijNmJ63fWTp7M/pvN5PIQQm80W3gTN6U+hUMgGBjq4U8g7gUDwMTEJIbQ57Ne61wvLEdT//1zTNTQkkUhlRcUIoY8JSQgh/97BZDJsoyVHxBzciexzCgvK1lDdaipjYz2Rf3CZAQVlEboN6m/n4nz95Jlnd+7GPn2++dQRokxW160zZy/9dcTFy2PE9Ck8HvfvDVu5dTZ/IMNQH5/U1GyEZPmgbXs8Lhch1HVgiGdQ4H/XkkgaOtr17kk0lwKBQPgzHAvSRVVdbcOxvyNv/HvrzLmjW7fnfMoa96uIyQsHfluXHB3bY+jAdp4eHxOT75y7yONxhbeSyfCf3roKC8t4vCZuNwQkgY2z0x/nT98+d+HBlWubw36dt2W9V5egevepqarcHPYrl8MZMmmCkYXZw6vhSW9iiNYYIQTfKLQUBqOCzebgTgEAaCJxRm0fE5J2L1tlYGYyYdGv6lqaJ7fvrjsNCJrTn8Lj8QsKSnCnkHcCgYDH5SnQaPO3bKh7PbHobT0kEunbXKtvQzOoJssXcQZ3jfc54W3THHl5RURtRFbBO0MEPp9vZmszZ8OafmNCa6oq3754hRCiKChUllcI3w3Jb2IQQtNXr/Dr1dPSwQEh1MisWCqVihCqZJa32a8AQAsyt7VGCOVmZbt6e7n5erv5etu5Orv5dGr8UUbmZsRyum0VE7QMCpXac9jgzaePUhWoL+7dRwhRFRQQQlXl3yYjCwSCdzFxhuamExYt8OoSRIzcGvmkrNd4AgD4fL6Gttao2TPCNq5FCL289+j7rkJWalols9yvZ4/+v4z2DAqgK6s0cqDVO0gBAEBOiBy1KShQEUKVFd+a0+SYGITQ0CmTug3q38Hfj8vjNtIngVEbkHxkMtnE2opTW0sik4mhmaN7+3ZeHg3N0Cf8/9AMdrWROz8c3P1UnxMGd6AumKEswvFtO6rKK83tbBJfRyOEjC3MEEJm1lafUj7sXLJCRU2t57DBhmamKXHxJ3fssbC3fXX/IUKopKCgoSc0s7FGCF0/eTo3K1NBkTZm7uy2/YUAaJYOnf3MbG0+JiStmjDV0a19cUFBbmbW7+dOEX3uhvQZNeLNo8gT23fFv3ylqaMd8/RZG0YGTVSU93XrvIVeXYM4bDaXwzWyMEcIGZiaKigoxEU9O7x5G7O0dNH2rfqmxgVfcs/s2a+gqPAo/CZCqLSwsKHnrNd4wopAADyPuP8o/LpLJ6/sj2n/dTP+t6vQe+QwMpkc/TRKU0+HUVQS8+QpQqi0oBC5tPv+Cb8/SHH8WgAA0NZEjtpMbawTXr7+a91ma2dHSwc7Q1NThNDtsxcK875+TEisKGMghEqLitS1tL5/Qhi1AakwaMK4/avX7Vq8vFP3Lop0evKbmEETfwkK6dvIQ/z79Lx67MSrB48Kc/MsHe0/xie2YV6AjTiDu4mLfxXd52wADO6AEMxQFkGRRv/w9u21Yydqa2p+WTjPxcsTITRx8QJzO9vEV6/jX7wkk8jDpk3xCPR/Fx376NoNl05e3Qb2ZxSX5GV9FvmEbn4+vUYMY9eyn9y4zSiBs4SAlKEqKCzftzMwpC+jqOjx9ZtfMj619+7E/v/1lRpi59Juzoa1JpYWSa/fpLyNNzRp7DtzICHKGQwVNbWI8xef3opw7+w747cVxKlSk5Yt1tDSjrod8SUtAyE0a+1qK0eHR+E3Yp48GzZ1spWjw4f4hIYWYajXeLb57wSABOIzS8uunzid8T6lx9CBAyaM+76roGNgMGXlUiVl5Vtnzn1Oz5i6cqmapkZyTJzIp/v+IAUAAHkgctQWMmaUm593blb2i7v3uRyeT8/ufUaNKC0quv3PeaqCwpi5sxXpNOJk0+/BqA1IBZ8e3cI2rTWxtnr98PGbR5GaujqaurqNP0RJRXXp7u3OHh0+Z3yKi3quY2jQVmEBTuIM7n6qzwmDO1AXSeRM9fDwoEGDTvzscxUWJqan3/PzW9RC2URbn5JlpaJkp6r0w3ty2OwrR46LvIlCoTRU++jcqyfx1XQrYdfWXj0q+m8rEAgaWrE7ZOxoVXU1cZ7/WHb+b44Wpkq05sWUYn37zrh8eZey8o/fIW3jzp2wbt020mittd1HPou98n3mVMv6C8aJ9Dkt/cX9hz/7EoMn/kJTapm/p7S//7nc6oiIBSEhf7XS8//Q0NfvFtiaUsRY2r8wL4+YO/xTWvD/WqSczMxnd+6JvKmhZlmRpjhk8kRxnpzJ4V7MKTrq4dDsmJJu0aI/QkP7eHq64A7yzevXe83NOxsZefzsA69fnzhgwFGShPVEn5cwb+eX9jcSa9Ohl/cfZqelf389j8tFAkRREHEah4a2dp/Q4S2RtEFxz14Q2/WIz97VpWNAZ3HuGVXMNFOihZrpNzWdZPnjj2POzjb9+tVfwxqX5ORzdLqWre1P72d7//5yP7/tKioiFtAEkiYv7+mXL9e9vef87APj4g7r6jqamwe0Tq6WdyGn8HN1bYCuWH3gS4eO1l2yU6iR/qGdSzuPQP9mx2zM3YuXy4pFlJUbaeR9e3azsLMT58lv55d019cK0tVsiaRtJDb23dmz/+7YsQR3kG+YzOy4uCNdu24Q475tZ1Nqtimd5qCm/MN7cjicK4ePibxJS0+X2Ezvey37zm9Cn8HUysq/T7A490xkVlbz+PNtMXw21dYyHj1a3afPvrZ/afEV1bKXJGVOtxZrIP8lPeP5vQc/+xISOLjT1tcPHjZYnCf/VMVKraxe52TZ7JjYhIVtmjZtuKurPe4g37x4sd3Wtpe+vuvPPjA8fMKgQSIWzJHlJS84HM6t02dF3mRmY/0l45PIm6wc7Vu1oMxhsxtKpaOvV1JYJPKm7oMHiFlQA6ARedlfGnr76RoZFH8VvWxL39EjW+pzqGnv/26D+sP7/2eV5Bfg/b8WKf9zzs82y8pqamIWlAFoe7FPn7168Oj76xupgJhaWbZ2QflddOzdi5dF3tTQ4d9rxDAxC8oAANAa7l64xKqu+f76RvqHPYYOau2C8pNbdz6L+tawkUbe3NZGzIIyAFx2g/WK9r6dEl++EXlTy77zm9Bn8AwKFLOgDFpQbvZn2Rjc2Tg7iVlQBpJPlgvKyioqZ15K3KrzKmpqEpgKyAmfHl19enTFGADe/23GqWMHCfxTewYFSGAqAJpszoY1czaswZ2ivnG/hn2/hTcAAEiyIw8jcEcQYfOpo7gjAFmmpKKMvWMMfQZp4dO9q093nAN5kWBwJ+ck6zxTAAAAAAAAAAAAAAAAABILCsoAAAAAAAAAAAAAAAAAxAIFZQAAAAAAAAAAAAAAAABigYIyAAAAAAAAAAAAAAAAALFAQRkAAAAAAAAAAAAAAACAWKCgDAAAAAAAAAAAAAAAAEAsUFAGAAAAAAAAAAAAAAAAIBYoKAMAAAAAAAAAAAAAAAAQCxSUAQAAAAAAAAAAAAAAAIgFCsoAAAAAAAAAAAAAAAAAxELFHeCnqVIpVDIJdwrJpaVA4SMB7hSg7XAFfG1F6TuQW4lsv//5CBnTadD8NYSESLpwLIDmEyBlCnzd3iAFMlIkQTsEAPgxRTJJAZqLhtEpJJJAZnut8kyNAvWKbxTIZCXcGSQZT4B0aDB4aRAZCVQpFNwpQGOkb8ikoUAtruXgTiG5PlbWmNJpuFOAtmOqRE+vZOFOISlk+/1PRogr4JeyubiDSKgSNkcB6oCg2QyVFHNZbNwpJFcJm6tDV8SdAgAgBXQUFYvZ0Jw26CuLY6gks71WeaauQC2CegVCCKFCFltHUQF3CsllSFf8VMXiwxdLDShmc9WpUFCWaNI39nZQU6nl83GnkFDFtRxPTXUyzAWQM95aakW10F+Xi/e/h6ZaKRcKyqKVc7lu6qq4UwCpZ6lMV4FvJhpGIZH+j737Dmyq6vsAfrLbNN1770LLbgHZew/ZW5kyVOCRBwH3HqiAC1BBBUGQpYBssMrehQKFltJC6d5JZ5p53z/iizzYkbZJTu7N9/NXijfnfoPk9tzfPSNYioIyADQsVGon4OFyWic7viBEakc7BZhelKNUiwohIYQQhpBwB/wjr89Tbk6FePxQBxXDRDtJaaeA+rDvd3wPd6cqrTZBXkE7iDX6Nbd4RrA37RRgaTOCffbkFNFOQd9vucXTuf7v/7kQ373ZxWo9eqlPylGqLpdWTAjwpB0EWE/A443wdd+ZXUg7iDU6VlgaIZOGSDGBFQAaFiiVtHSSHikopR3EGu3KKRru6ybms+9mHBrUzc2pRqe7XGrr9YozJWUSAb+DiyPtIFZtRpD3bznFtFNYo+uKCoVK08vDhXYQqA8rV2x5Ozp05d3MBHmFu0TkJRHzuLtkqnF4FVpdqUpzolD+WdswP+7O94e6+NqJP2kTviQxmyTqWwAAIABJREFUfbCPi5tY7CgUEBv6UvAqtLoStTq+UPFp2zA/Cff//e/oHP3M1eRhPu7OQoGbRMwwNj1jg0dIkUpTptWdKyn/IS6KdhzgiN4eLlIhf+vDwlgXmY9ELBFwed6DMXQMyatR36+uaefsONrXnXYcAGCNKQFeB/KKjxSURjjY+9qJbf5qSlQ6Jl+lTlBUPhvk1dnViXYcMJc3Wgavupd1qbTcSyL2trOteoWekHylOl+ldhYK5oX60Y5j7XzsxJ+2DVuSmDbYx81NLLKxG/laMIRXqFIXqzSlas37rUJpx4EGsLKgTAh5pUXQwfzSCyVlelKZWU1zAdnq6hqJRCygNz1WyOM7CgVtXGTfxEY6Cdn6PxSayUci3hAXtT278HhBaYVWp7WZIiOf8J3FwtZO0rXtI11ENvHvXyLg736q1dbMgrMl5VqGKaG3OiHDkKrKapkjzYlIPnYSAY8X6+ywKa4FxRjAPZ1cnJyFwv25xUnlVXk1KopJNGqtXq+XUF22OEhqZ88XDPJ27eqG8gcANM5IXw8PsfiPIvmdiurMaiXFJKoaNY/PE1Nd0dXPzs5LIlwU7hclwzxujns5MvBIQem5krKbZVWZSpr/8i0sWGov4vN6ujsP8HKlnYUdfCTijXEttmcVHi8ordToNITajbxOp1fVqKVUVykJcrDnM6SLu9PCcH+KMcBILK6/jPBxG+HjRjsFefHF9198cUpMTATtIGDrHIWC+SG+tFOAhTwb5P1sEOX1PZTKmrFj/7PtyHd0YwCYSZRMuiwqiHYKsnfvH5mZef/5z7O0gwAANFFXd6eu7vQfR61bt93Ly23ChCG0g4CtGOrtNtSbfr0CrJ9MKJgXSv9G/u7dB2vW7Pnuu3doBwHWwLJNAAAAAAAAAAAAAGAUFJQBAAAAAAAAAAAAwCgoKDeXj48Hj2fzG0wAgE2KiKC/IAAAt9nZiR2prlQOAMANDg72EgnN9egBAKwZn8/z9fWknQLYBAXl5iouluv1trIBGgDAIzweLysrj3YKAI5TqzVVVTa0mQ8AgJkolTUajZZ2CgAAK6XX64uKSmmnADZBQbm5tFodRigDgA1iGEal0tBOAcBxej2j1zO0UwAAsJ5ez2AYEABA3XharY52BmATFJSby95eQggKygBgixwdHWhHAOA4Pp8nFApopwAAYD2BgI/LKQBAPeztJbQjAJugoNxcSqWKEAwdAgBbVFFRRTsCAMfp9QxGiwAANJ9Op8flFACgHkqlinYEYBMUlAEAAAAAAAAAAADAKCgoN5dUaoc1lAHANrm6OtGOAMBxAgFfIhHRTgEAwHoikUAoFNJOAQBgpXg8noODPe0UwCYoKDdXdXUNw2DJCwCwRXJ5Oe0IAByn0+mx+yUAQPNpNDqtVks7BQCAlWIYpqpKSTsFsAkKys3l7++FEcoAYIN4PBIdHUY7BQDH2dmJnZyw+yUAQHM5OjrY29vRTgEAYKV4PF5AgDftFMAmmPXTXDk5hRihDAA2iGFIcvJ92ikAOK6mRl1ejt0vAQCaq6Kiyt5eQjsFAICVYhgmO7uAdgpgE4xQBgAAAAAAAAAAAACjoKDcXJg5BQA2CzPxAcxNIOALhQLaKQAAWE8oxKZ8AAD1kUpR3YJGQEG5uZTKGtoRAADowEx8AHPT6fRarY52CgAA1tNqsSkfAEB9qqtR3YJGQEG5udzdnQnBpnwAYIsCA31oRwDgOJFIiNEiAADNJ5XaicUi2ikAAKwUj0c8PFxopwA2QUG5uUpKygjBpnwAYIuysvJpRwDgOI1Gi9EiAADNV11do1ZraKcAALBSDEOKixW0UwCboKAMAAAAAAAAAAAAAEZBQbm5/Pw8+Xz8NQKAzeHxSHR0KO0UABxnZyd2dpbRTgEAwHqOjlLspg4AUI+AAG/aEYBNUAltrtzcIr1eTzsFAIClMQxJTn5AOwUAx9XUqMvKKmmnAABgvYqKauymDgBQj+zsAtoRgE1QUAYAAAAAAAAAAAAAo6Cg3FwuLo48Ho92CviHr68H7QgAtgJfNyqcnBz4fPzesRUikdDeXkI7BWc5ONgLhQLaKQDAEuzsxCKRkHYKduPzeViFCYDD3NycaEfgFFdXjv99oqDcXApFBcMwtFPAP/LyimlHALAV+LpRUV5epdfj946t0Gi0SqWKdgrOqqpSarU62ikAwBJqatQajZZ2CnbT6xmswgTAYaWl5bQjcIpczvG/TxSUm8vRUUo7AgAAHZ6errQjAHCcSCSUSES0UwAAsJ5EghHKAAB14vF4Tk6YggCNgIJyc1VUVNOOAABAR1GRnHYEAI7TaLQqlYZ2CgAA1lOpMEIZAKBODMOUl2MKAjQCCsoAAAAAAAAAAAAAYBQUlJvL19cDmyMBgE1iWrYMpZ0BgOPs7CTYAQkAoPlkMqm9vR3tFAAAVorH4/n7e9FOAWyCgnJz5eUVY3MkALBJvJSUB7QzAHBcTY0KOyABADRfZWW1UllDOwUAgJViGCYnp5B2CmATFJSby9HRgRAUlAHAFrm7O9OOAMBxAoFALMYuUgAAzSUSiYRCAe0UAABWisfjOTk50E4BbIKCcnNVVFQRgiUvAMAWlZSU0Y4AwHE6nU6txi5SAADNpdFotFod7RQAAFaKYZjy8iraKYBNUFAGAAAAAAAAAAAAAKOgoNxcmBQAADbLy8uNdgQAjhOJBBKJiHYKAADWk0jEIhEupwAAtePzedgIGhoFBeXmwqQAALBZhYWltCMAcJxGo1OpNLRTAACwnkql1mhwOQUAqJ1ez2AjaGgUFJSby8PDlXYEAAA6QkL8aEcA4DiJRCST2dNOAQDAelKpnUQipp0CAMB6YfopNAoKys1VXCynHQEAgI6MjFzaEQA4TqXSVFYqaacAAGC96uoalUpNOwUAgPXC9FNoFBSUmwtrKAOAzcJDbABzwxrKAAAmgTWUAQDqgTWUobFQUG4uHo/H4/FopwAAoIDPxy8RAPNiGHzRAAAAAMC89HoGfU5oFPxzaa6yskqGYWinAACgID+/mHYEAI7TanVKpYp2CgAA1sOmfAAA9ZPLy2lHADYR0g7AVqNHv5iVVWB4ffHiDUIIwzDXrv1KOxcAgNnFxY1nGMYwOSM2dpzhRULCHtq5ALhjxYrVJ06cN3y5GIbZsmU/wzAxMRE///wJ7WgAAGzy7LMrbt9OM1xOeTzeypXfMwzTu3enzz9/hXY0AACrEBs7zjA2mWGYjh0nEEJ8fDwOHvyGdi6wdhih3EQjRvTh8/m8x4SFBdIOBQBgCV27tnt0ATS86N27I+1QAJwya9aYgACfx79ljo7SOXPG0s4FAMAyc+aMk8mkj1Yp5PF4np5uc+eOp50LAMBatGoVYXhhuE7q9fqBA7vRDgUsgIJyE40aNSA01P/xPxkxoje9OAAAljNx4lAXF6dHPzo5OcyYMZpqIgCuadkyLDY25vE/iYgI7tv3KXqJAABYqU+fzpGRwY//SVxcTExMBL1EAADW5emn+0kk4kc/Bgf7T548jGoiYAcUlJvI09O1f/8uj34MDvafMGEI1UQAABbSq1fHx+/NYmNj2rVrSTURAAdNmjTE19fT8NrBwX7atJG0EwEAsNIzz4x0cLAzvHZ3d5k27WnaiQAArMjw4b1CQv4ZLtm3bydvb3eqiYAdUFBuunHjBoWGBhBChELByJG9HRzsaScCALCQqVOHGwYpe3i4zJw5hnYcAA6Kjg6PjY02vI6ICOrXD8OTAQCaom/fpx49CI+Li2nVKpx2IgAAKyKV2g8f3ksoFBBCAgN9Jk3C8GQwCgrKTefh4TpwYFdCSEiI/4QJg2nHAQCwnJ494wz3Zu3bt2zdOpJ2HABumjJluK+vp4OD/bRpI2hnAQBgsWnTRkqldu7uLs88g+HJAABPGjt2UHCwPyGkd28MTwZjCWkHsBClXp9VrarQak3bbOSQXt5JabG9O6VodERRYcKW+YQXJJW4i0UmbBMAbFOFVvegSqlhGNM22/WZEemEdJo8PMGkVz9CiB1fECKVOAgFpm0WwKxya1R5SjXhmbRRH8/QAV3z84ucY1uZ9ovGMMRTIva3Fwt5pk0MANAsWobJUaqKVBrTXpycOsT49ohzdXVW+nmZuN/CEF97sZ+dxJRtAgDUrUStyVKqdKa+ues0eWjZHxfajhlg8ps7mUAY6CCR8jGelWtsoqD808P8U8VlrmKzFCdazZ1UTMjOrELTNusqEaWUVwdLJYsiAjxRVgaAJtExzKepWbfKqmKcpeVqEz9RI07O7V+YdomQS6a+ANoL+HcrlZ3dHP8bEWjalgHM4ZqiYkd2UalaEyqTlqnUpm3crl+3EDN0MyQCQZFKYyfgD/V2HeztZtrGAQCa5kSB/HBBiVKn9xKLavR60zYePGUkMcPl1FkiflBZ7SYSTQ70jHVxNG3jAACPK1FrvkzLeVhd09JJKldpTNx6RGjbiNCj5UpSrjRtwwyPV1yj7uXpMivYx7QtA13cLyh/nZ6t1DHzQn1pB2m0vh4uco32rdsZb7YM8rPHQ28AaJxKje61O/e7ubv09XShnaXRhhKSVF69POn+ytahfBOP+QQwpRtlld9n5E8J8BSxcNgFQ8i+3GItwwz3wdxGAKDsaKE8vrB0YoAX+37rezhr9PrvM/Lmh/LaOctopwEAbsqrUb2X/HCEr/swH1YOBThXUrYmLQsDhriEfTc/jbI9q7Bap+/l4Uw7SBO5ioRTAj1fuplOOwgAsM9bKRm9PVxCpGx9HNXaSdpSZv9xSibtIAB1ylGqvkrPmR7kzcZqMiGER8gYP4+TRYqLpeW0swCATbsirzieXzLO35N91WRCCCEiPn96kM/X6TnZShXtLADATUtvpk8K8GLvsqjd3Z11DNn8MJ92EDAZVt7/GEnHMMcKSnt6sG9o3uPEfH6si+xQfgntIADAJjfKKsU8nj/LJzfEODkUqDWZ1TW0gwDUbm9ucVc3J9opmquHh/P+vGLaKQDApu3LLe7F8rs2Qkg3d+c9OUW0UwAABx0rKG3t7GAnYHcFr6e78x+FcpOv/gy0sPufY/2yqmtcxUIOfEIPifAh6ikA0BhZSpW7mAuLGrmJ+FnVGOwDVipLqfJi7TiRRzzEojylGl17AKCFISRXqfKUsP5y6ikRZitx1wYAppepVLF3bPLjvMTijCpcJzmCA+XWOlXp9CydM/UEO4GgyNSb/AAAt8nVWgE75+A/QcgTlGlNvZ0ggIkUqTVsHypCCBHweDV6fY1ORzsIANgolU5XrdcLeKy/dbPnC4pU6LQAgOmVqDVi9l8kCSE8PqlEn5MrWH8XBAAAAAAAAAAAAACWgYIyAAAAAAAAAAAAABgFBWUAAAAAAAAAAAAAMAoKygAAAAAAAAAAAABgFBSUAQAAAAAAAAAAAMAoKCgDAAAAAAAAAAAAgFFQUAYAAAAAAAAAAAAAo6CgDAAAAAAAAAAAAABGQUEZAAAAAAAAAAAAAIyCgjIAAAAAAAAAAAAAGAUFZQAAAAAAAAAAAAAwCgrKlqZRqyvLymmnAACgoFyh0Gg0tFMAcBnDMPKiYtopAABYD5dTAID6lRYV0Y4ANKGgbFGX/vzrxRFjb12+QjsIAICl/bLu20Ujx1bhiRqA2eQ9zFo6fupvP26mHQQAgN1wOQUAqN97CxZ+vOi/tFMATSgoNw7DMJn30pr89ryHWdUVFSZNBABgIZXlFcX5+U1++4OUVJ1WZ9JEAByUkXqvye8tV8gLc3NNGgcAgK1wOQUAqEdOxkO1StXkt6fduk0IY9JEwDJC2gGsyJ2E65s+W1OUly+VOUS2bjV14fPegQGEkPQ7yTvWffsg5a5ILPYJDHiQcvfVrz9v0a6torhkx/pvr529wOiZoIiwMXNmtO7UkRCy+9uNR3ftGTpl0pnDxyrLFCFRUdOXLg6OjPxr/4E9G34ghKx76711b703ft6c0bOm0/7QAACEEHJ05+5DP++oLCtz9nCP7dF9/Pw5UgcHQsi5oyd+37qtIDvHxd3Nzs6uurLqo60/ypyd0u8k7/xm471bSQ6Osqg2bSY9P9c7MECr1c7s2b/roP7lpWXpd+5IHRx6jhg6ds5MgUCw/p3371xNIIQsHDmWEPLmt1+3aNeW9ocGsLT177x/48IltUrt6eMzYPzoQePHEkLkRcW/rPvm1qUrGrU6pEWL1Bs3Jy6YO+LZqYSQ88dO7N+yrTAr29XLM65Xj7FzZtg7yJKuXF25eOmIZ6feuHCpIDvbw9tn7NxZXfr3LcjKfn/BIkLIX/sO/LXvQEzHuNe+XkP7EwMAmF65XL5m2WvZDx7weIKA0OBx82Yb7sLqupxqNZq9m7acPXS0vKzMO8B/wJhR/cY8zefz67prw+UUANiu1j6nVqPZ9+NPZ44cL5PL/UOCS4uKwqJbLFv9aV19znpu7haNHKfX6/MeZj3TtTefz99y7i/anxgowAjlfzi5unj5+fYf83RYdMuE02fXLH/NsCjMykVL024n9xg6OCauw71bt929vaPatqksr3hv/sKLf/zZvttTPYYMLM7LX7l46bWz5w1NqZQ18b/t6z1iyKAJ4+7fSV7z8msqpTI8JrpN546EkC4D+09cMC8mrgPtTwwA8Dd7B4cWHdr1GztK5uR0fPevu775jhBy7cy5b979oKayavCEsS5ubtkPMlo/1clQTf5gwaKs9PRew4e06dzpxsWLb85ZUFJQYGjqwvF4kUQ0bu5sd2+v/Zu2xP+2jxDSZUB/7wB/QsioGc9MXDDPw9eH9icGoEDm5Ny5b+8+I4ZVV1VuWf3lxfi/CCGrl716/tgf0bEdeg4dkn4nmWGYpwb0NTzmWf/OB4SQfmNGBYaHHfll18rFL+v1ekNTh7ftaNGu9djnZimrqta/9V5G6j2Zi3O/MU8TQsJjoicumNf36eG0Py4AgFnInJ3tHaQ9hw7p2KtHxr20VUtXlBQW1nM5/fqNd/dv2uIXEtRv1EixWLx51ec71n1naKrWuzZcTgGA7Wrtc+78ZsO+zVtdPdwHTxhbqSirkCu69O/fYJ+z1pu7p2c8w+PxnFycJy6YN3HBXNofF+jACOV/BISFLlvzKSGkurLi69ffuXX5alFeXvrtZGV11dApE6ctfpEQkpc5J/NeWmVZ+R+/7i3MzZ2w4Lnew4cRQnoMG/z2nAWHt++M7dHN0NqMpUu6DOhLCCkrKT196MiDu6kt27dr0b7drctX43p27zqwP+2PCwDwj94jhvUeMUyv11fIFcsmP5t4/hIhJPHCRULIjJdfiu3ZvaqiYv6gEel3kgkhO9d/p9Folr33tl9IMCEkOCri5y/WnjxweNTMZwkhfsHBL69aSQjp1KfnS2MmXT19dtCEcbE9uh3duacgO2fg+LEuHu60Py4AHdP/u9iwPW/H3j0/XvzfG+cvRrdvl3E3NaRF1OIP3yWE8AWCY7v2PEi56+jssmfDDy4e7i9/9rFILCaECEXCS/EnU67fMDTVd/TImS//lxDi5Oq64f2Pr589P2b2jO6DB/659/egqIinZ0yj/VkBAMyFz+ev+HI1IURZVe0d6L9nww93ria0feqpWi+nxfkFCafPtOv21NxXVxBC9Iz+vfkLT+z+dfy82YbWar1rw+UUAFjt333OLv37Jp6/SAhZ/uUqqYNDZJvWX7zyxv07yZ369KqrzxnVrk1dN3cDx4/Z+vlXDs5OuEjaMhSU/5F5L233hh+SriRo/n8dmbJSuWHVi1uXr6bdvlOhKCvIzpbY29nLHFISbxFCdn/7/e5vv3/UQllp6aPXQtHff7eefr6EEEVxicU/kI3y8/OkHQGAfa6eOrP/p60PU9P0Oh0hRK1WE0J8g4IIIeeP/+Hp63Pz4iVCiJunB8MwqTdvEUI+WrTk8RYeXQCFYpHhhYePD4/Hww7pJufsLBMIBLRTQKPpdLoj23f+sXd/cd7fa5GXlZY6ODk6uTjnZWZeO3ve2c016WoCIcTV0zPzXlpNtbKmWrlk3OTHGymXy2XOToQQofDvL5qXny8hRF6ML1pTyGRSkQidYQCWqVCU7dn4w4UTfz7anEZRLK/rcnr76jVCyI3zlwyLbj2irKwyvMBdm/H4fJ6zsyPtFADQgFr7nIQQv+CgvIeZf+3d37bLUxf/+LPBPqfhBW7umszV1YkQHu0UZoQ+9N9USuVHi5ZoNZqxs2f6BgfG/7bv1uWrOq02olXM5Bfm71j/3TvPPU8IEYrFs1csFQqFOp2WEDJ96X+8/f0fNWK4x3sCj8cz7OZn2Q9ku3Jzi2hHAGCZ1Ju3vnjlDe9A/5kvL3Fydflp1ReGPsfAcaOTr12/+Mefhg6Hi4f75BfnMwyj0+pEEslLH7//eCOGx29P4PF42KvB5BSKikdz0IBFDv68ffe337fuFDdx/nM6nfa791dqtTqhSLTow3c/WbJ8zbJXDYf1HzsqsnWrOwnXDbOtxz436/FGwlvFZNy9+/ifoJvRHJWV1RqNlnYKAGicdW+9m3QlYcC4Ua06xqXeTDryyy6dTlvX5fTmxcuEkL6jRnTs3eufJng8Z3e3J5rF5bRBOh1TXo4d5gGsXa19TkP9KvvBg1/WfffLuu8M/cwB40dnpKTW1ef8d8u4uWsUubyc239dKCj/rTA3r7KsvN+okSOnTzXsQ/WoP3Hz0mUHJ8dZy5faS6UhUZGGzkdQeFjK9UR5UbFhdXNCiLKq0t5BVv9ZhEIhIaSyrNwinwkAwCiGwTvjnpvdbdAAnU73wyerDFe/qvKK+8kpodEtxsyeaWdvHxbd0k5qTwjxDwvNTr/P4/PbPtXJ8EBOIBIJhUKttr66jGEYYGVFOZa8aCaGYQw3vcAuSZevEkLmv/maq6dHVvp9Qoihi5ly/YZWrX7mpYVefn6+QUG+wYGEkICwED6fn5eV/ajjUV1ZIZU1MC5MKBIZvrmW+UQcgG8TAOswDHP76jWfoADDsj/y4pJHd221Xk6DIsIIITkZD2ct78Tn83E5bR5cMwFYoK4+Z97DzOK8/F4jhsX17O7i4R4W3ZLH49XT56z/5k4gElWWV6ArVS+O/+WgoPw3Z3c3Pp9/5fQZF093RVHJ1VOnCSGlBYXKquo7Cde9A/3t7O0Ij1dcUGAvcxBLJIMnjjt9+OiBLdsy7qb6Bgam30n2DQ5a8NZr9Z8lMDyMELL/p605GQ9EYolhXWYAALoM2+Ud3r6zMDcv9cbNCrnCsCVpxt17iuKS8JgYgUCg02mLcnMDwsN4PN7omc+uffPdz5e92rl/H7GdXdLlq6NnTe89Ylj9ZwkID7tx4dK3734UFtMypEVkv1FPW+rzcQ36bSzlExiQfC3xp9VfBkdFXDwRTwgxbGV57ex5Ho/n6ukhEArL5KUOzo5OLi5Orq59R4+M/23/azPmxPXqoa6puXb2wvs/fFvrVIBHvAMCRCLRtTNnN370aVlpqWHBO6gHvk0ArMPj8bwC/Aqycn7+cq1ILPpz3wFCSGlhYV2X0w7duwVGhKfeuPXGzLkt27UtLijIeZDxyS9bDAN96oLLaa0YhuH29G0Abqirz3nzwmWdVufu7SUUiWqqqovz8j39fJvW5ySEBIaF3k9OWbP8NQdHx4Hjx4THRFvq87EG52e8oKD8NycXl+deX7Hvx58O/vxLUHj43NdX/LL2m6Sr17oNHth/7KjTh46uWvrK/x/p/Po3X/uHBL/xzVe7vtmQevNW6s1bnj4+wVGRDZ6lXbcugyeOP3Pk2KnfD3fs26vB4wEALKDLgH4PUu6eO3bi8LYdLTu0m7b4xd0bvk+6fLVTn54t2rW9fu58wukzhiPDolu+sf7LLgP68fi8A1u2X4r/S2Jv7xcS7OLh0eBZRkybkn3//p2ExPysrICwMPN/LADrMn7ec+Vyxe0rCem373Tu16dFu7bnjp/IzcgcNGHMtq/Wf/36O4bDhCLhvNdf6TZ44LNLFrt6eJw6ePjUgUOOLi7RHdrp9Q10S2VOjrNfWbbnu+/PHD7q6o6pAADATS+88+ZPq774c9/v7t7e4+fOOX3oSEriDZ1OV9fl9NWv1+xY913i2XN/7T/g4unRvmsXtUpVf0EZl1MAYK+6+pxdBw9IOHN27w+bHx05cNyYGS+/1IQ+JyFk1rL/bvz405sXL9k7OAyeMM7MnwmsEQrK/+g1bEivYUP++XH4UMOLEdOmTJg/V+bkyDDMn3t/3/TZmsvxf42ZMzMkKnL555/9u50JC+ZOWDD30Y+jZ00fPWv6ox+fXbLo2SWLzPxRAAAagc/nT1v84uNzJoZOmWh4MfPll/xCQwQCgVar/e79jy4cj09PTonu0P6pfn2f6tf3iXaEQuHPF049/idbzv316LWji/Oy1Z+a+aPYBIYhGFLJRs5urks++fDxP5n9ysuEEHdvz/CYVn4hQYSQnIyMFVNmnDxwqNvggUKh8IkuhEHrTh0f/6K1aNf28R97Dhvcc9hg838ajsC3CYCNwmOi3/vxu0c/Dhw/xvCic98+tV5OnVxc5r2+4t/t1H/Xhsvpv+GaCcAKdfU5CSELP3gntEUUIURZVf36jDnx+35/Zsmiuvqc9d/chUa3+GjLD+b8HKzH+ZlwKCg3oEJR9vHi/1ZVVLRs304kFhlWGg2MCKedC+rE5xPD+mgA0Expt+988MJ/3Dw8wmJa1iiVSZevSOztDOtjAEU8HuHxcJXjCIZh1r/zYeL589Ed2rt4uKfdvoNuhiXxeITbHX0A24HLqUUwuM8CYK8jO3Zv+3JtaHQL/5Dgwty8wpzc4MgIgUBAOxdn8Xg8Pp/L/UwUlBtgJ7XvN+bp88f+uHnhklAsCYoMHzRhTMfePWnngjpptTqs7QVgEh7e3v1HP51w5uyVk6ekjo6xPXqMfHaqm6cn7Vy2TqfToQLGGTwer/uQAcqqyrTbyTqdzsvfb/IL84dQywiHAAAgAElEQVRMnkA7l63Q6fQoKANwAy6nFqDXM7hkArBXy/Ztn+rf5861xKy0dGd3j0Hjx46a+SztUFym0+m43c9EQbkBIrF4xLQpI6ZNoR0EjIWpWACm4uLhjlV6rBKP210TW9O5b5/OffvQTmGjOD8VEcCm4HJqEbhmArBVaMsWiz54l3YKG8L5jUwxYwW4BjeHAMBtDOc3DAawFPQZAACMhx4IAIDxOD/YEQVl4Bq9nuH2OjUAYONQAgMAAADLw30WAIDx9Ho9t9ed5/JnA9sUHOzL7WkFAGDj3N1duN01AbAYR0epSCSinQIAgB0EAr6LiyPtFAAA7ODp6crth3C4IwWuuX8/WyjERqUAwFkFBcUoKAOYhFxejhncAABGUqnUlZVK2ikAANghP7+E2/NKcUcKnKLT6bn9CAgAQKvVCYX49Q1gAlqtHt8mAAAj6XR6gQDXTAAAo+h0OoGAy4Md8fsAOIXz31gAAL2ewQhlAJNAtwEAwHicXw8UAMCEON/PxO8D4BTOf2MBAHChAzAVFEcAAIyn1eqwtCAAgJE4f81EHxo4BfOwAIDztFott7smABbD+Y4+AIAJ4ZoJAGA8zpenuPzZwAZptdro6DDaKQAAzCg42A+3cwAm4enpKhKJaKcAAGAHkUjg5uZCOwUAADuEhvpze14pxwvKrmIh7QgmwCc8NzHudoyiUmkyM/NopwCgz0EokHBiS1k7gUDC6V/DTZCS8gAlMCvhbyfmxPeMBNpLGNoZqMjMzMPjGQDq9IQESSW0U5gAj5AAezHtFGZUUVFVXV1DOwWALXIRCoUCLnQ6XUQ2tBtycnK6mNOlPC7/rwyS2qWUV9NOYQIFKrWzkAuVcQtQq9Xc/sYCGMlDLMpTqWinMIHs6hofOy7fmzWBRqPBhc5KCHi8ghoN7RTNVabRlqq1Upt8cqNW49sEQJ9UIFCotQqNlnaQ5ipQafjceMxYB7VaK5HgmglAgZtYVKBkfZ+TEHK3ojpQakc7hYWoVBzvZ3K5oOwoFMQ4ORSrWd81Uer07VxktFOwg1qt5fY3FsBIrZwcdHraIZqNIUTI57WU2dMOYl043zVhkU6ujnI16zv3BSpNN3dn2inoQEEZwEp0c3fOr1HTTtFcpWpNZzcn2inMSK1WY44UABWtnKQ1etbf3ZWoNFGOUheRrQyX1Gg4Xp7ickGZEPJ8mN/BvGKNnsXzOI8VlIY62LVzRkHZKCqVWiLBYEYA4i4WDvV2+y2niHaQZtmVXTQz2EfA6cE+TcD5rgmLDPdxL9fqEhQVtIM0XXa16nxJ2ZwQH9pB6EBBGcBKzAr2uSqvyFKyeHLVVXmFQqMd6eNOO4gZqdVaMSeWlARgnVZODlGO0qMFpbSDNJ2OYQ7klywI9aMdxEJUKu7Pnuf47wN3sejDVqHPX7/X1d3JRSS0t/oNFpXKGnt7O0IIj/AKalRVOn20o8PEAE/auVhDo9E4OEhppwCwCr08XeyFgh1ZhQFSiY+d+PGibGVltUxmvd8UHWHylZqb5ZWvtwiOwvDk/6VWaxwdHWingH+83jL4neSMS6XlQj7fQae1s2fHJD4+IXKNVq1n7lUqv24fSTsONTwewfRtACvxqo/rJwXyCJlUzOO5SUR6hh1Dgvg8XpFKo9XrSzXad6NDaMcxL51OZ8+SX3MA3DPR33MPQ47kFLnbS7ztxOy4RBJCCKnR6RUa7dmSsm86RHlyvcb6iEajdXLi+F0bxwvKhpryrqdi9uUWZ1TXpFXWPi/1zp30mJhwy+TJeJBTUVkVEuzvWNu/LZ1Wl56UHhUV4ikRuYqFPZ1kHbDYRWMolSrsrgPwSCdXR1878eliRWGNulD19wUwIyPH399LpLNQJ+T69eSQEH9X19pngKamZoSFBT7xtfWyEwXaS+aG+jrh6/wvSqUKI7atzTvRIaeKFSdTMxNLFF5hgZY5aVLSPS8vNy+v2ofCFRaUCIQCd3eXut5uJ+C7iISRDnaLwv3NGdPalZVVYmITAHUMwyxfvnrx4mlftYs8VlB6r1KZWlGtNNvSXVVVyrS0zHbtWtR1QErK/aioUD7fqF+3riKho1DQ2sWxtwf31w6qqqrGkhcAFPllZl9PSnce0iOvRl2kMvuqaw30NgtL+Xyeh4drg+14ikUhMvuXIgLMkNF6KZU1hHD8to37BWWD0X4edf2ntWu3L3/2aWdLrSnx09UbX331s9rTtVOn1tOnj46MDH7igLMV5Z5C0sLGvmymUlWllNrMEu8Axgiwl0wN9H70o1xefure/dHtoyxz9rt3H1zbtj9drx8xoveiRc/UckSr0LVrt8+bN4HzE4JMpbpaKZVi1LbV6e3hkrr3xNtzJ1jmdBqNduo7X2UoKkYumvb0031rOaJV6Nq120eO7BMcbCvzCpuAYRilUoUvFABdlZXVqakZEycODgz0JYQM9nYb7G3E25phw4bd57/f7TZp6NKls2o94Ky8NPliwlxLXdJZpKqqxsEBt1oA1KSkPPhw3njLnEsuL5v79uGcauXiT5a2aVPbzWOr0PfeWz9//iRvby4v9dNkVVVKBweOdzKtfQkIC1i4cKrFqsmEkKAgPzs7cXGx4tCh08uWfbZq1abCwpLHD+jRIzYw0Of06as17N+YwvJQagGoR3z8RbFYNHp0f4udkc/nCQT8khLF9u2HFy58Py+v8N/HLFw4VafTWSwS21VVVXO+a8I6GzfuJoRYsvSQn1+s1erk8rKvv/754MG/aj1m4cKpqCbXzxY6+gBW7tNPfygrq4iNjenUqY3FTnr+/HW9njl+/Py5c9dqPaBHjzhUk2tVVVWN1QUBLK+0tGz16s2EkNmzx1rspOnpWeXllYWFpe+8s1YuL6/1mLfeegHV5LrYQj/TpgvKX3659Y8/Llj4pJ6erk5OMkIIn8/Pzi7YtevowoUfbNr02+PHSKX23bq1Hz36RYYla4dZj+rqGoxQBqjV6tWbwsMDLfxbjcfj8/kCw/rmFy7cWLz4oxMnzv/7MJ1OP3Lk85YMxl5VVTWc75qwy9tvr+3Zs6OFT1pYWFpdrTTMOfjyy2379sXXelha2sPVqzdZOBuLVFUprXk1eQBuUyprLlxIbN060t/fzAOS/9ft22kFBcWEkJISxdq12ysqKms9rKiodO/ePywZjBVwqwVAxWef/WjJUrLB7dtpJSUKQkhGRu7ixR/WddjJk1dOnbpi2WjsYAuDHW23oJyamjF4cPcBA7pa+Lze3h5i8T+r9en1+vT0rL17n7wVFAqFR49uVKnUqakZFk7IatXVWPICoHaDBnUPCbH0Yql8Pk+v/3sNRB6P9+BBzqpVmz777PsnDpPJpPv2rbtx466F47GRLXRN2OLKlVuEkGXLZrdsGWrhUxcXlz4qgsjlZV9+ufX332sZpxwREdynT+cLFxItHI8tqqqq8W0CoGLz5n05OYVxca2GDetl4VNfuJBYWFhqeJ2Wlvnxx0/2SQw8Pd3y84vj4y9aNp21s4UBdwBW5aef9hFCPv54SV0b0pjP9evJhhc8Hu/OnfTly1fVelifPp2uX0/OySmwbDoWsIU1gmy0oCyXl4vFopYtwyx/ak9P18e3f7G3lyQk7Pn993W1HmxnJ7l48cbly7csGJDteG5ude5BBGCbFi/+iBBS+9JXZiYQ8B/f04ZhmKoq5bJlz9V6ZGio/6+/nrBsQPZRqdQBARYdzAW1Skq6d/DgKcPjEMuf/eHDXLVa++jHioqqzz/fXOuRcXGtunZtb8FobKJUqkJDbXpPQgDL02p1WVl5hDAREUFUtk84ffrqo9cMw5w/f62uSR7PPz+5f/8uFozGAm5uTo6OtWwsDwDmsHHj7vBwC+32/ISaGnVubhHv/7cC5/F4Z84kfP31tloPfuml6Raea8IKarXaz8+LdgrzssWCcmpqxooVqy0/Uu8Rd3dnvV7PMIyzs+yZZ0bWf/D06aNyc2tZdRRqlZNTYG8voZ0CwIocPXpmyZLpVCP8s3RPQsKes2d/rus4JydZXFzM0qWfWioYKxUWlj7q2wFFDMO8++5CWmfPysp//Edvb4+AAJ96jh87dnF5ee3Tum1ZSYlCo9EacSAAmMbJk5eTk9M9Pd1nzhxDJUBSUlphYcnjv0bLyio3bNhV1/H37j3E9KnH3biR6uLiSDsFAPclJNwmhEyYMLhHjzgqAdLSHlZVVT/6kWEYtVpT63w4g5Mnrxw7ds5S6dihqKiUz+d4xZXjH+/f9Hp9VZVyw4Z3KWbw8/MSCgUJCXvi4zc9fJh3/vz1+o837KC1ffshSwVkMbm83PKTQQCsWY8ecaGhAbTOHhzsr9PpHBzsIyND5s+fuG3bwfqPDwnxX716uaXSsZJCUYF7OboWLHiH1pD/R7Ky8vR6vbOzo6en67x5Ew8d+mbr1k/qOX7jxvd++QW9iCehzwBgSUlJ906dutKmTZSdndiIw82ideuIoiK5vb2du7uLVGp39eru69d/O3z4u7qOj4wM/vzznzIyciwb00pVVlbb2YmFQiHtIAAcd/z4OcN6ZS4u1HoprVtHFhQUCwR8d3cXiUSUkLAnIWHPiRO1rxFkWPji2LGzKSn3LRvTqsnl5RT/D1qGzRWUk5LuRUeH083w5pvPX77895Pwd9558csvtxrzroiIoD17jps3GfspFOWurs60UwBYhRkzXn34MJf6llNeXu6nTm355ZfPZs0ae+5cgjFvSU3N2LJlv/mjsZJcXoarHC16vX7//j9femkG7SBk8+aPr137NT7+x82bP87IyG7weHd3l/nzJ1kkGpsoFOV4PANgAWVlldnZ+Y6ODm+//SLtLCQhYc+ZM1uPHdvYtq1RzwXXrXtTp9ObPxcLyOVleAgHYAFisWjhwmm0U5CrV/dcurTz2LGNfft2yclpeNL8mjUrqCwqa7VsYRiQbRWUP/zwu4KCEopPxf9NLBbNnj123brtDR7ZuXObrl3b1dSoLJKLreRy3BwCEELIzZup69e/FRzsRzsI2bTpI8MLoVDQsmX4b781vEpyVFRI167t9+//0/zp2EehqMDtHBU5OQUZGbnDh/ey/BZ89fD2ds/JKXz4MNeYg9et27537x/mD8UacjkeQgOYXUmJYvr0Fd7eHtbQJ3lcYaG8sLCkwcMcHOwlEpFcXm6RUFZNoajANRPAfNRqzcKFHxBC+vTpTDvL/xCLhXfvPjDmyKSkeydOnDd/InawhZlwNlRQLiurnDdvwsCB3WgHedLgwT0SE1Pu3XvY4JH+/t4SiXjAgNkqldoi0djHz8/TzQ0dHbB1iYkpQUG+VrgN99ixA4ysZ0VGBo8c2UerxfKmT5JK7bH1qOVVVyuXL18VFhZghVN9u3XrcOuWUUt8vvjiVJlMWlRUav5Q7CAQ8L283GinAOCy4mJ5QUHJ/v3rRCKru3hGRYUoFEaViQMCfMaP/w/GKVdUVEdE0NkfDMAWbN68j+L+HPWIjg7PzS0w5sjWrSMvX77Z4JquNkImk3K+NmUrBeWaGlVWVp6np5XeNixZMv3nnw8YcySPxzt6dOPZswmoKf9bfn5xcbGCdgoAyj76aENhYYl1DtUPCPDx9fVMTEwx5mA+n//HHxdWrqxzrS7blJCQhBKY5RUUlGzb9hntFLULDw9MTDR2z6iBA7tZbV/I8m7fTsPjGQDzefPNrwghMTGUFxusS2VlVXW1sVM/t2xZaeSjOw7LysoTiUS0UwBw0MWLNwgh8+ZNcHe3xm6Jm5vzgwfGLiX/+usLunXrYOZE7HDlyi1vb3faKczLVgrK8+e/4+joQDtFnWJiIioqqq5du2PMwUKhoH//ruXllX/+ecn80dgkP7/Ix8eTdgoAmiorqxcvfmbQoO60g9SpR4/YP/+8aOTBQ4b0nDJlWFLSPTOHYg2lskal0ljn0wKuSk/P2r79EMWdLRsUHh6Uk2PUsBGDjIycOXPeMGci1sjPL/bx8aCdAoCbbt68O2hQdw8PV9pB6uTm5mz8RCh/f+/WrSP1epsepIxrJoA53L+fdfDgSdop6uPl5apU1hh/fHGxHAtflJVVikRCqdTqZgyblk0UlB8+zP3oo5esbd2uJ0ybNmL37mPGH+/p6Xb2bIKRE7VsRF5esa8vejlguzIz8+7cSae+C1/9unXr0KhpUMHBfk5OsrS0hhcFsgW4l7Ow9PSsM2euTp06nHaQ+gQEeKelZRp/fEiI/3vvLUpIuG3OUCyAxzMAZlJZWX3lyq2oqJCePeNoZ6mPQlHB5zfiXlin0w8dOt+ciawdOiEA5iCXl3/wwX9op6iPvb19YWEjFkzz8HCVy8u3bz9kzlDWzkYGO3K/oKzVah0c7P39vWkHaUBcXKsHD7Lz84uNf8tbb72g1eqMHNdsC/Lyinx9uf+lBahVdnb+669/0blzG9pBGuDh4erp6WbkHmIGQUG+N26kbt6815y52KGgoAT3cpYUHh44c+YY2ikaIBIJo6JCSkoaseKTv7+3v7+3MbtRcRgqIwDmoNPpp0xZ2rFjazs7Ce0sDXB2ljVqq3aJRPzDDx/Y8p1XQUEJ56dvA1jS5cu3Tpw4HxfXinaQBtjbi11cGre53MSJQ6x8QIa52Ug/k/sF5cWLP2rU4xSK+vfv0thVLDw8XB0dHT744FuzhWKTyspqKx+HDmA+7u4uW7d+QjuFUQICvBs74njcuIHWX9ezgOJieWRkEO0UNiE/v/i5596kncJYCkV5TY2xK4Ea+Ph4rF27/fTpq2YLZe0KCkpatbLSpV0B2EuhKD9w4Bsej0c7SMOSk+83dhX1gADv8PAgm93JRiwW+flZ+yAtALa4cCGxqKh04MButIM0TKXSlJaWNeGNH374nc1uZ1paWhYezv27No4XlNPSMp999mmr3QviCb17d0xNfdDYd0VGBg8Y0NU8iVjm+vXkoCBf2ikAKLhwIbGoSE47hbFat47Ky2vEbIxHjh8/d/jwaTMkYo3k5Pve3tx/1k0dwzB//XXp++/fpx3EWFFRIY0tKBNC3ntvUWRksEZj7BKiHHPv3kNXV45vvQ1gScXF8tWrN1vnjlK18vb2aEJaZ2fZiBHP2+CVMzu7oLKyqlFjugGgHl27th8+vDftFEbR6ZiwMP8mvHHKlOGffmqj+6vfvp3m58f92fMcLyhHRAR17dqedgpjRUaGnDrVlLFCXbq0I4SsWbPZ+J0luEev1z98mGvN+yYBmMmZMwmHD59m0dMUgUCQn1/UhDcOGtTdzc3ZsA+ybbp/Pys8PJB2Cu6rqlJOnDiUdopGSE6+L5M1ZedhX1/Pw4dP2+Y2U/g2AZhQZWX1jh1Hli6dSTuIse7fz6qoqBKJhE147y+/rLp6NckMoaza/ftZYWG4ZgKYwPXrya+//gXtFI2QnZ2n0zFNeGNYWMCrr84zQyIWePAg2xaumVwuKP/556XNm/fRTtEIPB6va9d2KSn3m/b2efMmLl++2tShWOP+/aywMFSTwRZFRga///5i2ikawcXFUa1u4tOvLl3adenSzmZryunpuJ0zuy++2PLHHxcEAjZ1kJTKGienphSUCSHdu3eYNOm/pk7EAvg2AZiQTCZduHAq7RSNcPduRosWIU17r4eHa2xsjFqtMXUoq4aCMoBJVFZW37374MMPX6IdpBFycwv9/b2a/PYdOw7b4Orz9+9n20J5ik33S43166/HhwzpQTtF4zAMUSgqmvZemUy6Zs0KQkhycrqpc7HAw4d5bdu2oJ0CwNIKC0vEYhHtFI2jVmuaP51i//4/TRSHNXJyClq3jmTd/252yczMa9MmavTo/rSDNEJpaZlAILC3t2va2z08XHfv/sLWVrjTanUCAZ9FEzsArNmSJSuzswtop2icxMTkuLiYJr9dIhEvXfpJUtI9k4ayasXF8qioYNopAFhPJpNOnjyMdorGSU19GBnZ9K//5MnDDh8+nZPDsl8TzXH/fnbnzm1op7AELheU1617k3X7KtrZiZuwDOITjh07d/PmXRMlYo0bN1JCQrAjH9iWysrq2bPfcHNj2TKg5eWVTk6y5rTQpUu7pk1TZbVbt1K9vLC7unkFBfn279+FdorGSU1t+ji7RzIycuLjL5ooEQskJaU6Ozduv3IAqNXhw6dnzx4bEMCyvdrOnEno1q1Dc1r4+us3EhNTTJfI2p0/n9iyZRjtFADstnbt9qNHz9JO0WjXryd36BDdnBbeeGOBvz/Lfk00x61bqawrRTYNZwvKSUn3CgpKaKdoNB8fD72+KcvTPO6ll6ZnZuaZKBFr3Lhxt127lrRTAFjU3bsPVq1aTjtFo2m12uZMmzIYNqwXIWT37mMmCsUCiYl327fHPAwz2rJlPxtHvufkFLRv36xePiEkPDywsrL6xx9/M1Eoa4dvE4CpDBvWq02bKNopGicxMaVPn85SqX0z23nmmZEmSmTtMjPzZDIp60YwAFiVzMw8T09X1s2hf/AgOzY2ppmDgQzTalet+tFEoazdzZt3bWT2PGcLyitWrJbJpLRTNNqNG3dN8ihjxIg+hJAffvjVFKFYoLS0rKKiKjgYI5TBtsTFtWrZMpR2ikY7cyYhKso0sWUy6dmzCSZpyvolJiY3v24IdamqUubnF48a1Y92kEY7ePDkU0+1bX47o0b1mz17rCkSsUBiYkr79ngIDdBcs2a93vy5lZa3c+eRnj3jTNLUgQN/rV27zSRNWbPExGRcMwGaKSjId9IkNu35bPD773+1a2eC2qiXl/uQIT3Xr//FFKGsne30M7lZUJbLy1esmOPg0NzHzpaXmZlnwhX9QkL8jxw5Y6rWrNnNm3djY1FnAdty7dqdNWs2007RaHq9/tateybplxBChg7tKZM5aLU6k7RmzQoKSoRCgY1MnqLCwcF++fI5tFM0WlpapkAgCAnxN1WDhw+fvn492VStWa3k5HQ8ngFopuPHz02bNsLOTkI7SONkZeVlZ+d37dreJK2NHNm3Q4foO3c4voFNUlIaZoICNMeVK7e+/vpn2ima4sCBv0aO7GuSplq3jnzhhSkmacqaZWcXODvL3N1daAexBG4WlF1dnXr16kQ7RaNlZ+eHhQWYcGB1//5dWreOMFVr1uzSpZtxca1ppwCwqDNnEkJD2bd17MWLN0w71at9+5Y8Hu+jjzaYsE0rdPnyzQ4dmr59EDRo27YDVVVK2ika7eTJy6YdVT1sWK8bN1K4vRn3lSu3bGQeIoBZDRrUfcCArrRTNNrvv5+cMWO0CRvs3j02JibchA1aoQsXEm1kgykAM9m580jXrs1at52KgwdPTp48TCgUmLZNbo96vHTpZjOXnGYRbhaUT5w4f+rUFdopGu3o0bOdO5tg1urjAgN9CSFjxizSarWmbdmqnDmTYKqZawBssXDhtDFjBtBO0Wg7dx4ZPNjEa4cJBPyOHVslJnJ5WOXZs9d69IilnYKzMjJyTpy4wLqJTaWlZYcPnzbVsJFHZs4cExvL5acX+DYBNN+DB9kXL96gnaLRbt9Ou3MnzRx18OnTX+HqHjaJiSnh4YGOjg60gwCw2KpVyzt2bEU7RaN98cXWmTNN+QTu0eqsHJ4PZ1O1KW4WlG/fTlMoymmnaLTLl2+ZaXP5rVs/2bPnuE6nN0fj1CUmpkREBLFxyWyA5sjPL6IdodHS0jIZhjHVehePGzSou6+v18OHuSZv2Rqo1Zrr15NNsk4u1MrOTvL22y/QTtFoW7bsnzNnnJka/+yzH2/dSjVT43ShoAzQfDt2HC4rq6SdotHWrftlyZIZ5mh5y5aVN2/e5eQIHlwzAZqprKwyP7+YdopG+/HHX+fMGSsUCk3e8tChPbk6hreysjo9/aHtrKvGzYLy1KnD+/dn2Qys+PiLgYE+vr6e5mhcJpNOnjxMqay5evW2OdqnKyHhdt++T9FOAWBRKpV69uw3aKdotA0bdj3zzNNmatzb293d3WXhwg/M1D5FJl8nBJ7g4+PBugVkUlMz7txJHz68t5naX7Zs9sOHubm5hWZqn5bk5PTo6DAbWdgOwHwGDeret29n2ikaZ/v2Qx06tIyICDJT+8OH987OLjBT4xSlpj7o0cNWRtsBmMOuXUfi4y/STtE4mZl5J09emTJluPlO8eabX126dNN87VNx6dINk0/GtWbcLCh7ebmzbrzqd9/tmj9/ollPIZNJd+w4lJfHvlGN9fv99z/79GFZjxagmTQaLetu5E6fvmpnJzHrGnwymXTBgkncu507cuRMt27sW3aNRVav3nzmTALtFI3z/vvfvPXW82Y9xYgRffz8vG7fTjPrWSzs2LFzHTti0wWA5oqLayUWi2inaIS0tId//nlx7twJ5jsFj8dzcXEcM2aR+U5hebdupRLCw57AAM3h6+vFuomGb7319Xvvmfdq9v77i/V6fVFRqVnPYmGHD58x1aavrMDNgvLWrb9v23aAdopGOHz49LBhvby83M19olWrlisU5VwacHThQmJMTISzs4x2EACLksmkr7wyl3aKxtm3L/6ll6ab+yytW0d6ebnt3x9v7hNZjEJRkZyc3qVLO9pBuKy8vJLPZ1OPaMeOw08/3S8gwMcC59q794+CghILnMgyjh49M2RIT9opANitpETBuvlA33yz89NPXzb3WVxcnHbuXHP/fpa5T2Qxhw+fHjoU10yAZhkxorf55kaYwy+/HBoxok9IiL+5T9S1a3s3N5ebN++a+0SWUVBQkpNTEBfHvsWym4xNt0/G8/X1LC0to53CWCUlih9++NXki53XJTo6XK/Xf/HFFsucztyOHDmDXg7YpkOHTtGO0Agff7xh8OAebm7OFjiXWCwaMKDbM88st8C5LODoUVzlzO7ddxd2786aMeAJCbevXEmaMGGwZU73xhsLkpPvV1RUWeZ0ZhUff7F791g7OzHtIADsptXqiovltFM0wquvfj5+/CCLdUI8Pd2OHDljga4GVfAAACAASURBVHNZwNGjZ9EJAWim48fPsWhI34ULiTdvpo4fP8gypxMI+N9/v0ehqLDM6czqyJHTtjZqgZsF5QEDui5a9AztFMZ6++2177670JJnDAjwCQ0NqKystuRJzaG8vKqkRNGrV0faQQAo2LnzCFv2Ez9+/BzDkMGDu1vsjA4O9j/++KFheK/FTmom167dGTbMXOvkgoFCUV5VpaSdwig1NerXX/9i9WqLPi/p06eTQlEeH3/Bkic1h0uXbowc2Zd2CgDW8/Jy27r1E9opjLVp028tW4Zacg6yo6NDmzZRK1asttgZzeTYsbOTJw+lnQKA9S5fvnXnTjrtFEYpKVGsWrXp44+XWPKkX331elZWHrueU9YqMTFlxAjbumvjZkGZEHLlyi3aEYzy228nunVr37p1pIXPO2pUPzs78U8/7bPweU1r164jNrVCDcDjXnhhikhk+l13Ta64WH7ixPnXXptn4fMa1nZcuXIjiyas/Nu5c9dFImFgoCVWNrBl8fGXfvzxV9opjPLRR9+tX/+25c8bGOh78eINtVpj+VObSlraw/v3s9u3b0k7CADr8Xi8/Hx2bMpy/vz18vKqGTMsNBP0kYAA708+WarRaC18XtPatu3g0KG9aKcAYL1Jk4a2aBFCO4VRVq7cuGHDu5Y/b5s2UWlpmdeu3bH8qU0lPv6iu7uLh4cr7SAWxdmC8q5dR69eTaKdogGXL9+6ePHG1KkjqJxdKBQOHtxj5cqNVM5uEnv2HBs/3kJzfgGsTZcu7Xx9PWmnaNjMma+9/faLtM6+cuV/z59P1GrZeke3a9eRiRMxOMjsOnRoKZXa007RsDfe+HLQoO5hYQFUzv766wu0Wq31d67qsmvXsYkTh9BOAcAR7767Pi3tIe0UDUhKurd9+6H//OdZWgGEQsG0actonb2ZLl684e/vFRTkSzsIAOtFRgYHBrLgq/T88+/Nnj3W3d2Fytm7dGkXH3+RyqlNYs+eYxZbj856cLagPHXqiOrqGtop6lNeXrl27TYL7A5RDx8fj4ULpxFC2Li35tGjZyZMGIKVEMFm5ecXW/9i6MuWrVq5cqlMJqWYYcSI3jU16uPHz1HM0DSpqRkeHi7t2rWgHYT7wsIC58wZRztFAzZu3N2+fXSPHrEUM0il9kqlavfuYxQzNI1cXl5Sohg0yHIL7wBw25AhPYuLFbRT1KegoGTNms1r175BMQOPx/v005fPnbtOMUOTnTp1ZdKkYbRTAHBBaWnZq69+TjtFAz7+eMPUqcOio8MpZli2bDYh5OrV2xQzNE1S0r2gIN+WLcNoB7E0zhaUO3SItvKldadOXbZ+/Vu0UxBDoWfVqk05OQW0szTOd9/tGjduIO0UANT4+HhcvZqUnW2939z33ls/dGjP1q0jaAchMpn06tXbbFly+pEff/xt4EDUvyzk5MnL1rxfyvbtBxmGWGyDlHr07BnHxke5P/20z8q7hQDsMn78oC5d2tFOUafKyuo5c94w7KZAl7+/d8eOrW7evKvT6WlnaYQbN+7m5hZijSAAk3Bzc87PL8rIyKEdpE5r1mxu1SqiZ0+r6CllZ+efOnWFdorG+eGHX21tOz4DzhaUDTcP9+9n0U5Ru+nTX1m//i26o/Ye98knSy9cSKypUdMOYqx9++J79+7k4uJEOwgATd9//77VrtP0xRdb2rZt0a/fU7SD/O211+ap1eoHD7JpBzHWnTvp5eWV1ny7zjE6nW7Tpt9op6jdzz//XlBQMm/eBNpB/mbY127jxt20gxirtLTs9Omro0b1ox0EgFN+/vkA7Qi102p1o0cvPHjwG9pB/iaRiFu3jhw0aA7DMLSzGGvTpr2zZo2hnQKAO3744YPgYD/aKWr35Zdb/Py8nn7aWrpJo0f3VypVtFM0QmJiCsMwHTpE0w5CAZcLyiEh/j/9tJ92ilp8+OF3y5bNtrYVqcaPH6xWq/fsYcc81h07DlvPrTUALQKBwDp3xfnll0P+/l6jR/enHeR/REQEi0SiDz/8lnYQo2zbdmDu3PG0U9iQ/v27jhljjbNedu8+6uHhumTJDNpBntSpU5vDh0/TTmGUn37aN2/eRNopALgmN7fQOi8CL774/okTP9BO8T/4fH58/KaKiqqSEqteJ8QgIeG2o6O0fXtbLI4AmIlWq0tNzaCdoharV2/u0+epyZOta32bIUN6EEI+++xH2kGMsm3bweees9HaFJcLyr17d1qyZLpeb13Ti156aeXw4b3atImiHaQWTk6y/PySlJQHtIM0YNOm34YO7cmKDZQAzEokEm7evPfkSeuaE7Rp029qtWbCBGvc/CogwLtLl3aVldW0gzTg4sUber2+Q4cY2kFsS0wMzWXjanX8+Ll79zKtcw5d+/Yt4+Ja0U7RsIyMnJs37xruTADAhJ5/fnJAgDftFP9Dr9f37Tvzm2/e4vF4tLPUwslJFh9/MTExmXaQBqxf/8usWWNppwDgFLFYtHXr72fPXqMd5H98//0eT09Xq92vpW3bFn/8cYF2igacPn3V3l5iDWs8UsHlgjIhRKfTFxaWEkJGjVrYt+9M2nHIyy9/umzZLGt+3rtw4VSJRPTEesoTJ/6XXqInVVRU3b6dNmPGaNpBAKzC889PKS4uJYSMHbu4Y0f6j0Y/+ODbtm1bWPM3tH//rnZ24qNHzzz+h5MnL6WXqBaHD59+4YWptFPYHIWi4tlnVxBCJk5c0rnzpM8//4lungMH/rp1695rr82jG6Me3t7uhJBRo17UarW0s9Tp119PLFr0DO0UABzk6OgQHR1uWMahf//ZY8cuppunsrL6pZdW/vXXZj7feu9wJ04c8uuvJ2inqM+ff17s3LltWFgA7SAAXLNw4TSFooIQMmbMImu4a/v225329pLp00fRDlKnwYO7x8SEy+Xlj//hpEnWddd29OiZF1+03bs26/11axLu7i7jxv0nLm58Tk6BQMC7du0OxTCLFn2wYMEkf3/repL/b6GhATKZw9tvf234ccyYRbm5RdbzaGjNmp+GD+9NOwWAtfD2dt+yZX9s7LjMzDy9Xr9vXzzFMF9//XN4eKD1D1oUCoXdunV44YX3DD9On/5KamrG+vW/0M71t927j3l7uwcG+tAOYnNcXBwVivJu3abev5+t1Wr5fJoD3PbsOZ6a+nDpUvrPwhu0c+ea+PhLjwb+x8WNf+ONL2mH+tuZMwkKRXlsLAb7A5jF7dtp3bpN6dhxgkJRTneB4MLCksmTl3711WsUMxjp/fcXE0L++uuS4ce+fWeOHbu4tLSMdq6/ffzxxtmzMTwZwPR8fDw2bNjZseOErKx8hmEOHjxJMcyaNZs9PV2nTRtJMYMx/Py8HB2lY8YsMvw4ZcrL6emZGzbsop3rb9u2HQwNDTAMsLBNXC4ojxmzODZ2nEqlNkx6EolEQqGAVpg33vhy/vxJERHBtAI0irOzbMyYgWq1ZvLkpVlZ+Uql8tAhmte7R27cuCuV2vXtay3bfAHQNXz4gtjYcbm5RYbBODwezSmea9Zs9vPznjJlOLUEjeHkJFuzZgUhJD+/OCnpHp/PP3nyslqtoZ2L1NSozp+/bssPummZNm159+7T8vKKDf8MeDwen0+tz7B5897iYjkrqsmEEDs7yeDB3VNS7l/9v/buO77Jan/g+MlOmm7a0pYuWlr2ll0UEBCUjYiCCoIoKjivot6rIgpyXdefE/S6B+JGFFEREZUlsqGMltFN98wevz/CRYRSArR5nrSf9x+8aJo8+T5Jz8n5fnOec7bu6d17skKh2Lcv67RLnaSycuU6f3kZAb+Tnj51xox/2u1Oz496vVaqSLKycp5//l357MLnDYvFtn791qFDZ1RX1+bnF8tk+bJ33/3qnnumaTRqqQMBmporr7y1Z8+r8/NPbH6j0ajVaska2jPPvNWmTcLEicOlCuC8qNXqt95amJ9fVFNjOnTomBBizZpNUgclhBCVlTU7dmTMmiX9ZHMJNdmC8p13LiwpKTv1iielUinVBVCTJ987ffq4Tp1SJXn2C9OtW7tZsx7JzMz25NWZmTlZWdlSByUWLlw6bZp8L8oAfMxms5/WrWk0Gkki+fe//9uhQ8rEiXLc0+xs9HqdEGL06Ns9r2F+ftEvv0ifzi1e/MbYsfLazLCZsNlsVqvt1Fuk+hL67be/FEIxe/ZkSZ79gl1ySafbbnvcs3FFXl7R2rXSj/WXLPm4R4/2oaHBUgcCNEE33PCg1fq3QYhKJU2fuXv3oUWLli5eLKMF+rwxcuTAJ5541XP9u8PhWLlyrdQRiQMHjmzYsOPKKy+VOhCgCbLZ7KdO/JFw1sKCBa/27NlxzJghUgVwAcLCgmNjowYPPrFDdXFx6bp1W6QOSixe/MaECf6U/DaGJltQfvHFf86aNSkyMvzkLZJch2W12saOvePFF//pL3OTT7V3b9bJ/xcUSL/qxXvvrZg4cVhUVPO9oAA4zaOP3p6QEHOyc1OplJJsQ3rXXYsGDOghz33D6jdgwNSTr57FYlu58mdp49m0aadKpR40qJe0YTRPCxbMbd8++WRxxO2W5urthQuXRkSETZ8u31XIz2bgwBtOvmIOh+OHHzZIG8+RI7kZGYfltms50GS8//7iq666NDjYePIWh8Pp+zDWr9+6cuXPb775pO+f+iKNGnVbeXn1yR/z8op37TogaUTilVeWPfTQLGljAJqqefNujotreXJw6Xa7JJnrePvtC0aOHDhkiP9d8J2efv3JcWZVVa20C4Z4Pn1CQoL69esmbRiSa7IFZSHEjTeOXbToroSEmJO3+Hg9xKKi0jFj7li+/D/+uKhKnz7XnlaZkragnJ1dsGbNxsmTR0oYAyA3Awf2XLp0fo8eHTzTgtRqle8vnrrppodnzJiYnt7Dx8978QYOvMFstpx6S1ZWTlZWjnQRiSefXHLPPTdKGEBz1r59yltvPTl0aL/AQKPn0hyDQefjGObNe653786jRw/y8fNevPT0qSaT+eSPCoUiJ6dwy5ZdEoa0cOHSuXOnShgA0OTNnz/n5puvjomJ9LR631/VsXLlujVrNsp559KzmTjxrsLCklNTrdLS8m+++UXCkD766JtOnVKTklpJGAPQhA0b1v+11+Z37pzmydoUCqFS+Tprmzr1/jlzpvbq1dnHz3vxTsvaFApFZmZ2dna+hCEtXLiUrK2JF5SFEN27d1i27Nn+/bt5LgP3ZaklM/PYnDkLv//+DQkXFLsYM2dO7NGjQ3x8tF6v99xSUlJxcvsI33vqqTf4zhw4U2Rk+BtvLJg4cVhoaKDvu/Tx4+c+9tgdXbu29fHzNogZMyb069ctNjbKaDR45gsUFpb88MPvUsXz0ksf3HTThMDAAKkCgEajWbTo7mnTxkZGhrndbh9/CT179vwJE4YNG9bfl0/aUH777cP09B6xsZFGY4Db7Xa73TU1Jgknj3z66ffdu7f3x4vDAP8yZcqo+fPvSE1NdDqdPv5K+4MPVh48eGTBgrm+fNKG8vnn/zdlylVdu7aNj4/R6bRut1uhUGzfnuFZAcP3SkrKv/jix1tuadYrgQKNLSYm4p13Fo0ZMzg42KhSqXw5znQ4nKNH3/b00//o0CHFZ0/agG6+eWKfPl3i4loGBxs9WVt+fvGPP0p2Mdxzz70zZ84Unc4vC30Ny19X3He53buravPMNpPz3FdXXTrvFs3GHfv2Za0123flFfsgPKvV9vU3629+6V/7a0ztZFwdyLVYj9RYSu0OxxmXyYdfNWjEVYOqq2uPHy85dqzg+PESk8n8+u7M0rRk38e5Z88h7WW99wYH722Ity9IrU4I0LUPku/7AnhkVNfmmK1V9nP3cslTxwzq2XH9+j+36XQ1PunlhBBLliy/7bkHq8ND3UJItxfgOdQ6nfuqTIUWm/WMXi5oePrg4elms6W4uCw7uzA3t9Bisa4sqYz01Qt4qqKisg1uxcS+3T5riGcPUKliDdouwUalhLs0ysyBalOO2Vphd5zznkHD04e1S/7hhw0ZoSEN8nZ447PPvh8+e4o2Kdbpdqvk+q453O7dlbUFlrqHXgMfmNXHZi8qKs3JKczNLTSZLFvsTp+9gKeyWKwfHzw2ffr4Bnl2g0oVrdd0DDbqJdqHA5BEkdWeWWMqsdpt51z9Jzpq1MJ7lKt+qbbYfNbkDx/OLTUYRk2bkGuyxgX4+moS7+2vNmWbLFV1LQaSMPmqhMlXlZRUHD9ekp2dX1JSUWEyP7dhR9eu7Xwf58qVP6ffP6tB3j61QhGm1SQYdK2N+oYIDfAPGdWmbJOl2ouVf9pOn+Do1WXjxu1bVKpSX/WZS5cun/PKYxVBgXK+BqHS7jhQYy6y2CxnZG3GoQOGDB1gNluLikqzswvy84vMZsuXhWUhUowzCwuLt+v1iT06NUifaVQpYw26ziGBfjrK9MuCco7Z+vSBbLVSGW/Q2b1b4zCqS7uoLu0qhaistXhx94tVXV3b48pBWytqvj9eblSr5rdPlGFW/1FO0b7qWpdbROo0NtdZXkalShHTMimmZZIQVVU1wcGBWT55AU+jT4qPbS0a8KlXHS9VKxRPdkgySLR/CFA/i9P1SMZRh8sVrdMK73oPTXyry6e2EqIhW0o9nE7nsBvGHnW6fj1S4BTisXaJETpp9gOsxx/l1R/lHFcrFK0M+jMLyn9pER7TIjymeweTyRwQYJCkl3MHBHQbcWlDPbVWoVhTVG5zue5Pi08MIKMTT+4/Vu1wBqiUei/7/Bbhg68b5bPWJIRoP7hvVYD+g5zj1qOue1Pjk+WXh2fVml/IzNUqlfEBeqvz7K0pLDQqLDSqSzuLxarX6yRpTUKIgZNGNtRT61SKdcU2m+v4TYnR3UIDG+SYgMx9mV/yR3m1y+2O1uvq+/Q8RZfhA33ZZ5ZrtNFd260+XqZUKDoEB1wXF+Wb5/We0+2en3HM5HS20Grq+5rQoNcnxaUlxaU4nJVV1YHhoZJ0mx2G9LM20NunVinLy6vdQiQE6G5tHdsQ0QGyZnI6/7n3iMstYvXeZm36pLjBSXE+6zMddsfQ68dmWh1rj+QLoXi8Q1KoRnZlwF9LKlcWlAihiDXU+7kTER4bER7bo2NtrdlolChrCwzqPGxAQz21RiF+KKpwuAsfTItv5fPV9i6e7P6SzumYyfJCZu7EVpF6lYyL+BEhJ/+bWWuet/fIM50kmNhbj8/yig/WmEZFn8/izqecVBOQZ7Y+vPfo4k6tdcw5gszY3e6H9h7uHx4c7xelwBai0uFcdCD7obT4SDmt8LOtonp5btHk80sym04v108Ii9P1QlbuncmtWhsNUocjpYf3Hk4xGoaesnOUbPUVwu5yv3w47/bk2DZyetcya82vHs6fHBellt+3474QJoQQH+QcVypElxBqymji1hSVbymvHhsj7w1gTslKVh8v+yK/eEJspKQBne6BPYd7hAamnFdPHh3uxZ38xvqSyjeOFMxqHePFfQF/ZXG6/rn3yMCIkFi9H5QCB7QILrPZH884+mi7xDCtjGYCbSmrWlFQcnWr8+nGm1Ztyux0PXso577U+Dh/qyn7WSnN5XYvPJA9LjZC1tXkv2tjNHQONr6QKeVGT6f5vbRyT1XtiJZNatRyvloZdP3DgxYfyJY6EOB0/z6Q3S88xD+qyUIIIULUqiuiw588mO3VBSM+cdxie/No4fmNS5ocvUo5ISZywf5jLu8u5WmSlh4pSDEaOvpDNdlDo1RMahW5IOOYUzbvmtPtXpBxbFKryGZaTf6fq1tFvpyVX+k496opgP/aWVGztrhc7tXkvxvRMnx3Ze1vpZVSB/KXF7LyuoacZzW5ybk0IqTUbl9VUCp1IEAjeupAdnoL/6gme4RrNUOjwp7YL6MaSK7Z+kFOUTPP2gwq5ZiYiCf3H/PqmiA58ZuyrMeW8uoWGrXfTSlNCzT8Xlrl5SVjPrD6eFlaYLMe4njEB+iPmSzHrTapAwH+Umy1ZdaaE2S8IGCdQtQqlRB7qmqlDuSEtcUVMlw0wPc0SkWsXrehtErqQCSzurC0g/9Ukz0UQiQb9etKKqQO5IR1JRXJRn2zriX/T2ujfm2RXN4XoDGsKa5IMfrfLiNtjIYfi8qljuIEq8v1e0kFqZYQol1gwA+yeV+ABpdvseZZrH63TEELrcbudh2sMUkdyAk/FZe39p95VI3HoFKGaFR/lkuzNesF87PKbK7ZGutvLdajQ2DAkRppVhI8k8XpimZLSiGEEG0CDUdrzFJHAfzlmMma6p9TWuL0ulyTXHq5Yqsthl5OCCFErF5bYLVKHYU0jprMHYON/lgJjTHoCsxy+bKzwGyL8c+hV4OLNWgLzXLp5YDGUGl3ROv8b0XEGL3WUs/a7r51tNbcga2/hRBCROu1LrfbJeRywQ3QsLJNVj+dvxJv0Oea5ZIdlFjtflria3BxBn2ebN4XL/lZQbnG4fDTz6Qal8vqlstAJ8dsk+0m8j5mcbossrmsGPBMbDHJJik6Lw7hrnHIJfJim0NBLyeEZ6moSnszvUjf5nT56wIFblEum3et3GZ3n23n3mZG4Ral9nNv4A74r3yLVeVvV4IKIVRKRbZJLkm41eWu8c+BXGMosNocfIKgiTI7nVb//PO2u121DrmMZ4qsdiVJmxBCCIfb5XdZm/+NGAAAAAAAAAAAkqCgDAAAAAAAAADwCgVlAAAAAAAAAIBXKCgDAAAAAAAAALxCQRkAAAAAAAAA4BUKygAAAAAAAAAAr1BQBgAAAAAAAAB4hYIyAAAAAAAAAMArFJQBAAAAAAAAAF6hoAwAAAAAAAAA8AoF5XOzmMyH9uyVOormyGaxmGqqpY4CaPocDkfWvgyLySx1IM2O0+msLCuXOgo0sIJjOSWFhVJH0RyVF5dIHQKA81ZZWpaTmSV1FM1RVUWFw+GQOgoA54GsTSpkbXVq+gXlqooKu91+MUeYO+bqlx9Z0HARwSs/ffn1rGFX7ty4RepAALmz22w1lVUXc4SX/vnYYzNnV1dUNFxQOLejBw/NHj76szfelDoQ/OXiW9N3H396/7XX7/tze8MFBa/cNXbSorl3Sx0F0OyUFRdfzMOPHDh4x6jxqz76pOEigldee/zJ20eOramolDoQoBkha/NTZG1n08QLysteWTJ39ITai2u0Lhff3EqguqLC6XBKHQUgd5vX/nzHqAm7t/xxMQdxMj9FClaz2WyqlToK/KVhWpOT1iSN8hKmJwO+tmD2nKfm3nsxR3A7XQ0XDs5DeUmZ1CEAzQtZm/8iazubJl5QPrL/IEVJAE1YwbEcUzUrwwANgNYEAOclc/deIdxSRwEAfoBxJpoetdQBNKJX5z+xb+ufQog5oycIIR5Z8lLbrl1MNdUfv7x06y/rzbWm2KTEcTNu7DXoMs/9t/224cs338nNOqw3BnQf0H/K3NsDQ4JPO+b6b7/7+r0PSguOh0ZGjLx20vBJE6U4M/+w4fsfV7z3YVFOblhUZM9L0yfMnGYwBu75Y+viO+8bdcOUnRs3H8/NjWgZPWHWTX0vH+x5SO6RIx+9+Or+HTs1Wq0xKEjqMwDkbusvv372+ptCiFceXfDKowuuvmXmuJtudLvdqz5ctubLr8uKisIiIweNvmrMtOuVSqWXvVzm3n0fvvDy0UOZ+gBD9/79bvnXgxKdnB+os5dzOBzTB17eb/jlVWWVWfv2BRiNA0eNnDBzukqlEkLUVlcve3nJ1l/W26zWqJgYqc8Af9nzx9bzak15R48ue+m1jO07FApFaudO190xOyG1zWnHrCwte+vp5z3LX7Ru33b2ow+FR0ZKdH5yl7UvY/lrbxzavccYFJjWufPk22a1jI+rvzW5XK5v3l+2dsXXlSWlMUmJLhfzHAGfmjt6osvlKjiWc32/y5RK5Xu//1zPSKOeFOwkp9O5/NXXf//+B1NNbWxC/NS75nTo2V2ik5O7ipLSj19dsu23jW6XO6FN8viZ0zr1ukQI8emSN1Z/8tnI6yb/uur7msqKpLS0G++7MzE11fOoP9f/9sV/3847ejQkvIXdapH6JIBmZMvP684cZ15kbYqszXtkbY2kKc9Q7jv08pZxrYQQY6ddf83sWyJiol0u1zP3zlu7YmXr9u2GXT2hoqzs/x561JPp7dyw6fn7HyopLBx69fg2nTqu//a7/3v4kdMOWJib9/qTi2urqq+8/rr23btptFqJzswPrF7+6avznxRCDBk/Nj4l+btlnyy+8x8nk71VH37ctmunCTffZK6tffXRBUcPHhJCFOcXLLh17q5NW9r36NZn8CBTTY3UJwHIXVzrpM69LxFC9B12+TWzb/HkXctfe33ZK0t1BsOIa68xBgV99vqbX739rpe9nBDipYcfO7Rn75BxoweOuEKrp5c7q/p7uY0//KTRaSbOmtGiZdSKt9/76YuvPNtoLL7rvnVffxMRHZ0+8gqXi1ldMtIyrpX3ramsuPiJ2XN3bdrSb9jQfsOHZmzbvmjuPdVnrES57JXX/lz/a/seXYdfM0GlVASHhUl0cnKXtS/jydlzc7KyLr1qROfevXZu2vTIzNmlx497fltnaxJCfPjiK58sed3tdKVfecWZWRaAxjZm2vUKhSI4NOSa2bdcM3tWPSONelKwU/26avWqjz4OCQ8ffcN1oZEtgsNCJTozuaupql5w65xNa9Z2698nfcSwkoLCxXfet+23DZ7fWs2Wn7746rJRI4ZPmnh4X8bz/3jYajZ7ZiH8Z94/C3Jyeg8ZlNqlUy2pFuBD8ckpp40zL7I2RdbmPbK2xtOUZyj3SO+/evlnx3Pzhl09ITSihadlHtq9t3OfXvc//7QQou/QIY/OvPXLt97p0LP75/99Wwhx7zNPpXbqKIR49r55OzZsyti2vX2Pv74YN1fXCCEiYqJHXz9FH2CQ9ORkzWIyf/b6m6ERLf7xzFOesrtao978sXHsUAAAFyxJREFU07r923d67jB43Ojp/7hXCBEcFvb6E09t/21DUlrqyg8+MlVXX3fHrVddP0UIEd4yyvMlHoCziU6Ib9ut6+4tW3sOHNBv2OWer1JXf/xJUGjIY0tfMRgDbDMtd4+/5uv3Php1/RRvejm3211bU6PV6664ZmJUbKykJydr9fRyaV07CyFiExP/8exiIUSvQQPvHj956/rfhk+auHXd+iMZB7oN6Hfv04uUSuWBnbuemD1X6lPBCZExMd63pu8/+bymsuqa2beMmTbVU4z++OUlP37+xYSZN516zJqqaiFE/+HD+g4dLN2Z+YHlry612+33L3gsNilRCJGY1uaDF15et3LV2Ok3nK01lRUX//jpF6ERLRZ98JbnkqYbB/AiAz417Orx7//nRWNIsKcnFEKcbaRhs1jPloKdesDa6mohRFqXTuNnTFcoFBKdlh/48dPPi/LzJ82++bKrrhRCpF95xWMzZ6/6aHmP9P6eO0y77x7P505ladn6b787cuBgu25dl7/2uhDi4ZdfaNOxgxBi0dwKz7W8AHwgJvH0rO0ia1NkbV4ia2tUTbmgfKbDGfuFEN369/X8mNyhnVavy8k67Ha7jx44GBAU5GmxQoj23bvu2LAp5/CRUxtt6/ZtB4wY9vvqH++ecM3Q8WOvuv5agzFQolORtexDmRaT2WIy3zPx2lNvryov98whUqs1nluiYmNObqRzcNceIUT6yBESRQ00BdmZWQ67I61LF4MxQAih1etbd2i3c8Pmwpxcb3o5hUIxZe7tbz/93APX3tB36OXjZkyLjmsl3dnIVz29nOc/au2JXi4iOlqhUJQXlwghDu7cLYRIHzHcs2YCZO5srel4bt6RE8OJPp57tu/eTQiRk3nktCOMnzHt4K49Lz8yf+1XK8ZNv7HDJT2kOA+5c7vdB3ftFkIsmnvPqbdXlp3YMKrO1pS1d5/L5eoxcAALZAEyUU8+5dkg/cwU7LQjXDbqql9Xfb/m86/2bv3zyusmDxozirJynfbv2C2E+HTJfz9d8t+TN57sMz3lEs9/ImNjPOtj1FRWFRzLjklM8FSTAUjuImtTZG1eImtrVM2roKzR6zxt7+QtbrewWa0KhUKr1506YnE63UIIu+X0taVue+xffYdevuLt97565/0dGzcveGspf2FncjgcQoiUDu0n3Py3uVopHTscPXDg1Fs874Xb7T75f6VK5fN4gaZDp9Of1su5nC4hhN1m87KXGzJudGqnDive/eC3777/c/3viz96u0VUlO9OwE/U08udeWeFQnFiz6ITvRyfGv7hbK3JZrVqdX8bTngumrPZTm9NKR3aP/Px+6uWLV/z+ZeL5t5z11MLzlwzFG632+lwanS6u5964tTbW8bHnXnnk63J8+KrlIwZALmoJ586Wwp22hECg4OeeGvpuq+//eaDZW8ufjb38NEb7mFGWB2cTocQ4sb77mrZ6q/6UZ0r//yVapFnATJz8bWp07K2fy97h706zkTW1qia+Auk0aiFEDXVVZ4fE9q0EULs+3OH58eMbdvtVmtS2zQhRHxycm1VdU5mludXe/74QwiR2DbtxIH+t8CKy+XqPqDfY2+8mtKh/dEDB8uLiyU4K9mLS05SKpUFOblJaald+/Xp2q9PaucOXfv1CQyubxpRTEK8EGLL2nU+jBTwe2q1WghRU3mil4tOjNdoNJl799ptNs8eOFl7M9RabaukpHP0ckK43G5PLxffJmXOE49dNfVac20Nl0PW6eJ6uV98GCnOw3m0pjYpQoiM7SeGE7s3bxFCtE47pTW5XJ5/Q8LDrrtj9twn5wshNv6wVqIzkzWlUtkqubXdalUolZ7W1K5bl469etY/0SY6IV4IsXPjZs/aoAAkodJoaqqqPVND6smn6knBPNziRKqlUquHXT1+0ftvqjXqDT/86PMT8g8JKclCiPLiEk+f2bVfn7QuHVM6tK/nIYHBQcGhIQXHss+cGA7AN04bZ154beosWVvG/w6FU5G1NaomPkM5LiV558bNSx5flNyhXVLb1MFjRqd26rj1l/UvPPhIVGzM+lXfCSHG3ni9EGLczOnP3HP/0/c80P+KodmZWfv+3J7aqWPHS3oKIdQaTXlpaUVJaXB42KMzbk3p2D4kPKwwN1er1wWFhEh9inIUHBY2eNzon75Y8fC0mT0vTbdZLNt+2/jEm0vqnG100sjrrtmydt07z/5nx8ZNoS3Ct67/zYchA/4qPiVZCLHi3ffzjh7RaHVT77xjxHWTVr730YLZczpe0nPHbxtMNTWjbpiiDzDU28tphRDHDh5s2Sr27aefq62qSUhN2bX5DyFEVCsunqrDhfVy6SOHffHWO5vWrC3Ky09ql3Zwxy4fhoxz8741jZg8ac3nK5a99Fr+sWN2m+PXb78zBgcNmTBWCKHVaoUQxw4cEqPF76t/XPvVik69ex07eEgIEZsYL/UpytS46Te8/Mjj/7n/od6XD9Lq9Xu2bB13042XjbqynofEtW7dtV+fnRs3z5s6vXPvS4ryCk9urgLAZ+KTWx/O2P/8Aw8bg4KGXT2+npHG2VIwjVYjhDh2KMvlcpUWHl981329Bl9mt9kcdkdMYoLU5ydTV1wzcf2q1Svf+/DogYMx8fFZ+zJiEhNmP/pw/Y8aed21y19b+vgtd/S8dIDL5Tq4c6ev4gUgzhxnTpl7+wXVps6atUW2ipH6FOWIrK1RNfEZyqOmXte1f5+8o8c2fP+jw+5UKBR3/3th+sjhGdu3//DZ5y1atrx78ZNd+/cVQnTt2/uOBY8aQ4JXL//06IGDl42+6r7n/u25AKHv0CF6g2HHxk21VdVhkRG/r/5h5fvLolrF3vv0Iq1eL/UpytQN99x59S0ztTrdLyu/3fPHn+27dz3nzpipnTrOeWJ+q6TE3Zu3ZGzfEU0ZC/BC1/59r7jmapvV9svXqypKS4UQE2fNHDdjWlVp+aqPllsslkmzZ3n2Xq+nl+s16FJ9QMCezVuFEFqdfv/27V++9Y7VbL7xvrvadu0i9SnK1AX0cgZj4LwXnu3Qs3t21uFtv/7eIrqlr4KFV7xvTSHhYQ+//Hxq1y7rv129ac1PHXv1/NerL3ouM+zY65Lg0JCsfRkWk1kIV2VZ+Yp33s/alzF04tgx02+Q+hRlqu/QIXMXzm+V3HrzTz9vWbsuNKJFaETEOR81+9F/DhgxzFRds/mndfoAA2utAr530/33JqS22bVp844NG5UK5dlGGvWkYDGJCYlpqTUVlTlZWVUVFcagoNUff7L+m9XdBvQ7Z4W02WoZH/ev117s0rf3od17fvl2ldVsTkxLPeejrpw6efzM6YaAgC0/ry/OKwhnQTPAt04bZ15YbaqerC2tcyepT1GmyNoaj+LkNUqn+uqry8aNe+d8j1VUtCsz84f+/f/RQLHV4Z1jBZUOZ5+wOpaIkrlP84pvTGjZNUQWm/hduyXjttYxOi9WhCkuKPjpy6/P9/jjb7pRZzBcaHSn+/6Tz8pLSs+83elwCLdQaeqYZd9v2JDE1HOPq4QQqwpLL48KuywitCEirdt3380dMuRJnY7J7E2Bw2FavfreUaOWNN5T/F5auaqwbHRMi8Z7ikayvqSidYBhUpwslu56ZN/RDkEBrY3n/s7PZrV+8Wbdn3cqlcrpdNb5qwFXDPPMMmgQ237b4NmUzHtJbdP6Xj7Ym3v+UVZlUCtvTmrE3Z83b34xIWFATEzP833gihU3jRnzpkLRWN9tH6yufelw/tR4/xsC7qystblcc1Nk8cXqS5m5GqWyW+i5BzBut3v5a6+f7VdnK/imdurY89L0iw7zhHpaU3hUZFlRHSuVRcXGDhk32puDH6w2ZZut/2qXeNFhntWePcv0+rA2bc57g+Iff3yof/9njcb6pttAJvLz1+fkrOjTZ875PnDbtjciItolJAxsnLiEEOLmbQeubhUZWtfoWs6sLteSwwXLete35oPP7KqseTf7+KRWXg2Htv7ya+befWfeXk+aExIePvLaSQ0Rqah/FFRPvz3x5ps0Wq03x3/uUM6HvdprG23PocrKY9u2/Xfw4Ce8uC+aF6u1Yu3aR0aOfKnxnuLn4vJ1JZUjW4Y33lM0kp+Ky7sEG0fHnPvLex+Yt+fwJWFB8QbdOe9pNZu/fPu9On8l56ytTcf2l1x2qTf33FhWGa7RTEuMvtDozm3DhmfbtLkiKqrz+T7wq6+mjxtXxwIgfjZcgI+VF5V88/5Hdf4qIqZlScHxOn915ZTJDVhQ/uWb77IPZZ55ez2jnIQ2KV4WlAE0cw677Wy9XHxK8tmWGmzdLq0BhyZ7//jz+08+q/NXZ+tpB44c4WVBGfCls7WmFlGRpXUVc4UQQyeOa8CCcj2tybNExpm3t+vezcuCMgA0rF2bt6yta+5OPWlOXOukBiwo221nHQXV02+PnTbVy4IyADQUu83uj1nb4HGjvSwo+yMKyqhPWtfOH2yUeCXyRe+9KW0AAJqwgMAgyXu5G+6Zy0b2aAIUCgWtCQC8N+OB+2Y8cJ+EARiDpB8FAYA3AkOCJe+vGGeepomvoQwAAAAAAAAAaCgUlAEAAAAAAAAAXqGgDAAAAAAAAADwCgVlAAAAAAAAAIBXKCgDAAAAAAAAALxCQRkAAAAAAAAA4BUKygAAAAAAAAAAr1BQBgAAAAAAAAB4hYIyAAAAAAAAAMArFJQBAAAAAAAAAF7xs4JyoEqlViikjuJCBKmVGtlEnmDQuIRb6ihkwaBS6pR+1grQtGmVSqPKL/8mNUqFUS2XyFto1YJeTgjPx3ygSiV1FNJQK5VhGrXUUVwIhXCHyibyEI1aKZfxi8QUCkW4Vi7vC9AYWum1Lrf/fXo6Xa4Eg07qKE5QCRGobqYfu2eKM+j8NHkHzkmvVBr8M2vTKhUBsummIrUatx9+7jQGtUIZJJv3xUt+1gBiDbpcs1XqKC5ERrW5tVEvdRQnqJWq41a71FHIQmaNJTFALu8LIIRIDNAfqjFLHcWFyDFZW8kmnYvQagos9HJCCFFgtcUammkvl2w07KoySR3Fhci32KL1WqmjOCHGoMu32KSOQhbyLdaWOrm8L0BjMKpVhX7Y3o9bHRrZfPGVHGjYX+2XHz0N7rjV7nC5lRSU0UTFB+iz/DVrs8XKZjwTrlUX+OHnTmPINVvkk017yc8Kyr3Cg0utDpfUYZyvIyZLz9BAg2zmiA2OCDlaa5E6CukVWW2xBm2MbJJ2QAgRpdPEG3SF/vaVj8XpsrvcnYONUgdywoCI4HwLvZzwpHN9w4OljkIygyNDDvhhYl9osfcLD5I6ihP6hQcV8vWMEEKIAoutXzNuTWgOLosMzfHDuTtHTZZLI0KljuIEg0rVIyTwiIlBiDhsMg+KlMv7AjS4OIMuSqsp9resrcbhVClEW9lkbekRoXkUlIVwut0VdmfPMLmM/73kZwVljUJxf1rcp7lFfnQ1Vp7ZuqG08v60BKkD+cuwluHRes2aonKpA5FSidW+pqh8npzeF8BjXtuEn4rKiq1+88lqcbm+zC95sG2CfCahtA4wjIuN/DyvWOpApOQW4pPcovtS47Symbfle3elxO2orDlc60/zRz7JK56bEhuolsvSCoFq9dyU2E+ad2sSQnyeVzwlvmWsv80cAc5Ln7DgbqGB3xSWSR3IeVhTVB6j0w5vGSZ1IH+5Py1+Q2lVvh+W5hvQxrIql8s9sVWk1IEAjWheu4Q1xeVlNr+pKdc6nN8Ulj7YNkE+dcC0QMOIlmFf5pdIHYiUnG73p3nFD6TG+90aQXJJV7yXGhhwW3LsswdzInTaVgadU8aVZavTXWGzm92uZzunyO3vYnpizNIj+d8fL1MoFBFajUPGL2ODc7vd+RZbpd3xTOcU1jiDDBnVqqc6Js/bczhYo47SaVQy/lxRCJFntpba7PPbJ8ntCp30FiE2t/vL/BKtQhFr0NmbUy+nEiLPYiux2u5Ni2sXKJcJCFJ5pnPKI3uP7K82GVVqg2yW+T6TWijyLNYyu+OO5NiOspk24tEx2HhTYvSSw/mhGnWsvIdeDU6jUOSZrRaXc1xMZG9/mzYCXICxMRGVdsd3x8sUQrTUaWX76alWKEptdrfbHanXTE+Kljqcv1EqFM90Sn547xG9ShGu1Tarr3WVQlTYHE4hAtXK+1LjpQ4HaFzBavXj7ZMe2H04TKOOMejk3NYVCpFrspbZ7As7JkfpNFKH8zdDIsPsLvfKglKVQhFj0NpdMv3caQxKIfIttmKr7cG2CSlGg9ThnDf/KygLIdICA17pnvZHWVWhxVblcEodzlmF6tXxUSE9Q2WaftzaOvZgjelwraXC5rC4/G4dkQsXolantwjxu6sJ0KwEqlWvdEvdWl6da7ZW2B1Sh3NWIWpVr7Cg3mFB8pmbfKohEaFdggP3VNWUWB01Dvm+jA0uSKPuFhrYOzxYPpvBSkitUDzVKXl7RU22yVpul+8UkhCNukuI8ZLwIL0st4rtHGx8rnPy1vKaIpu9SsadUoMLUqs6Bhs7BhtaaFkgC83FjQnRR2otmbXmcpvD5JRpqmVQqZKN+uQAfVpQgNSx1EGvUj7fJeXP8uoceQ/kGpxGpUw2GhIDdHL7ZhRoJKEa9es90raWV+eYrZUybuwhGnWfsOBeYUHyzA2uaBneJSRwf7WpxGqrdTaj2lSQRtUjNLB3eLDfzU328MuCsmfCSP8WIVJH4ffSAgPSAuU4CAMghLgkLOgSvvm4OBFa9SDZrKsICXUPDeweGih1FP5Nr1KlRzD0ApqF1ka9fLYT9189w4KYwgI0B2RtFy9Gz+5W/keOs2AAAAAAAAAAADJEQRkAAAAAAAAA4BUKygAAAAAAAAAAr1BQBgAAAAAAAAB4hYIyAAAAAAAAAMArFJQBAAAAAAAAAF6hoAwAAAAAAAAA8AoFZQAAAAAAAACAVygoAwAAAAAAAAC8QkEZAAAAAAAAAOAVCsoAAAAAAAAAAK9QUAYAAAAAAAAAeIWCMgAAAAAAAADAKxSUAQAAAAAAAABeoaAMAAAAAAAAAPAKBWUAAAAAAAAAgFcoKAMAAAAAAAAAvEJBGQAAAAAAAADgFQrKAAAAAAAAAACvUFAGAAAAAAAAAHhF3dAHdLvdzoY+JgDICL0c8HfuC3+k23kxDweaGLfbJXUIkDW328UgBDjJ7WYIgfrQYQJ/18B9ZkMWlJVKTUlJxsqVsxrwmEAToFAopA4BDUXpctno5YDTJCUNvoBH6XTB33xzayOEA/ixLl2ulzoEyJRabdi5892dO9+VOhBARlq0aCt1CJAnpd1eS9YGnCYtbVQDHq0hC8oREe3HjHmrAQ8IALKiVuvp5YCGMmLE/0kdAgD4jS5druf7BgDwhk4XTNYGNDbWUAYAAAAAAAAAeIWCMgAAAAAAAADAKxSUAQAAAAAAAABeoaAMAAAAAAAAAPAKBWUAAAAAAAAAgFcoKAMAAAAAAAAAvEJBGQAAAAAAAADgFQrKAAAAAAAAAACvUFAGAAAAAAAAAHiFgjIAAAAAAAAAwCsUlAEAAAAAAAAAXqGgDAAAAAAAAADwCgVlAAAAAAAAAIBXKCgDAAAAAAAAALxCQRkAAAAAAAAA4BUKygAAAAAAAAAAr1BQBgAAAAAAAAB4hYIyAAAAAAAAAMArFJQBAAAAAAAAAF6hoAwAAAAAAAAA8AoFZQAAAAAAAACAV9R13up0WrdsedXnwQAAAAA4PyZTkdQh4DyUlh4g1QIAAH5CUfetbrf7zFvz89c1fkAAAAAAGkDLln1VKr3UUeDcLJbSsrLdUkcBAADgFaVSEx094Mzb6y4oAwAAAAAAAABwGtZQBgAAAAAAAAB4hYIyAAAAAAAAAMArFJQBAAAAAAAAAF6hoAwAAAAAAAAA8AoFZQAAAAAAAACAV/4f8r8yY7tTTw0AAAAASUVORK5CYII=",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "# Visualize graph\n",
+ "visualize_graph(super_graph, xray=True)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 29,
+ "metadata": {
+ "id": "nFts6LksM2H2"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mSearcher\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
+ "\n",
+ "1. **Introduction**\n",
+ "2. **Background and Related Work**\n",
+ "3. **Multi-Agent Architecture Overview**\n",
+ "4. **Techniques for Task Decomposition**\n",
+ "5. **Coordination and Communication Among Agents**\n",
+ "6. **Applications of Multi-Agent Systems**\n",
+ "7. **Challenges and Future Directions**\n",
+ "8. **Conclusion**\n",
+ "9. **References**\n",
+ "\n",
+ "Let's proceed with writing detailed content for each section.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "\n",
+ "## 1. Introduction\n",
+ "\n",
+ "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
+ "\n",
+ "## 2. Background and Related Work\n",
+ "\n",
+ "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "\n",
+ "## 3. Multi-Agent Architecture Overview\n",
+ "\n",
+ "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "\n",
+ "## 4. Techniques for Task Decomposition\n",
+ "\n",
+ "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "\n",
+ "## 5. Coordination and Communication Among Agents\n",
+ "\n",
+ "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
+ "\n",
+ "## 6. Applications of Multi-Agent Systems\n",
+ "\n",
+ "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "\n",
+ "## 7. Challenges and Future Directions\n",
+ "\n",
+ "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "\n",
+ "## 8. Conclusion\n",
+ "\n",
+ "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "\n",
+ "## 9. References\n",
+ "\n",
+ "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
+ "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
+ "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
+ "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
+ "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "This content will be saved as a Markdown (.md) file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSearcher\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
+ "\n",
+ "1. **Introduction**\n",
+ "2. **Background and Related Work**\n",
+ "3. **Multi-Agent Architecture Overview**\n",
+ "4. **Techniques for Task Decomposition**\n",
+ "5. **Coordination and Communication Among Agents**\n",
+ "6. **Applications of Multi-Agent Systems**\n",
+ "7. **Challenges and Future Directions**\n",
+ "8. **Conclusion**\n",
+ "9. **References**\n",
+ "\n",
+ "Let's proceed with writing detailed content for each section.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "\n",
+ "## 1. Introduction\n",
+ "\n",
+ "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
+ "\n",
+ "## 2. Background and Related Work\n",
+ "\n",
+ "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "\n",
+ "## 3. Multi-Agent Architecture Overview\n",
+ "\n",
+ "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "\n",
+ "## 4. Techniques for Task Decomposition\n",
+ "\n",
+ "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "\n",
+ "## 5. Coordination and Communication Among Agents\n",
+ "\n",
+ "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
+ "\n",
+ "## 6. Applications of Multi-Agent Systems\n",
+ "\n",
+ "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "\n",
+ "## 7. Challenges and Future Directions\n",
+ "\n",
+ "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "\n",
+ "## 8. Conclusion\n",
+ "\n",
+ "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "\n",
+ "## 9. References\n",
+ "\n",
+ "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
+ "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
+ "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
+ "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
+ "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "This content will be saved as a Markdown (.md) file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
+ "\n",
+ "1. **Introduction**\n",
+ "2. **Background and Related Work**\n",
+ "3. **Multi-Agent Architecture Overview**\n",
+ "4. **Techniques for Task Decomposition**\n",
+ "5. **Coordination and Communication Among Agents**\n",
+ "6. **Applications of Multi-Agent Systems**\n",
+ "7. **Challenges and Future Directions**\n",
+ "8. **Conclusion**\n",
+ "9. **References**\n",
+ "\n",
+ "Let's proceed with writing detailed content for each section.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "\n",
+ "## 1. Introduction\n",
+ "\n",
+ "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
+ "\n",
+ "## 2. Background and Related Work\n",
+ "\n",
+ "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "\n",
+ "## 3. Multi-Agent Architecture Overview\n",
+ "\n",
+ "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "\n",
+ "## 4. Techniques for Task Decomposition\n",
+ "\n",
+ "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "\n",
+ "## 5. Coordination and Communication Among Agents\n",
+ "\n",
+ "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
+ "\n",
+ "## 6. Applications of Multi-Agent Systems\n",
+ "\n",
+ "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "\n",
+ "## 7. Challenges and Future Directions\n",
+ "\n",
+ "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "\n",
+ "## 8. Conclusion\n",
+ "\n",
+ "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "\n",
+ "## 9. References\n",
+ "\n",
+ "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
+ "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
+ "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
+ "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
+ "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "This content will be saved as a Markdown (.md) file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
+ "\n",
+ "1. **Introduction**\n",
+ "2. **Background and Related Work**\n",
+ "3. **Multi-Agent Architecture Overview**\n",
+ "4. **Techniques for Task Decomposition**\n",
+ "5. **Coordination and Communication Among Agents**\n",
+ "6. **Applications of Multi-Agent Systems**\n",
+ "7. **Challenges and Future Directions**\n",
+ "8. **Conclusion**\n",
+ "9. **References**\n",
+ "\n",
+ "Let's proceed with writing detailed content for each section.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "\n",
+ "## 1. Introduction\n",
+ "\n",
+ "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
+ "\n",
+ "## 2. Background and Related Work\n",
+ "\n",
+ "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "\n",
+ "## 3. Multi-Agent Architecture Overview\n",
+ "\n",
+ "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "\n",
+ "## 4. Techniques for Task Decomposition\n",
+ "\n",
+ "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "\n",
+ "## 5. Coordination and Communication Among Agents\n",
+ "\n",
+ "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
+ "\n",
+ "## 6. Applications of Multi-Agent Systems\n",
+ "\n",
+ "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "\n",
+ "## 7. Challenges and Future Directions\n",
+ "\n",
+ "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "\n",
+ "## 8. Conclusion\n",
+ "\n",
+ "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "\n",
+ "## 9. References\n",
+ "\n",
+ "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
+ "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
+ "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
+ "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
+ "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
+ "\n",
+ "---\n",
+ "\n",
+ "This content will be saved as a Markdown (.md) file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: Document saved to Multi_Agent_Architecture_Report.md\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "assistant: The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mDocWriter\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mPaperWritingTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\n",
+ "user: The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================================\n"
+ ]
+ }
+ ],
+ "source": [
+ "output = run_graph(\n",
+ " super_graph,\n",
+ " \"\"\"Topic: How to perform complex tasks using multi-agent architecture\n",
+ "\n",
+ "Detailed guidelines:\n",
+ "- Generate a report in Arxiv paper format on the topic\n",
+ "- Create an outline\n",
+ "- Write at least 5 sentences for each outline section\n",
+ "- Create and add charts if needed when writing detailed content\n",
+ "- Write references in APA format\n",
+ "- Save final result as .md file\"\"\",\n",
+ " recursive_limit=150,\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "7QqT2jHdM5rC"
+ },
+ "source": [
+ "Display the final result in Markdown format:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 36,
+ "metadata": {
+ "id": "ceNsQ57_M6YC"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/markdown": [
+ "The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!"
+ ],
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "from IPython.display import Markdown\n",
+ "\n",
+ "if hasattr(output[\"messages\"][-1], \"content\"):\n",
+ " display(Markdown(output[\"messages\"][-1].content))\n",
+ "else:\n",
+ " display(Markdown(output[\"messages\"][-1]))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If you haven't seen the final result in Markdown format with the code above, please run the following code to check the results."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 39,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/markdown": [
+ "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "\n",
+ "## 1. Introduction\n",
+ "\n",
+ "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
+ "\n",
+ "## 2. Background and Related Work\n",
+ "\n",
+ "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "\n",
+ "## 3. Multi-Agent Architecture Overview\n",
+ "\n",
+ "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "\n",
+ "## 4. Techniques for Task Decomposition\n",
+ "\n",
+ "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "\n",
+ "## 5. Coordination and Communication Among Agents\n",
+ "\n",
+ "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
+ "\n",
+ "## 6. Applications of Multi-Agent Systems\n",
+ "\n",
+ "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "\n",
+ "## 7. Challenges and Future Directions\n",
+ "\n",
+ "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "\n",
+ "## 8. Conclusion\n",
+ "\n",
+ "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "\n",
+ "## 9. References\n",
+ "\n",
+ "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
+ "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
+ "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
+ "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
+ "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons."
+ ],
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "import os\n",
+ "import glob\n",
+ "\n",
+ "md_files = glob.glob(\"tmp/*.md\")\n",
+ "latest_md = max(md_files, key=os.path.getmtime)\n",
+ "with open(latest_md, \"r\", encoding=\"utf-8\") as f:\n",
+ " content = f.read()\n",
+ " display(Markdown(content))"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "provenance": [],
+ "toc_visible": true
+ },
+ "kernelspec": {
+ "display_name": "langchain-opentutorial-ZA2wmMtu-py3.11",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.11.11"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
From 1d85d09a6393f652aad6d47137b086409abb7f35 Mon Sep 17 00:00:00 2001
From: Harheem Kim <49297157+harheem@users.noreply.github.com>
Date: Sat, 25 Jan 2025 11:30:12 +0900
Subject: [PATCH 2/6] Update 08-Hierarchical-Multi-Agent-Teams.ipynb
---
.../08-Hierarchical-Multi-Agent-Teams.ipynb | 3179 +++++++++++++----
1 file changed, 2527 insertions(+), 652 deletions(-)
diff --git a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
index 0688920a7..354dc7305 100644
--- a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
+++ b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
@@ -66,19 +66,19 @@
},
{
"cell_type": "code",
- "execution_count": 1,
+ "execution_count": 16,
"metadata": {
"id": "dFiZS1vM3GB_"
},
"outputs": [],
"source": [
"%%capture --no-stderr\n",
- "%pip install langchain-opentutorial"
+ "%pip install -U langchain-opentutorial"
]
},
{
"cell_type": "code",
- "execution_count": 2,
+ "execution_count": 17,
"metadata": {
"id": "X5qTujC_3Hbs"
},
@@ -94,6 +94,7 @@
" \"langchain_core\",\n",
" \"langchain_experimental\",\n",
" \"langgraph\",\n",
+ " \"beautifulsoup4\",\n",
" ],\n",
" verbose=False,\n",
" upgrade=False,\n",
@@ -115,7 +116,7 @@
},
{
"cell_type": "code",
- "execution_count": 3,
+ "execution_count": 18,
"metadata": {
"id": "cgmm0kl61s0l"
},
@@ -157,11 +158,22 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 19,
"metadata": {
"id": "ML63b6Tu5QUO"
},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "False"
+ ]
+ },
+ "execution_count": 19,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
"source": [
"from dotenv import load_dotenv\n",
"\n",
@@ -187,19 +199,11 @@
},
{
"cell_type": "code",
- "execution_count": 4,
+ "execution_count": 20,
"metadata": {
"id": "4vT-Mlb45XxH"
},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "USER_AGENT environment variable not set, consider setting it to identify your requests.\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"from typing import List\n",
"from langchain_community.document_loaders import WebBaseLoader\n",
@@ -247,7 +251,7 @@
},
{
"cell_type": "code",
- "execution_count": 5,
+ "execution_count": 21,
"metadata": {
"id": "vzJ8v99U5gb3"
},
@@ -345,7 +349,7 @@
},
{
"cell_type": "code",
- "execution_count": 6,
+ "execution_count": 22,
"metadata": {
"id": "LJ15Z5Lh5ktg"
},
@@ -374,7 +378,7 @@
},
{
"cell_type": "code",
- "execution_count": 7,
+ "execution_count": 27,
"metadata": {
"id": "m-YR2KhBAR8Y"
},
@@ -404,7 +408,7 @@
"\n",
"\n",
"# Initialize LLM\n",
- "MODEL_NAME = \"gpt-4o\"\n",
+ "MODEL_NAME = \"gpt-4o-mini\"\n",
"llm = ChatOpenAI(model=MODEL_NAME, temperature=0)\n",
"\n",
"# Create Agent Factory instance\n",
@@ -422,7 +426,7 @@
},
{
"cell_type": "code",
- "execution_count": 8,
+ "execution_count": 28,
"metadata": {
"id": "f1UeOGUMAjPz"
},
@@ -447,7 +451,7 @@
},
{
"cell_type": "code",
- "execution_count": 9,
+ "execution_count": 29,
"metadata": {
"id": "_XlLBl1HAmSQ"
},
@@ -512,7 +516,7 @@
},
{
"cell_type": "code",
- "execution_count": 10,
+ "execution_count": 30,
"metadata": {
"id": "sUHIgF8ZAt4n"
},
@@ -570,7 +574,7 @@
},
{
"cell_type": "code",
- "execution_count": 11,
+ "execution_count": 31,
"metadata": {
"id": "e-JnX3jvA5Us"
},
@@ -593,49 +597,18 @@
},
{
"cell_type": "code",
- "execution_count": 12,
+ "execution_count": 40,
"metadata": {},
"outputs": [],
"source": [
- "from IPython.display import Image, display\n",
+ "from langchain_opentutorial.graphs import visualize_graph\n",
"from langgraph.graph import StateGraph\n",
- "from langgraph.graph.state import CompiledStateGraph\n",
- "from langgraph.checkpoint.memory import MemorySaver\n",
- "from dataclasses import dataclass\n",
- "\n",
- "\n",
- "# Define node styles\n",
- "@dataclass\n",
- "class NodeStyles:\n",
- " default: str = (\n",
- " \"fill:#45C4B0, fill-opacity:0.3, color:#23260F, stroke:#45C4B0, stroke-width:1px, font-weight:bold, line-height:1.2\"\n",
- " )\n",
- " first: str = (\n",
- " \"fill:#45C4B0, fill-opacity:0.1, color:#23260F, stroke:#45C4B0, stroke-width:1px, font-weight:normal, font-style:italic, stroke-dasharray:2,2\"\n",
- " )\n",
- " last: str = (\n",
- " \"fill:#45C4B0, fill-opacity:1, color:#000000, stroke:#45C4B0, stroke-width:1px, font-weight:normal, font-style:italic, stroke-dasharray:2,2\"\n",
- " )\n",
- "\n",
- "\n",
- "def visualize_graph(graph, xray=False):\n",
- " try:\n",
- " if isinstance(graph, CompiledStateGraph):\n",
- " display(\n",
- " Image(\n",
- " graph.get_graph(xray=xray).draw_mermaid_png(\n",
- " background_color=\"white\",\n",
- " node_colors=NodeStyles(),\n",
- " )\n",
- " )\n",
- " )\n",
- " except Exception as e:\n",
- " print(f\"[ERROR] Visualize Graph Error: {e}\")"
+ "from langgraph.checkpoint.memory import MemorySaver"
]
},
{
"cell_type": "code",
- "execution_count": 13,
+ "execution_count": 36,
"metadata": {
"id": "qcwMe2OoKwc3"
},
@@ -692,59 +665,12 @@
},
{
"cell_type": "code",
- "execution_count": 14,
+ "execution_count": 41,
"metadata": {},
"outputs": [],
"source": [
- "import uuid\n",
- "from typing import Dict, Any\n",
"from langchain_core.runnables import RunnableConfig\n",
- "from langchain_core.messages import HumanMessage\n",
- "from langchain_core.messages import BaseMessage\n",
- "\n",
- "\n",
- "def random_uuid() -> str:\n",
- " \"\"\"Generate a random UUID string.\"\"\"\n",
- " return str(uuid.uuid4())\n",
- "\n",
- "\n",
- "def pretty_print_message(message: BaseMessage) -> None:\n",
- " \"\"\"Print message with role-based formatting.\"\"\"\n",
- " role = \"user\" if isinstance(message, HumanMessage) else \"assistant\"\n",
- " print(f\"\\n{role}: {message.content}\")\n",
- "\n",
- "\n",
- "def format_namespace(namespace: list) -> str:\n",
- " \"\"\"Format namespace for display.\"\"\"\n",
- " return namespace[-1].split(\":\")[0] if namespace else \"root graph\"\n",
- "\n",
- "\n",
- "def display_node_output(node_name: str, namespace: list, chunk: Dict[str, Any]) -> None:\n",
- " \"\"\"Display formatted node output.\"\"\"\n",
- " print(\"\\n\" + \"=\" * 50)\n",
- " formatted_namespace = format_namespace(namespace)\n",
- "\n",
- " if formatted_namespace == \"root graph\":\n",
- " print(f\"🔄 Node: \\033[1;36m{node_name}\\033[0m 🔄\")\n",
- " else:\n",
- " print(\n",
- " f\"🔄 Node: \\033[1;36m{node_name}\\033[0m in [\\033[1;33m{formatted_namespace}\\033[0m] 🔄\"\n",
- " )\n",
- " print(\"- \" * 25)\n",
- "\n",
- " for k, v in chunk.items():\n",
- " if isinstance(v, BaseMessage):\n",
- " pretty_print_message(v)\n",
- " elif isinstance(v, list):\n",
- " for item in v:\n",
- " if isinstance(item, BaseMessage):\n",
- " pretty_print_message(item)\n",
- " else:\n",
- " print(item)\n",
- " elif isinstance(v, dict):\n",
- " for key, value in v.items():\n",
- " print(f\"{key}:\\n{value}\")\n",
- " print(\"=\" * 50)\n",
+ "from langchain_opentutorial.messages import random_uuid, invoke_graph\n",
"\n",
"\n",
"def run_graph(app, message: str, recursive_limit: int = 50):\n",
@@ -759,18 +685,14 @@
" }\n",
"\n",
" # Execute graph and display output\n",
- " for namespace, chunk in app.stream(\n",
- " inputs, config, stream_mode=\"updates\", subgraphs=True\n",
- " ):\n",
- " for node_name, node_chunk in chunk.items():\n",
- " display_node_output(node_name, namespace, node_chunk)\n",
+ " invoke_graph(app, inputs, config)\n",
"\n",
" return app.get_state(config).values"
]
},
{
"cell_type": "code",
- "execution_count": 15,
+ "execution_count": 42,
"metadata": {
"id": "pUIWVFjOLQoC"
},
@@ -783,20 +705,28 @@
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "WebScraper\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\n",
- "assistant: \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " scrape_webpages (call_1Z7X9foecra9LF4NABoCmYw9)\n",
+ " Call ID: call_1Z7X9foecra9LF4NABoCmYw9\n",
+ " Args:\n",
+ " urls: ['https://finance.yahoo.com/']\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: scrape_webpages\n",
"\n",
- "assistant: \n",
+ "\n",
"\n",
"\n",
"\n",
@@ -862,7 +792,7 @@
"\n",
"\n",
"\n",
- " News Today's news US Politics World Tech Reviews and deals Audio Computing Gaming Health Home Phones Science TVs Climate change Health Science Originals The 360 Newsletters Life Health COVID-19 Fall allergies Health news Mental health Relax Sexual health Studies The Unwind Parenting Family health So mini ways Style and beauty It Figures Unapologetically Horoscopes Shopping Buying guides Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to Watch Interviews Videos Finance My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Biden Economy Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth Invest ETF Report YF Awards 2024 Streaming Now Sports Fantasy News Fantasy football Best Ball Pro Pick 'Em College Pick 'Em Fantasy baseball Fantasy hockey Fantasy basketball Download the app Daily fantasy NFL News Scores and schedules Standings Stats Teams Players Drafts Injuries Odds Super Bowl GameChannel Videos MLB News Scores and schedules Standings Stats Teams Players Odds Videos World Baseball Classic NBA News Draft Scores and schedules Standings Stats Teams Players Injuries Videos Odds Playoffs NHL News Scores and schedules Standings Stats Teams Players Odds Playoffs Soccer News Scores and schedules Premier League MLS NWSL Liga MX CONCACAF League Champions League La Liga Serie A Bundesliga Ligue 1 World Cup College football News Scores and schedules Standings Rankings Stats Teams Show all MMA WNBA Sportsbook NCAAF Tennis Golf NASCAR NCAAB NCAAW Boxing USFL Cycling Motorsports Olympics Horse racing GameChannel Rivals Newsletters Podcasts Videos RSS Jobs Help World Cup More news New on Yahoo Creators Games Tech Terms Privacy Privacy Dashboard Feedback © 2025 All rights reserved. About our ads Advertising Careers Yahoo Finance Yahoo Finance Search query Select edition USEnglish US y LATAMEspañol AustraliaEnglish CanadaEnglish CanadaFrançais DeutschlandDeutsch FranceFrançais 香港繁中 MalaysiaEnglish New ZealandEnglish SingaporeEnglish 台灣繁中 UKEnglish News Finance Sports More News Today's news US Politics World Weather Climate change Health Science Originals Newsletters Life Health Parenting Style and beauty Horoscopes Shopping Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to watch Interviews Videos Finance My portfolio Watchlists Markets News Videos Screeners Personal finance Crypto Industries Sports Fantasy NFL NBA MLB NHL College football College basketball Soccer MMA Yahoo Sports AM New on Yahoo Creators Games Tech Selected edition USEnglish Mail Sign in My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Biden Economy Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth Invest ETF Report YF Awards 2024 Streaming Now … BofA chief: The Fed may have to respond to Trump Bank of America CEO Brian Moynihan says the Fed needs to start asking itself a different question. Yahoo Finance • 2 hours ago PNC +0.59% BAC +0.51% Stocks rise as Trump holds off on tariff bazooka LIVE Yahoo Finance • 41 seconds ago ES=F +0.53% ^GSPC +0.39% Elon Musk stamps place in Trump's orbit during inauguration Associated Press Finance • 13 hours ago Apple's tough January worsens as analysts cut on iPhone weakness Bloomberg • 1 hour ago AAPL -3.44% Trump eyes historic 25% tariffs on Canada and Mexico Yahoo Finance • 12 hours ago Trump, Melania coins are 'gambling tokens': Anthony Scaramucci Yahoo Finance • 1 hour ago HOOD +0.97% DJT -9.34% Bitcoin holds retreat as Trump omits crypto from Day 1 orders Bloomberg • 2 hours ago Apple's iPhone sales in China plunged 18% in holiday quarter Bloomberg • 9 hours ago AAPL -3.44% Trump unleashes a wave of first-day orders Yahoo Finance • 12 hours ago DJT -9.34% DJTWW -12.70% Schwab rises after earnings top estimates on record inflows Bloomberg • 1 hour ago SCHW +6.60% View More Latest Exclusive-International Paper to win EU approval for $7.1 billion DS Smith deal Reuters • 22 minutes ago IP +1.10% DSSMY 0.00% US energy mergers may slow in 2025 as deal sizes shrink, says Enverus Reuters • 25 minutes ago France Gets Record Demand for Bond Sale After Long Absence Bloomberg • 35 minutes ago Goldman Sachs Promotes Its Next Generation of Leaders The Wall Street Journal • 37 minutes ago GSC +0.68% Goldman Sachs shuffles management to strengthen Wall Street businesses Reuters • 37 minutes ago GSC +0.68% The Risks of Quantum Computing to Cryptocurrency, Bitcoin, and Blockchain TheStreet • 38 minutes ago Climate shift to help boost StanChart income by almost $1 billion, CEO says Reuters • 38 minutes ago SCBFF 0.00% Milei ‘The Madman’ Is Back in Davos After Racking Up Victories in Argentina Bloomberg • 40 minutes ago Popular Goldman Picks Leaders to Run Wall Street Engines in Major Revamp Bloomberg • 1 hour ago GSC +0.68% Kohl’s Shoppers Want Its New CEO to Undo What the Last One Did The Wall Street Journal • yesterday ETSY -0.23% KSS +3.73% Guggenheim expects Fed to cut rates about every quarter in 2025 Reuters • 21 hours ago ^GSPC +0.39% Who Was Better For The Stock Market? Biden Or Trump? Investor's Business Daily • 16 hours ago Stock market today: Dow, S&P 500, Nasdaq rise with investors on Trump tariff watch LIVE Yahoo Finance • 41 seconds ago ES=F +0.53% ^GSPC +0.39% Weekly Stock List Argus Research • 2 hours ago ILMN +0.76% MS -1.17% Bitcoin Hits Record, Dollar Falls on Trump’s Inauguration Day The Wall Street Journal • 20 hours ago ^HSI +0.91% Stocks rise on as Trump starts his first full day in office: Yahoo Finance It's a big day for both Washington and Wall Street. President Donald Trump is kicking off his first full day in office. Trump announced a number of executive actions last night, including the creation of the Elon Musk-led Department of Government Efficiency. One thing Trump did not do was enact new tariffs, though he did threaten to impose tariffs of 25% on goods from Mexico and Canada on February 1. The delay has some investors breathing a sigh of relief, with stocks moving higher in early trading. Some of the trending tickers on Yahoo Finance include Tesla (TSLA), Nvidia (NVDA), and Apple (AAPL). 3d ago 02:52 Trump signs executive actions Yahoo Finance Video • 13h ago 07:52 Trump 2.0 is 'great news' for small & midsize businesses: BofA CEO Yahoo Finance Video • 2h ago BAC +0.51% 01:35 Trump executive actions, tariffs, bitcoin slips: 3 Things Yahoo Finance Video • 28m ago ES=F +0.53% YM=F +0.47% 01:49 Fed 'has to think about' inflation and Trump's proposed tariffs Yahoo Finance Video • 1h ago BAC +0.51% View More Investment Ideas Build Your Wealth View More More News Copyright © 2025 Yahoo. All rights reserved. POPULAR QUOTES Dow Jones S&P 500 DAX Index Nvidia Tesla DJT EXPLORE MORE Mortgages Credit Cards Sectors Crypto Heatmap Biden Economy Financial News ABOUT Data Disclaimer Help Feedback Sitemap Licensing What's New About Our Ads Terms and Privacy Policy Privacy Dashboard U.S. markets close in 6h 20m US Europe Asia Rates Commodities Currencies Cryptocurrencies S&P 500 6,019.81 +23.15 (+0.39%) Dow 30 43,684.00 +196.17 (+0.45%) Nasdaq 19,635.90 +5.70 (+0.03%) Russell 2000 2,275.88 +9.09 (+0.40%) Crude Oil 76.08 -1.80 (-2.31%) Gold 2,742.20 -6.50 (-0.24%) My Portfolios My Portfolios Sign in to access your portfolio Sign in Top Gainers TEM Tempus AI, Inc 43.07 +7.92 (+22.53%) RGTI Rigetti Computing, Inc. 12.02 +2.19 (+22.26%) RKLB Rocket Lab USA, Inc. 27.76 +3.76 (+15.67%) GH Guardant Health, Inc. 41.56 +4.90 (+13.37%) OKLO Oklo Inc. 29.50 +3.45 (+13.24%) Top Losers FTAI FTAI Aviation Ltd. 81.26 -31.12 (-27.69%) EDU New Oriental Education & Technology Group Inc. 45.67 -15.17 (-24.96%) DJTWW Trump Media & Technology Group Corp. 24.80 -4.18 (-14.42%) DJT Trump Media & Technology Group Corp. 35.32 -4.71 (-11.76%) IREN IREN Limited 10.80 -1.19 (-9.92%) Most Active RGTI Rigetti Computing, Inc. 12.02 +2.19 (+22.26%) NVDA NVIDIA Corporation 139.35 +1.64 (+1.19%) INTC Intel Corporation 22.22 +0.73 (+3.37%) TSLA Tesla, Inc. 417.27 -9.23 (-2.16%) AAPL Apple Inc. 222.14 -7.84 (-3.41%) Trending Tickers AAPL Apple Inc. 222.12 -7.86 (-3.42%) TSLA Tesla, Inc. 414.67 -11.83 (-2.77%) NVDA NVIDIA Corporation 137.84 +0.12 (+0.09%) PLTR Palantir Technologies Inc. 72.45 +0.68 (+0.95%) TEM Tempus AI, Inc 42.78 +7.62 (+21.69%) Top Economic Events Terms and Privacy Policy Privacy Dashboard Ad Terms Feedback \n",
+ " News Today's news US Politics World Tech Reviews and deals Audio Computing Gaming Health Home Phones Science TVs Climate change Health Science Originals The 360 Newsletters Life Health COVID-19 Fall allergies Health news Mental health Relax Sexual health Studies The Unwind Parenting Family health So mini ways Style and beauty It Figures Unapologetically Horoscopes Shopping Buying guides Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to Watch Interviews Videos Finance My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth ETF Report Options 101 Davos 2025 Streaming Now Sports Fantasy News Fantasy football Best Ball Pro Pick 'Em College Pick 'Em Fantasy baseball Fantasy hockey Fantasy basketball Download the app Daily fantasy NFL News Scores and schedules Standings Stats Teams Players Drafts Injuries Odds Super Bowl GameChannel Videos MLB News Scores and schedules Standings Stats Teams Players Odds Videos World Baseball Classic NBA News Draft Scores and schedules Standings Stats Teams Players Injuries Videos Odds Playoffs NHL News Scores and schedules Standings Stats Teams Players Odds Playoffs Soccer News Scores and schedules Premier League MLS NWSL Liga MX CONCACAF League Champions League La Liga Serie A Bundesliga Ligue 1 World Cup College football News Scores and schedules Standings Rankings Stats Teams Show all MMA WNBA Sportsbook NCAAF Tennis Golf NASCAR NCAAB NCAAW Boxing USFL Cycling Motorsports Olympics Horse racing GameChannel Rivals Newsletters Podcasts Videos RSS Jobs Help World Cup More news New on Yahoo Creators Games Tech Local Services Terms Privacy Privacy Dashboard Feedback © 2025 All rights reserved. About our ads Advertising Careers Yahoo Finance Yahoo Finance Search query Select edition USEnglish US y LATAMEspañol AustraliaEnglish CanadaEnglish CanadaFrançais DeutschlandDeutsch FranceFrançais 香港繁中 MalaysiaEnglish New ZealandEnglish SingaporeEnglish 台灣繁中 UKEnglish News Finance Sports More News Today's news US Politics World Weather Climate change Health Science Originals Newsletters Life Health Parenting Style and beauty Horoscopes Shopping Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to watch Interviews Videos Finance My portfolio Watchlists Markets News Videos Screeners Personal finance Crypto Industries Sports Fantasy NFL NBA MLB NHL College football College basketball Soccer MMA Yahoo Sports AM New on Yahoo Creators Games Tech Local Services Selected edition USEnglish Mail Sign in My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth ETF Report Options 101 Davos 2025 Streaming Now … Trump's 'one very big power over China' could seal the fate of TikTok President Donald Trump indicated that imposing tariffs on China could be an outcome of a TikTok deal falling through. Yahoo Finance • 8 hours ago ^DJI -0.32% AI-driven power boom will drive demand 38% higher on top US grid Bloomberg • 2 hours ago Banks prepare to offload as much as $3 billion of X buyout debt Bloomberg • 2 hours ago Stocks stall but cap winning week as Trump's remarks move markets Yahoo Finance • 4 hours ago ^IXIC -0.50% ES=F -0.33% Ray Dalio: It's time for society to think about alternative money Yahoo Finance • 6 hours ago DX=F -0.56% BofA, JPMorgan to lobby DC after conservative criticism Reuters • 2 hours ago BAC +0.28% Target ending DEI initiatives amid Trump order on diversity programs Reuters • 2 hours ago TGT +0.47% Apple tests key technical level in worst start since 2008 Bloomberg • 4 hours ago AAPL -0.39% ^GSPC -0.29% Bankrupt Purdue buys time to advance $7.4 billion opioid deal Reuters • 4 hours ago Tesla launches refreshed Model Y in bid to boost sales Yahoo Finance • 10 hours ago TSLA -1.41% View More Latest In motion to dismiss, chatbot platform Character AI claims it is protected by the First Amendment TechCrunch • 1 hour ago Meta reveals how it plans to make money on Threads, its X competitor with 300 million users Business Insider • 1 hour ago Banks Prepare to Sell as Much as $3 Billion of X Buyout Debt Bloomberg • 2 hours ago AI-Driven Power Boom Will Drive Demand 38% Higher on Top US Grid Bloomberg • 2 hours ago Did Trump's Crypto Order Deliver On His Promise Of A Bitcoin Strategic Reserve? Investopedia • 2 hours ago Meta Plans to Spend as Much as $65B in 'Defining Year for AI' Says Zuckerberg Investopedia • 2 hours ago META +1.73% Existing Home Sales Hit a Multidecade Low. Why Are Economists Upbeat About the Housing Market? Investopedia • 2 hours ago Texas Instruments Stock Leads Analog Chip Shares Lower Investopedia • 2 hours ago TXN -7.52% ADI -4.81% Popular Tesla Just Raised Prices in a Key Market. What It Means for the Stock. Barrons.com • yesterday TSLA -1.41% Lightspeed’s $2 Billion Anthropic Megadeal Cements VC Firm’s AI Ambitions Bloomberg • 5 hours ago As Apple Earnings Near, More Investors Think the iPhone Maker’s Stock Is Primed to Fall Barrons.com • yesterday AAPL -0.39% American Express Earnings Show Its Card Members Keep Spending. Why the Stock Is Falling. Barrons.com • 11 hours ago AXP -1.39% Crypto's charms leave some investors in Davos cold Reuters • yesterday Analyst Report: Regions Financial Corporation Morningstar Research • 4 hours ago RF +0.87% RF-PE -0.49% MicroStrategy Suddenly Has a Tax Problem, and Needs Help From Trump’s IRS The Wall Street Journal • 15 hours ago BRK-B +0.73% MSTR -5.21% Big Tech earnings, Fed rate decision: What to Watch Next Week Market Domination Overtime hosts Julie Hyman and Josh Lipton detail the top market events on the horizon for the upcoming week. A slew of Big Tech earnings is scheduled for release, with industry giants Tesla (TSLA), Microsoft (MSFT), Meta Platforms (META), and Apple (AAPL) preparing to unveil their quarterly results. Artificial intelligence will be a central theme in these earnings reports. Additionally, the Federal Reserve's January Federal Open Market Committee (FOMC) meeting will commence on Tuesday, with the central bank's interest rate decision set to be announced on Wednesday at 2 p.m. ET. To watch more expert insights and analysis on the latest market action, check out more Market Domination Overtime here. This post was written by Angel Smith 2h ago META +1.73% TSLA -1.41% 01:01 President Trump, Davos 2025, Netflix's Q4 earnings: ICYMI Yahoo Finance Video • 2h ago 05:40 Fed will do 'whatever it takes' to hit dual mandate Yahoo Finance Video • 2h ago ^GSPC -0.29% ^DJI -0.32% 03:08 Strong US dollar is sending Americans to Europe. Here's why. Yahoo Finance Video • 2h ago DX=F -0.56% DX-Y.NYB -0.54% 03:14 Stargate & OpenAI announcement is 'net positive' for Microsoft Yahoo Finance Video • 3h ago MSFT -0.59% View More Investment Ideas Build Your Wealth View More More News Copyright © 2025 Yahoo. All rights reserved. POPULAR QUOTES Dow Jones S&P 500 DAX Index Nvidia Tesla DJT EXPLORE MORE Mortgages Credit Cards Sectors Crypto Heatmap Financial News ABOUT Data Disclaimer Help Feedback Sitemap Licensing What's New About Our Ads Terms and Privacy Policy Privacy Dashboard U.S. markets closed US Europe Asia Rates Commodities Currencies Cryptocurrencies S&P 500 6,101.24 -17.47 (-0.29%) Dow 30 44,424.25 -140.82 (-0.32%) Nasdaq 19,954.30 -99.38 (-0.50%) Russell 2000 2,307.74 -6.86 (-0.30%) Crude Oil 74.60 -0.02 (-0.03%) Gold 2,777.40 +12.40 (+0.45%) My Portfolios My Portfolios Sign in to access your portfolio Sign in Top Gainers TWLO Twilio Inc. 136.23 +22.83 (+20.13%) MIDD The Middleby Corporation 169.72 +24.07 (+16.53%) SGHC Super Group (SGHC) Limited 7.90 +1.09 (+16.01%) DAN Dana Incorporated 15.20 +1.97 (+14.89%) KC Kingsoft Cloud Holdings Limited 10.79 +1.25 (+13.10%) Top Losers ERIC Telefonaktiebolaget LM Ericsson (publ) 7.62 -1.28 (-14.38%) FNMA Federal National Mortgage Association 5.73 -0.77 (-11.85%) FTAI FTAI Aviation Ltd. 90.98 -10.52 (-10.36%) FMCC Federal Home Loan Mortgage Corporation 5.24 -0.44 (-7.75%) TXN Texas Instruments Incorporated 185.52 -15.09 (-7.52%) Most Active RGTI Rigetti Computing, Inc. 13.20 -0.27 (-2.00%) NVDA NVIDIA Corporation 142.62 -4.60 (-3.12%) LCID Lucid Group, Inc. 2.7900 +0.0700 (+2.57%) PLTR Palantir Technologies Inc. 78.98 0.00 (0.00%) BBD Banco Bradesco S.A. 1.9300 -0.0100 (-0.52%) Trending Tickers NVNI Nvni Group Limited 4.4000 +2.9100 (+195.30%) GOEV Canoo Inc. 0.4750 +0.1450 (+43.94%) NVO Novo Nordisk A/S 87.97 +6.87 (+8.47%) VG Vonage Holdings Corp. 24.00 -1.00 (-4.00%) MSTR MicroStrategy Incorporated 353.67 -19.45 (-5.21%) Top Economic Events Terms and Privacy Policy Privacy Dashboard Ad Terms Feedback \n",
"\n",
"\n",
"\n",
@@ -907,64 +837,65 @@
"==================================================\n",
"🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "assistant: Here are the main news highlights from Yahoo Finance:\n",
- "\n",
- "1. **Federal Reserve and Trump**: Bank of America CEO Brian Moynihan suggests that the Federal Reserve may need to reconsider its approach in response to actions by former President Trump. [Source](https://finance.yahoo.com/)\n",
+ "Here are the main news highlights from Yahoo Finance:\n",
"\n",
- "2. **Stock Market Movements**: Stocks have risen as Trump holds off on implementing significant tariffs. [Source](https://finance.yahoo.com/)\n",
+ "1. **Trump's Influence on TikTok**: Former President Donald Trump suggested that imposing tariffs on China could be a consequence if a deal regarding TikTok fails. [Source](https://finance.yahoo.com/)\n",
"\n",
- "3. **Elon Musk and Trump**: Elon Musk has positioned himself within Trump's circle during the inauguration. [Source](https://finance.yahoo.com/)\n",
+ "2. **AI-Driven Power Demand**: A report indicates that demand for power on the US grid is expected to rise by 38% due to advancements in AI technology. [Source](https://finance.yahoo.com/)\n",
"\n",
- "4. **Apple's Challenges**: Apple's January performance has worsened due to analysts cutting forecasts based on iPhone sales weakness. [Source](https://finance.yahoo.com/)\n",
+ "3. **Bank Debt Offloading**: Banks are preparing to sell up to $3 billion in debt related to the acquisition of X, the social media platform. [Source](https://finance.yahoo.com/)\n",
"\n",
- "5. **Trump's Tariff Plans**: Trump is considering imposing historic 25% tariffs on Canada and Mexico. [Source](https://finance.yahoo.com/)\n",
+ "4. **Market Reactions to Trump's Remarks**: Stock markets experienced a stall but capped a winning week following Trump's comments, which influenced market movements. [Source](https://finance.yahoo.com/)\n",
"\n",
- "6. **Cryptocurrency and Trump**: Bitcoin has retreated as Trump did not include cryptocurrency in his initial executive orders. [Source](https://finance.yahoo.com/)\n",
+ "5. **Ray Dalio on Alternative Money**: Investor Ray Dalio emphasized the need for society to consider alternative forms of money. [Source](https://finance.yahoo.com/)\n",
"\n",
- "7. **Schwab's Earnings**: Schwab's earnings have exceeded estimates due to record inflows. [Source](https://finance.yahoo.com/)\n",
+ "6. **Target's Diversity Initiatives**: Target is set to end its diversity, equity, and inclusion initiatives in response to an executive order from Trump. [Source](https://finance.yahoo.com/)\n",
"\n",
- "8. **International Paper Deal**: International Paper is set to receive EU approval for a $7.1 billion deal with DS Smith. [Source](https://finance.yahoo.com/)\n",
+ "7. **Apple's Stock Performance**: Apple is testing a critical technical level as it faces its worst stock performance start since 2008. [Source](https://finance.yahoo.com/)\n",
"\n",
- "9. **US Energy Mergers**: There is an expectation that US energy mergers may slow down in 2025 as deal sizes shrink. [Source](https://finance.yahoo.com/)\n",
+ "8. **Purdue Pharma's Bankruptcy**: Purdue Pharma is seeking to advance a $7.4 billion settlement related to its opioid crisis bankruptcy. [Source](https://finance.yahoo.com/)\n",
"\n",
- "10. **Goldman Sachs Leadership**: Goldman Sachs is promoting its next generation of leaders and shuffling management to strengthen its Wall Street businesses. [Source](https://finance.yahoo.com/)\n",
+ "9. **Tesla's Model Y Refresh**: Tesla has launched a refreshed version of its Model Y in an effort to boost sales. [Source](https://finance.yahoo.com/)\n",
"\n",
- "These summaries provide a snapshot of the current financial news and market trends as reported by Yahoo Finance.\n",
+ "For more details, you can visit [Yahoo Finance](https://finance.yahoo.com/).\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mWebScraper\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: WebScraper\n",
"\n",
- "user: Here are the main news highlights from Yahoo Finance:\n",
- "\n",
- "1. **Federal Reserve and Trump**: Bank of America CEO Brian Moynihan suggests that the Federal Reserve may need to reconsider its approach in response to actions by former President Trump. [Source](https://finance.yahoo.com/)\n",
+ "Here are the main news highlights from Yahoo Finance:\n",
"\n",
- "2. **Stock Market Movements**: Stocks have risen as Trump holds off on implementing significant tariffs. [Source](https://finance.yahoo.com/)\n",
+ "1. **Trump's Influence on TikTok**: Former President Donald Trump suggested that imposing tariffs on China could be a consequence if a deal regarding TikTok fails. [Source](https://finance.yahoo.com/)\n",
"\n",
- "3. **Elon Musk and Trump**: Elon Musk has positioned himself within Trump's circle during the inauguration. [Source](https://finance.yahoo.com/)\n",
+ "2. **AI-Driven Power Demand**: A report indicates that demand for power on the US grid is expected to rise by 38% due to advancements in AI technology. [Source](https://finance.yahoo.com/)\n",
"\n",
- "4. **Apple's Challenges**: Apple's January performance has worsened due to analysts cutting forecasts based on iPhone sales weakness. [Source](https://finance.yahoo.com/)\n",
+ "3. **Bank Debt Offloading**: Banks are preparing to sell up to $3 billion in debt related to the acquisition of X, the social media platform. [Source](https://finance.yahoo.com/)\n",
"\n",
- "5. **Trump's Tariff Plans**: Trump is considering imposing historic 25% tariffs on Canada and Mexico. [Source](https://finance.yahoo.com/)\n",
+ "4. **Market Reactions to Trump's Remarks**: Stock markets experienced a stall but capped a winning week following Trump's comments, which influenced market movements. [Source](https://finance.yahoo.com/)\n",
"\n",
- "6. **Cryptocurrency and Trump**: Bitcoin has retreated as Trump did not include cryptocurrency in his initial executive orders. [Source](https://finance.yahoo.com/)\n",
+ "5. **Ray Dalio on Alternative Money**: Investor Ray Dalio emphasized the need for society to consider alternative forms of money. [Source](https://finance.yahoo.com/)\n",
"\n",
- "7. **Schwab's Earnings**: Schwab's earnings have exceeded estimates due to record inflows. [Source](https://finance.yahoo.com/)\n",
+ "6. **Target's Diversity Initiatives**: Target is set to end its diversity, equity, and inclusion initiatives in response to an executive order from Trump. [Source](https://finance.yahoo.com/)\n",
"\n",
- "8. **International Paper Deal**: International Paper is set to receive EU approval for a $7.1 billion deal with DS Smith. [Source](https://finance.yahoo.com/)\n",
+ "7. **Apple's Stock Performance**: Apple is testing a critical technical level as it faces its worst stock performance start since 2008. [Source](https://finance.yahoo.com/)\n",
"\n",
- "9. **US Energy Mergers**: There is an expectation that US energy mergers may slow down in 2025 as deal sizes shrink. [Source](https://finance.yahoo.com/)\n",
+ "8. **Purdue Pharma's Bankruptcy**: Purdue Pharma is seeking to advance a $7.4 billion settlement related to its opioid crisis bankruptcy. [Source](https://finance.yahoo.com/)\n",
"\n",
- "10. **Goldman Sachs Leadership**: Goldman Sachs is promoting its next generation of leaders and shuffling management to strengthen its Wall Street businesses. [Source](https://finance.yahoo.com/)\n",
+ "9. **Tesla's Model Y Refresh**: Tesla has launched a refreshed version of its Model Y in an effort to boost sales. [Source](https://finance.yahoo.com/)\n",
"\n",
- "These summaries provide a snapshot of the current financial news and market trends as reported by Yahoo Finance.\n",
+ "For more details, you can visit [Yahoo Finance](https://finance.yahoo.com/).\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "FINISH\n",
"==================================================\n"
]
}
@@ -978,7 +909,7 @@
},
{
"cell_type": "code",
- "execution_count": 16,
+ "execution_count": 43,
"metadata": {
"id": "z6fQqA6VLT0Y"
},
@@ -989,27 +920,25 @@
"text": [
"Here are the main news highlights from Yahoo Finance:\n",
"\n",
- "1. **Federal Reserve and Trump**: Bank of America CEO Brian Moynihan suggests that the Federal Reserve may need to reconsider its approach in response to actions by former President Trump. [Source](https://finance.yahoo.com/)\n",
+ "1. **Trump's Influence on TikTok**: Former President Donald Trump suggested that imposing tariffs on China could be a consequence if a deal regarding TikTok fails. [Source](https://finance.yahoo.com/)\n",
"\n",
- "2. **Stock Market Movements**: Stocks have risen as Trump holds off on implementing significant tariffs. [Source](https://finance.yahoo.com/)\n",
+ "2. **AI-Driven Power Demand**: A report indicates that demand for power on the US grid is expected to rise by 38% due to advancements in AI technology. [Source](https://finance.yahoo.com/)\n",
"\n",
- "3. **Elon Musk and Trump**: Elon Musk has positioned himself within Trump's circle during the inauguration. [Source](https://finance.yahoo.com/)\n",
+ "3. **Bank Debt Offloading**: Banks are preparing to sell up to $3 billion in debt related to the acquisition of X, the social media platform. [Source](https://finance.yahoo.com/)\n",
"\n",
- "4. **Apple's Challenges**: Apple's January performance has worsened due to analysts cutting forecasts based on iPhone sales weakness. [Source](https://finance.yahoo.com/)\n",
+ "4. **Market Reactions to Trump's Remarks**: Stock markets experienced a stall but capped a winning week following Trump's comments, which influenced market movements. [Source](https://finance.yahoo.com/)\n",
"\n",
- "5. **Trump's Tariff Plans**: Trump is considering imposing historic 25% tariffs on Canada and Mexico. [Source](https://finance.yahoo.com/)\n",
+ "5. **Ray Dalio on Alternative Money**: Investor Ray Dalio emphasized the need for society to consider alternative forms of money. [Source](https://finance.yahoo.com/)\n",
"\n",
- "6. **Cryptocurrency and Trump**: Bitcoin has retreated as Trump did not include cryptocurrency in his initial executive orders. [Source](https://finance.yahoo.com/)\n",
+ "6. **Target's Diversity Initiatives**: Target is set to end its diversity, equity, and inclusion initiatives in response to an executive order from Trump. [Source](https://finance.yahoo.com/)\n",
"\n",
- "7. **Schwab's Earnings**: Schwab's earnings have exceeded estimates due to record inflows. [Source](https://finance.yahoo.com/)\n",
+ "7. **Apple's Stock Performance**: Apple is testing a critical technical level as it faces its worst stock performance start since 2008. [Source](https://finance.yahoo.com/)\n",
"\n",
- "8. **International Paper Deal**: International Paper is set to receive EU approval for a $7.1 billion deal with DS Smith. [Source](https://finance.yahoo.com/)\n",
+ "8. **Purdue Pharma's Bankruptcy**: Purdue Pharma is seeking to advance a $7.4 billion settlement related to its opioid crisis bankruptcy. [Source](https://finance.yahoo.com/)\n",
"\n",
- "9. **US Energy Mergers**: There is an expectation that US energy mergers may slow down in 2025 as deal sizes shrink. [Source](https://finance.yahoo.com/)\n",
+ "9. **Tesla's Model Y Refresh**: Tesla has launched a refreshed version of its Model Y in an effort to boost sales. [Source](https://finance.yahoo.com/)\n",
"\n",
- "10. **Goldman Sachs Leadership**: Goldman Sachs is promoting its next generation of leaders and shuffling management to strengthen its Wall Street businesses. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "These summaries provide a snapshot of the current financial news and market trends as reported by Yahoo Finance.\n"
+ "For more details, you can visit [Yahoo Finance](https://finance.yahoo.com/).\n"
]
}
],
@@ -1030,7 +959,7 @@
},
{
"cell_type": "code",
- "execution_count": 17,
+ "execution_count": 44,
"metadata": {
"id": "NIWgweiPLg0j"
},
@@ -1137,7 +1066,7 @@
},
{
"cell_type": "code",
- "execution_count": 18,
+ "execution_count": 45,
"metadata": {
"id": "EVdwBalILuGx"
},
@@ -1187,7 +1116,7 @@
},
{
"cell_type": "code",
- "execution_count": 19,
+ "execution_count": 46,
"metadata": {
"id": "oax44hlALyP_"
},
@@ -1219,7 +1148,7 @@
},
{
"cell_type": "code",
- "execution_count": 20,
+ "execution_count": 47,
"metadata": {
"id": "uL-dS1MhL1Ak"
},
@@ -1232,406 +1161,684 @@
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "DocWriter\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\n",
- "assistant: \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " write_document (call_o7N3RtIZk6sYIyGbr1tahvOi)\n",
+ " Call ID: call_o7N3RtIZk6sYIyGbr1tahvOi\n",
+ " Args:\n",
+ " content: # In-Depth Analysis of the Transformer Architecture\n",
+ "\n",
+ "## Table of Contents\n",
+ "1. Introduction \n",
+ "2. Background \n",
+ " 2.1 Sequence-to-Sequence Models \n",
+ " 2.2 Attention Mechanism \n",
+ "3. Transformer Architecture \n",
+ " 3.1 Overview \n",
+ " 3.2 Encoder-Decoder Structure \n",
+ " 3.3 Multi-Head Attention \n",
+ " 3.4 Positional Encoding \n",
+ "4. Training the Transformer \n",
+ " 4.1 Loss Functions \n",
+ " 4.2 Optimization Techniques \n",
+ "5. Applications of Transformers \n",
+ " 5.1 Natural Language Processing \n",
+ " 5.2 Vision Transformers \n",
+ "6. Advantages and Limitations \n",
+ " 6.1 Advantages \n",
+ " 6.2 Limitations \n",
+ "7. Future Directions \n",
+ "8. Conclusion \n",
+ "\n",
+ "## 1. Introduction \n",
+ "The Transformer architecture has revolutionized the field of machine learning, particularly in natural language processing (NLP). Introduced by Vaswani et al. in 2017, the Transformer model leverages self-attention mechanisms to effectively process sequential data. Unlike recurrent neural networks (RNNs) that process data sequentially, Transformers allow for parallelization, significantly improving training times and efficiency. This paper provides an in-depth analysis of the Transformer architecture, detailing its components, training methodologies, applications, and future potential. The goal is to provide a comprehensive understanding of how Transformers function and their impact on various domains.\n",
+ "\n",
+ "## 2. Background \n",
+ "### 2.1 Sequence-to-Sequence Models \n",
+ "Sequence-to-sequence (seq2seq) models are designed to transform an input sequence into an output sequence. Traditionally, these models utilized RNNs to encode the input and decode the output. While effective, RNNs suffer from limitations such as difficulty in capturing long-range dependencies and slower training due to sequential processing. The introduction of the Transformer model addresses these challenges by eliminating recurrence and employing self-attention mechanisms. This enables the model to consider the entire input sequence simultaneously, allowing for better performance on tasks like translation and summarization.\n",
+ "\n",
+ "### 2.2 Attention Mechanism \n",
+ "The attention mechanism is a critical component of the Transformer architecture, enabling the model to focus on specific parts of the input sequence. It assigns different weights to different words in the input based on their relevance to the current decoding step. This allows the model to capture contextual relationships more effectively than traditional methods. Self-attention, a variant of the attention mechanism, calculates attention scores by comparing all elements of the input sequence against each other. This approach facilitates the modeling of dependencies across long distances, making it particularly powerful for language tasks.\n",
+ "\n",
+ "## 3. Transformer Architecture \n",
+ "### 3.1 Overview \n",
+ "The Transformer architecture consists of an encoder-decoder structure, with each part containing multiple layers of self-attention and feed-forward neural networks. The encoder processes the input sequence and generates a set of continuous representations, which are then passed to the decoder to produce the output sequence. Each encoder layer includes a self-attention mechanism followed by a feed-forward network, while the decoder layers include additional mechanisms for cross-attention, allowing them to attend to the encoder's output. This design results in a highly parallelizable architecture that can handle large datasets efficiently.\n",
+ "\n",
+ "### 3.2 Encoder-Decoder Structure \n",
+ "The encoder-decoder structure of the Transformer is fundamental to its operation. The encoder receives the input sequence and transforms it into a continuous representation through multiple layers of self-attention and feed-forward networks. Each encoder layer refines the representation, capturing both local and global dependencies. The decoder, on the other hand, takes this representation and generates the output sequence step-by-step, using masked self-attention to prevent future information from influencing the current prediction. This structured approach allows for effective sequence generation and understanding in various tasks.\n",
+ "\n",
+ "### 3.3 Multi-Head Attention \n",
+ "Multi-head attention is a vital feature of the Transformer architecture, enhancing its ability to focus on different parts of the input simultaneously. By using multiple attention heads, the model can capture various aspects of relationships within the input sequence. Each head operates independently and learns different representations, which are then concatenated and linearly transformed to produce the final output. This mechanism allows the Transformer to aggregate information from multiple perspectives, resulting in richer contextual representations. As a result, multi-head attention significantly improves the model's performance across a wide range of tasks.\n",
+ "\n",
+ "### 3.4 Positional Encoding \n",
+ "Since the Transformer architecture does not inherently understand the order of sequences, positional encoding is introduced to provide information about the position of each word in the sequence. This encoding is added to the input embeddings, allowing the model to distinguish between different positions. Various strategies exist for positional encoding, including sine and cosine functions, which provide a continuous representation of position. By incorporating positional information, the Transformer can effectively model the sequential nature of language, improving its performance on tasks that require understanding of word order.\n",
+ "\n",
+ "## 4. Training the Transformer \n",
+ "### 4.1 Loss Functions \n",
+ "Training the Transformer model involves optimizing a loss function that measures the difference between the predicted outputs and the actual target outputs. Commonly used loss functions for this purpose include cross-entropy loss, which is suitable for classification tasks. The loss is calculated for each predicted token in the output sequence, and the model is trained to minimize this loss using gradient descent techniques. The choice of loss function significantly impacts the model's learning dynamics and final performance, making it a crucial aspect of the training process. Additionally, regularization techniques may be applied to prevent overfitting and enhance generalization.\n",
+ "\n",
+ "### 4.2 Optimization Techniques \n",
+ "Optimizing the Transformer architecture requires advanced techniques to handle its complexity and large parameter space. One popular optimization algorithm is Adam, which combines the benefits of adaptive learning rates and momentum. Learning rate scheduling is also commonly employed to adjust the learning rate dynamically during training, allowing for faster convergence. Techniques such as gradient clipping can be applied to prevent exploding gradients, which can hinder the training process. Overall, effective optimization strategies play a vital role in achieving optimal performance for Transformer models across various tasks.\n",
+ "\n",
+ "## 5. Applications of Transformers \n",
+ "### 5.1 Natural Language Processing \n",
+ "Transformers have become the backbone of many state-of-the-art NLP models, including BERT, GPT, and T5. Their ability to handle long-range dependencies and parallelize training has led to significant advancements in tasks such as machine translation, sentiment analysis, and question answering. These models leverage the self-attention mechanism to understand context better and generate more coherent and contextually relevant outputs. Additionally, Transformers have been successfully fine-tuned for specific tasks, allowing them to achieve remarkable performance with relatively small amounts of task-specific data. As a result, Transformers have largely replaced traditional RNN-based architectures in the NLP domain.\n",
+ "\n",
+ "### 5.2 Vision Transformers \n",
+ "Recently, the Transformer architecture has also been adapted for computer vision tasks, leading to the development of Vision Transformers (ViTs). These models apply the principles of the Transformer architecture to image data, treating image patches as sequences to be processed. By leveraging self-attention, ViTs can capture global representations of images, improving performance on tasks such as image classification and object detection. The success of Vision Transformers highlights the versatility of the architecture, demonstrating its applicability beyond text-based tasks. As a result, Transformers are increasingly being explored in the field of computer vision, opening new avenues for research and application.\n",
+ "\n",
+ "## 6. Advantages and Limitations \n",
+ "### 6.1 Advantages \n",
+ "The Transformer architecture offers several advantages over traditional models, particularly in terms of efficiency and performance. Its ability to process data in parallel significantly speeds up training times, making it suitable for large-scale datasets. The self-attention mechanism enables the model to capture long-range dependencies effectively, leading to improved performance across various tasks. Additionally, the modular structure of Transformers allows for easy scalability and adaptability to different applications. These strengths have contributed to the widespread adoption of Transformers in machine learning research and industry applications.\n",
+ "\n",
+ "### 6.2 Limitations \n",
+ "Despite their many advantages, Transformers also have certain limitations. Their large parameter size can lead to increased computational costs and memory requirements, making them less accessible for smaller organizations or applications with limited resources. Additionally, Transformers may struggle with tasks that require extensive fine-tuning or adaptation to specific domains, as their generalized training can sometimes lead to suboptimal performance. Furthermore, the lack of inherent inductive biases present in traditional convolutional or recurrent models can make Transformers less effective for certain structured data types. Addressing these limitations remains an active area of research.\n",
+ "\n",
+ "## 7. Future Directions \n",
+ "The future of the Transformer architecture holds great promise, with ongoing research focusing on improving its efficiency and applicability. One area of interest is the development of more compact and efficient variants of Transformers, such as Sparse Transformers and Long-Short Transformers, which aim to reduce computational costs while maintaining performance. Additionally, researchers are exploring ways to integrate Transformers with other architectures, such as CNNs or RNNs, to leverage their strengths in hybrid models. Furthermore, the potential for Transformers in areas such as reinforcement learning and multimodal tasks continues to be an exciting avenue for exploration. As the field evolves, Transformers are likely to remain a central focus in advancing machine learning technologies.\n",
+ "\n",
+ "## 8. Conclusion \n",
+ "In summary, the Transformer architecture has transformed the landscape of machine learning, particularly in natural language processing and computer vision. Its innovative use of self-attention mechanisms and encoder-decoder structures has enabled significant advancements in various tasks, paving the way for state-of-the-art models. While the architecture presents certain challenges, its advantages and versatility make it a powerful tool in the machine learning toolkit. As research continues to advance, the future of Transformers looks bright, with potential applications extending beyond current domains. This paper has provided a comprehensive overview of the Transformer architecture, highlighting its importance in modern machine learning.\n",
+ " file_name: transformer_architecture_analysis.md\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: write_document\n",
"\n",
- "assistant: Error: FileNotFoundError(2, 'No such file or directory')\n",
- " Please fix your mistakes.\n",
- "\n",
- "assistant: Error: FileNotFoundError(2, 'No such file or directory')\n",
- " Please fix your mistakes.\n",
+ "Document saved to transformer_architecture_analysis.md\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "assistant: It seems that the documents needed for the analysis are not available. To proceed, I'll create a general outline for a research paper on the Transformer architecture based on common knowledge and existing literature. Once the outline is ready, we can save it.\n",
- "\n",
- "### Table of Contents for a Research Paper on Transformer Architecture\n",
- "\n",
- "1. **Introduction**\n",
- " - Overview of the Transformer architecture.\n",
- " - Importance and impact in the field of natural language processing (NLP).\n",
- " - Brief history and evolution from traditional RNNs and LSTMs.\n",
- "\n",
- "2. **Background and Related Work**\n",
- " - Review of previous models such as RNNs, LSTMs, and GRUs.\n",
- " - Limitations of previous models that led to the development of Transformers.\n",
- " - Key innovations introduced by the Transformer model.\n",
- "\n",
- "3. **Architecture of the Transformer Model**\n",
- " - Detailed explanation of the encoder-decoder structure.\n",
- " - Description of key components: Multi-head attention, positional encoding, feed-forward networks.\n",
- " - Importance of self-attention mechanism.\n",
- "\n",
- "4. **Attention Mechanism in Transformers**\n",
- " - Mathematical formulation and explanation of scaled dot-product attention.\n",
- " - Role of multi-head attention and its benefits.\n",
- " - Visual representation of how attention weights are calculated and used.\n",
- "\n",
- "5. **Positional Encoding**\n",
- " - Need for positional encoding in a model without recurrence.\n",
- " - Mathematical explanation and visualization of sinusoidal positional encodings.\n",
- " - Comparison with learned positional encodings.\n",
- "\n",
- "6. **Training and Optimization Techniques**\n",
- " - Discussion of training techniques like teacher forcing and label smoothing.\n",
- " - Optimization strategies including the Adam optimizer and learning rate scheduling.\n",
- " - Challenges faced during training and methods to overcome them.\n",
- "\n",
- "7. **Applications and Impact**\n",
- " - Usage in NLP tasks like machine translation, text summarization, and language modeling.\n",
- " - Expansion to other fields: vision, audio, and beyond.\n",
- " - Impact on research and industry, including GPT and BERT.\n",
- "\n",
- "8. **Comparative Analysis with Contemporary Models**\n",
- " - Comparison with models like BERT, GPT, and BERT-like architectures.\n",
- " - Advantages and limitations compared to contemporaries.\n",
- " - Evolution and modifications leading to the development of newer models.\n",
- "\n",
- "9. **Future Directions and Research Opportunities**\n",
- " - Exploration of emerging trends and future research areas.\n",
- " - Potential improvements and innovations in Transformer architecture.\n",
- " - Discussion on scalability, efficiency, and real-world deployment.\n",
+ "The in-depth analysis of the Transformer architecture has been completed and saved as \"transformer_architecture_analysis.md\". If you need further modifications or additional sections, feel free to ask!\n",
+ "==================================================\n",
"\n",
- "10. **Conclusion**\n",
- " - Summary of key points discussed in the paper.\n",
- " - Final thoughts on the transformative impact of the Transformer architecture.\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mDocWriter\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: DocWriter\n",
"\n",
- "I will now write detailed content for each section and save the final results.\n",
+ "The in-depth analysis of the Transformer architecture has been completed and saved as \"transformer_architecture_analysis.md\". If you need further modifications or additional sections, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\n",
- "assistant: Outline saved to transformer_architecture_outline.txt\n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "NoteTaker\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\n",
- "assistant: The outline for the research paper on the Transformer architecture has been successfully created and saved as \"transformer_architecture_outline.txt\". If you need further details on each section or any charts for explanations, please let me know!\n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " create_outline (call_DBmJAAoZMUQoOGlgvDhmYbbm)\n",
+ " Call ID: call_DBmJAAoZMUQoOGlgvDhmYbbm\n",
+ " Args:\n",
+ " points: ['Introduction to Transformer Architecture', 'Key Components of Transformer', 'Self-Attention Mechanism', 'Positional Encoding', 'Multi-Head Attention', 'Feed Forward Neural Networks', 'Layer Normalization and Residual Connections', 'Training Techniques and Optimizations', 'Applications of Transformer Architecture', 'Conclusion']\n",
+ " file_name: transformer_architecture_analysis.md\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mNoteTaker\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: create_outline\n",
"\n",
- "user: The outline for the research paper on the Transformer architecture has been successfully created and saved as \"transformer_architecture_outline.txt\". If you need further details on each section or any charts for explanations, please let me know!\n",
+ "Outline saved to transformer_architecture_analysis.md\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " read_document (call_uqxDLfkQy5x3ZuVQkrDgmLus)\n",
+ " Call ID: call_uqxDLfkQy5x3ZuVQkrDgmLus\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 0\n",
+ " end: 20\n",
+ " read_document (call_KXWVlmvtpuobQNoDtvb4gpxx)\n",
+ " Call ID: call_KXWVlmvtpuobQNoDtvb4gpxx\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 20\n",
+ " end: 40\n",
+ " read_document (call_EEVbcNq3TjyKpJiJwSBXIiUN)\n",
+ " Call ID: call_EEVbcNq3TjyKpJiJwSBXIiUN\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 40\n",
+ " end: 60\n",
+ " read_document (call_Y6NskEoYFBpO7o33TvI8BXOe)\n",
+ " Call ID: call_Y6NskEoYFBpO7o33TvI8BXOe\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 60\n",
+ " end: 80\n",
+ " read_document (call_VAEEQ2U9S9NqOWsb2BRzpEf1)\n",
+ " Call ID: call_VAEEQ2U9S9NqOWsb2BRzpEf1\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 80\n",
+ " end: 100\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "assistant: \n",
- "==================================================\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "2. Key Components of Transformer\n",
"\n",
- "assistant: 1. 1. Introduction\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "2. 2. Background and Related Work\n",
+ "4. Positional Encoding\n",
"\n",
- "3. 3. Architecture of the Transformer Model\n",
+ "5. Multi-Head Attention\n",
"\n",
- "4. 4. Attention Mechanism in Transformers\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "5. 5. Positional Encoding\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
- "6. 6. Training and Optimization Techniques\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "7. 7. Applications and Impact\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "8. 8. Comparative Analysis with Contemporary Models\n",
+ "10. Conclusion\n",
"\n",
- "9. 9. Future Directions and Research Opportunities\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "10. 10. Conclusion\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "==================================================\n",
+ "2. Key Components of Transformer\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "assistant: \n",
- "==================================================\n",
+ "4. Positional Encoding\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "5. Multi-Head Attention\n",
"\n",
- "assistant: Document saved to transformer_architecture_analysis_1.txt\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "assistant: Document saved to transformer_architecture_analysis_2.txt\n",
- "==================================================\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "assistant: The in-depth analysis of the Transformer architecture has been successfully written and saved across two documents: \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". Here is a brief overview of each section:\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "1. **Introduction**: Explores the significance of the Transformer model in natural language processing, highlighting its advantages over sequential models like RNNs, and its role as a foundation for modern AI models.\n",
+ "10. Conclusion\n",
"\n",
- "2. **Background and Related Work**: Discusses the evolution from sequential models to attention mechanisms, leading to the development of the Transformer architecture as a superior alternative.\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "3. **Architecture of the Transformer Model**: Describes the core components of the Transformer, including encoders, decoders, and the role of multi-head self-attention and feed-forward networks.\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "4. **Attention Mechanism in Transformers**: Delves into the self-attention mechanism, explaining how it allows the model to focus on different parts of the input sequence simultaneously.\n",
+ "2. Key Components of Transformer\n",
"\n",
- "5. **Positional Encoding**: Details how positional encoding provides information about the order of words in a sequence, compensating for the absence of recurrence.\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "6. **Training and Optimization Techniques**: Covers strategies like the Adam optimizer and learning rate scheduling, which are crucial for efficient training and performance.\n",
+ "4. Positional Encoding\n",
"\n",
- "7. **Applications and Impact**: Highlights the wide-ranging applications of Transformers beyond NLP, including computer vision and protein structure prediction.\n",
+ "5. Multi-Head Attention\n",
"\n",
- "8. **Comparative Analysis with Contemporary Models**: Compares the Transformer with other architectures, considering computational efficiency and adaptability.\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "9. **Future Directions and Research Opportunities**: Discusses potential areas for improvement and future research, such as reducing computational complexity and exploring new domains.\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
- "10. **Conclusion**: Summarizes the transformative impact of the Transformer architecture on AI research and its future potential.\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "If you need further details or specific charts for any section, feel free to ask!\n",
- "==================================================\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mDocWriter\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "10. Conclusion\n",
"\n",
- "user: The in-depth analysis of the Transformer architecture has been successfully written and saved across two documents: \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". Here is a brief overview of each section:\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "1. **Introduction**: Explores the significance of the Transformer model in natural language processing, highlighting its advantages over sequential models like RNNs, and its role as a foundation for modern AI models.\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "2. **Background and Related Work**: Discusses the evolution from sequential models to attention mechanisms, leading to the development of the Transformer architecture as a superior alternative.\n",
+ "2. Key Components of Transformer\n",
"\n",
- "3. **Architecture of the Transformer Model**: Describes the core components of the Transformer, including encoders, decoders, and the role of multi-head self-attention and feed-forward networks.\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "4. **Attention Mechanism in Transformers**: Delves into the self-attention mechanism, explaining how it allows the model to focus on different parts of the input sequence simultaneously.\n",
+ "4. Positional Encoding\n",
"\n",
- "5. **Positional Encoding**: Details how positional encoding provides information about the order of words in a sequence, compensating for the absence of recurrence.\n",
+ "5. Multi-Head Attention\n",
"\n",
- "6. **Training and Optimization Techniques**: Covers strategies like the Adam optimizer and learning rate scheduling, which are crucial for efficient training and performance.\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "7. **Applications and Impact**: Highlights the wide-ranging applications of Transformers beyond NLP, including computer vision and protein structure prediction.\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
- "8. **Comparative Analysis with Contemporary Models**: Compares the Transformer with other architectures, considering computational efficiency and adaptability.\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "9. **Future Directions and Research Opportunities**: Discusses potential areas for improvement and future research, such as reducing computational complexity and exploring new domains.\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "10. **Conclusion**: Summarizes the transformative impact of the Transformer architecture on AI research and its future potential.\n",
+ "10. Conclusion\n",
+ "\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
+ "\n",
+ "1. Introduction to Transformer Architecture\n",
+ "\n",
+ "2. Key Components of Transformer\n",
+ "\n",
+ "3. Self-Attention Mechanism\n",
+ "\n",
+ "4. Positional Encoding\n",
+ "\n",
+ "5. Multi-Head Attention\n",
+ "\n",
+ "6. Feed Forward Neural Networks\n",
+ "\n",
+ "7. Layer Normalization and Residual Connections\n",
+ "\n",
+ "8. Training Techniques and Optimizations\n",
+ "\n",
+ "9. Applications of Transformer Architecture\n",
+ "\n",
+ "10. Conclusion\n",
"\n",
- "If you need further details or specific charts for any section, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " create_outline (call_3r6OKoqIN4EAp3CqEa3vqymy)\n",
+ " Call ID: call_3r6OKoqIN4EAp3CqEa3vqymy\n",
+ " Args:\n",
+ " points: ['Introduction to Transformer Architecture', 'Key Components of Transformer', 'Self-Attention Mechanism', 'Positional Encoding', 'Multi-Head Attention', 'Feed Forward Neural Networks', 'Layer Normalization and Residual Connections', 'Training Techniques and Optimizations', 'Applications of Transformer Architecture', 'Conclusion']\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " read_document (call_wea1e6cro0FYmeZ3FOydQ8mV)\n",
+ " Call ID: call_wea1e6cro0FYmeZ3FOydQ8mV\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 0\n",
+ " end: 20\n",
+ " read_document (call_olQWRblHGKkCEnxBnBTaHKUe)\n",
+ " Call ID: call_olQWRblHGKkCEnxBnBTaHKUe\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 20\n",
+ " end: 40\n",
+ " read_document (call_GNwrLowNkMHuEgr2tW8lqsmg)\n",
+ " Call ID: call_GNwrLowNkMHuEgr2tW8lqsmg\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 40\n",
+ " end: 60\n",
+ " read_document (call_Qq2bR8cZjnxDWBjVXRTfMbmS)\n",
+ " Call ID: call_Qq2bR8cZjnxDWBjVXRTfMbmS\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 60\n",
+ " end: 80\n",
+ " read_document (call_0ZtxqE3dhsqrxr3OFGNcSTb2)\n",
+ " Call ID: call_0ZtxqE3dhsqrxr3OFGNcSTb2\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ " start: 80\n",
+ " end: 100\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: create_outline\n",
"\n",
- "assistant: \n",
- "==================================================\n",
+ "Outline saved to transformer_architecture_analysis.md\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "assistant: 1. 1. Introduction\n",
+ "2. Key Components of Transformer\n",
"\n",
- "2. 2. Background and Related Work\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "3. 3. Architecture of the Transformer Model\n",
+ "4. Positional Encoding\n",
"\n",
- "4. 4. Attention Mechanism in Transformers\n",
+ "5. Multi-Head Attention\n",
"\n",
- "5. 5. Positional Encoding\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "6. 6. Training and Optimization Techniques\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
- "7. 7. Applications and Impact\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "8. 8. Comparative Analysis with Contemporary Models\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "9. 9. Future Directions and Research Opportunities\n",
+ "10. Conclusion\n",
"\n",
- "10. 10. Conclusion\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "==================================================\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "2. Key Components of Transformer\n",
"\n",
- "assistant: \n",
- "==================================================\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "4. Positional Encoding\n",
+ "\n",
+ "5. Multi-Head Attention\n",
"\n",
- "assistant: # 1. Introduction\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "The Transformer model, introduced by Vaswani et al. in 2017, has revolutionized the field of natural language processing by addressing the limitations of sequential models. Unlike recurrent neural networks (RNNs), the Transformer architecture enables parallelization, resulting in significantly reduced training times. Its capability to handle long-range dependencies more effectively has opened new avenues for various applications in machine translation, text summarization, and beyond. As a fundamental building block for many state-of-the-art models, including BERT, GPT, and T5, the Transformer has established itself as a cornerstone in modern AI research.\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
+ "10. Conclusion\n",
"\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "# 2. Background and Related Work\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
+ "2. Key Components of Transformer\n",
"\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "Prior to the advent of the Transformer architecture, sequential models like LSTMs and GRUs were the mainstay for processing sequential data. These models, while effective, suffered from limitations such as vanishing gradients and difficulty in capturing long-range dependencies. Attention mechanisms emerged as a solution to some of these challenges, allowing models to focus on relevant parts of the input data. The Transformer architecture further enhanced this concept by eliminating recurrence entirely and relying solely on self-attention mechanisms. This section explores the evolution of these models and how the Transformer emerged as a superior alternative.\n",
+ "4. Positional Encoding\n",
"\n",
+ "5. Multi-Head Attention\n",
"\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "# 3. Architecture of the Transformer Model\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "The Transformer model is built on a series of encoders and decoders, each comprising layers of multi-head self-attention and feed-forward neural networks. The encoder processes input data, while the decoder generates output sequences. The absence of recurrence in the architecture is a defining characteristic, allowing the Transformer to achieve higher levels of parallelization during training. Layer normalization, residual connections, and dropout are employed to stabilize and optimize the learning process. This section provides a detailed breakdown of each component within the Transformer architecture.\n",
+ "10. Conclusion\n",
"\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
- "# 4. Attention Mechanism in Transformers\n",
+ "2. Key Components of Transformer\n",
"\n",
+ "3. Self-Attention Mechanism\n",
"\n",
+ "4. Positional Encoding\n",
"\n",
- "At the heart of the Transformer is the attention mechanism, which assigns different levels of importance to various parts of the input data. Self-attention, in particular, allows the model to weigh the significance of individual words in a sequence relative to one another. Multi-head attention extends this concept by enabling the model to focus on different parts of the sequence simultaneously, capturing diverse contextual relationships. This section delves into the mathematical formulation of attention and its implementation within the Transformer model.\n",
+ "5. Multi-Head Attention\n",
"\n",
- "assistant: # 5. Positional Encoding\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "Since the Transformer model does not inherently consider the order of input sequences, positional encoding is introduced to provide the model with information about the position of each word in a sequence. This is achieved by adding a unique vector to each input embedding, which encodes positional information using sine and cosine functions of varying frequencies. Positional encoding allows the model to differentiate between words in a sequence based on their position, enabling it to capture sequential information without relying on recurrence. This section examines the role and formulation of positional encoding in the Transformer architecture.\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
+ "10. Conclusion\n",
"\n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
"\n",
- "# 6. Training and Optimization Techniques\n",
+ "1. Introduction to Transformer Architecture\n",
"\n",
+ "2. Key Components of Transformer\n",
"\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "Training a Transformer model involves several optimization techniques to ensure efficient learning and generalization. Key strategies include the use of Adam optimizer, learning rate scheduling with warm-up phases, and regularization methods such as dropout. The large-scale datasets used for training Transformers necessitate distributed training across multiple GPUs or TPUs to accelerate the process. This section explores these techniques in detail, highlighting their impact on the model's performance and convergence.\n",
+ "4. Positional Encoding\n",
"\n",
+ "5. Multi-Head Attention\n",
"\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "# 7. Applications and Impact\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "The versatility of the Transformer model has led to its application across a wide range of tasks beyond natural language processing. In computer vision, Transformers have been adapted for image classification, object detection, and more. The model's ability to capture complex dependencies has also been leveraged in areas such as protein structure prediction and music generation. This section discusses the diverse applications of Transformers and their transformative impact on various fields.\n",
+ "10. Conclusion\n",
"\n",
+ "==================================================\n",
"\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "# 8. Comparative Analysis with Contemporary Models\n",
+ "Here's an in-depth analysis of the Transformer architecture, structured with a table of contents and elaborated sections:\n",
"\n",
+ "# Table of Contents\n",
"\n",
+ "1. **Introduction to Transformer Architecture**\n",
+ "2. **Key Components of Transformer**\n",
+ "3. **Self-Attention Mechanism**\n",
+ "4. **Positional Encoding**\n",
+ "5. **Multi-Head Attention**\n",
+ "6. **Feed Forward Neural Networks**\n",
+ "7. **Layer Normalization and Residual Connections**\n",
+ "8. **Training Techniques and Optimizations**\n",
+ "9. **Applications of Transformer Architecture**\n",
+ "10. **Conclusion**\n",
"\n",
- "The advent of the Transformer model has prompted comparisons with other contemporary architectures, such as RNNs, LSTMs, and CNNs. This section provides a comparative analysis of these models, highlighting the advantages and limitations of each. Factors such as computational efficiency, ability to handle long dependencies, and adaptability to different tasks are considered in evaluating the Transformer against its counterparts.\n",
+ "---\n",
"\n",
+ "## 1. Introduction to Transformer Architecture\n",
+ "The Transformer architecture was introduced in the paper \"Attention is All You Need\" by Vaswani et al. in 2017. It revolutionized natural language processing by eliminating the need for recurrent neural networks (RNNs). Instead of processing input sequentially, Transformers allow for parallelization, leading to faster training times. The core innovation of the Transformer is the self-attention mechanism, which enables the model to weigh the importance of different words in a sentence relative to one another. This allows Transformers to capture long-range dependencies in data more effectively than previous models.\n",
"\n",
+ "## 2. Key Components of Transformer\n",
+ "The Transformer architecture consists of two main parts: the encoder and the decoder. The encoder processes the input data and generates a continuous representation, while the decoder produces the output sequence based on the encoder’s output. Each encoder and decoder is made up of a stack of identical layers, with the encoder typically having six layers and the decoder having a similar structure. The key components of each layer include multi-head self-attention, feed-forward neural networks, layer normalization, and residual connections. This layered approach enables the model to learn complex patterns and relationships in the data.\n",
"\n",
- "# 9. Future Directions and Research Opportunities\n",
+ "## 3. Self-Attention Mechanism\n",
+ "Self-attention is a pivotal feature of the Transformer architecture that allows the model to evaluate the relevance of different words in a sentence to each other. In this mechanism, each word in the input sequence is represented as a query, key, and value. The attention score is calculated by taking the dot product of the query with all keys, followed by a softmax operation to obtain the weights. These weights are then used to compute a weighted sum of the values, producing a context-aware representation of each word. This process allows the model to focus on relevant parts of the input when making predictions.\n",
"\n",
+ "## 4. Positional Encoding\n",
+ "Since Transformers do not inherently understand the order of the input sequence due to their parallel processing nature, positional encoding is introduced to provide information about the relative position of words in a sentence. This encoding is added to the input embeddings to ensure that the model can differentiate between similar words in different contexts. Positional encodings can be generated using sinusoidal functions, where each dimension of the encoding corresponds to a different frequency. This method helps the model leverage the sequential information needed for tasks such as translation and summarization.\n",
"\n",
+ "## 5. Multi-Head Attention\n",
+ "Multi-head attention enhances the self-attention mechanism by allowing the model to jointly attend to information from different representation subspaces at different positions. Instead of having a single set of attention weights, multi-head attention splits the input into multiple heads, each learning different aspects of the input. This is achieved by linearly transforming the input into multiple queries, keys, and values, which are then processed in parallel. The outputs from each head are concatenated and linearly transformed to produce the final output. This approach improves the model's ability to capture various relationships in the data.\n",
"\n",
- "Despite the remarkable success of the Transformer model, there remain several areas for further research and improvement. These include reducing the model's computational complexity, enhancing its interpretability, and extending its application to new domains. The integration of Transformers with other emerging technologies, such as reinforcement learning and quantum computing, presents exciting opportunities for future exploration. This section outlines potential research directions and challenges that lie ahead.\n",
+ "## 6. Feed Forward Neural Networks\n",
+ "In addition to self-attention layers, each encoder and decoder layer contains a position-wise feed-forward neural network (FFN). This network consists of two linear transformations with a ReLU activation in between. The FFN processes each position independently and identically, allowing the model to learn complex representations at each layer. The use of non-linear activations in the FFN enables the model to capture more intricate patterns in the data. The dimensionality of the FFN is typically much larger than the input embeddings, allowing for richer feature extraction.\n",
"\n",
+ "## 7. Layer Normalization and Residual Connections\n",
+ "Layer normalization is applied to stabilize the learning process and improve convergence during training. It normalizes the inputs across the features, helping to mitigate issues related to internal covariate shift. Residual connections are also employed, allowing gradients to flow through the network more easily, which is particularly beneficial in very deep networks. By adding the input of each sub-layer to its output, residual connections help prevent vanishing gradients and enable the model to learn more effectively. This combination of techniques enhances the overall performance and stability of the Transformer.\n",
"\n",
+ "## 8. Training Techniques and Optimizations\n",
+ "Training Transformers involves several techniques to optimize performance. One common approach is to use the Adam optimizer with a warm-up learning rate schedule, which gradually increases the learning rate before decaying it. Additionally, techniques such as dropout and label smoothing are implemented to prevent overfitting and improve generalization. Large-scale pre-training on vast amounts of data followed by fine-tuning on specific tasks has proven effective for Transformer models. These optimization strategies contribute significantly to the state-of-the-art performance of Transformers in various applications.\n",
"\n",
- "# 10. Conclusion\n",
+ "## 9. Applications of Transformer Architecture\n",
+ "Transformers have been successfully applied across numerous domains beyond natural language processing, including computer vision, speech recognition, and even music generation. In NLP, they power models like BERT, GPT, and T5, excelling in tasks such as machine translation, text summarization, and sentiment analysis. In computer vision, Vision Transformers (ViTs) have demonstrated competitive performance against convolutional neural networks (CNNs). The flexibility and scalability of the Transformer architecture make it a preferred choice for many machine learning tasks, leading to ongoing research and development in this area.\n",
"\n",
+ "## 10. Conclusion\n",
+ "The Transformer architecture has fundamentally changed the landscape of machine learning and artificial intelligence. Its innovative use of self-attention, combined with a flexible structure of encoders and decoders, enables it to capture complex relationships within data efficiently. As research continues to evolve, the Transformer architecture is likely to inspire new models and techniques, further pushing the boundaries of what can be achieved in various fields. The impact of Transformers is profound, and their versatility ensures they will remain central to advancements in AI for years to come.\n",
"\n",
+ "---\n",
"\n",
- "The Transformer architecture has fundamentally reshaped the landscape of machine learning, offering unparalleled flexibility and performance across a multitude of tasks. Its innovative design, centered on self-attention mechanisms, has set a new standard for model architecture in AI research. As advancements continue to build on the foundation laid by Transformers, their influence is expected to grow, driving further breakthroughs in artificial intelligence and its applications.\n",
+ "The above content has been saved in the file **transformer_architecture_analysis.md**. If you have further requests or modifications, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mNoteTaker\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: NoteTaker\n",
+ "\n",
+ "Here's an in-depth analysis of the Transformer architecture, structured with a table of contents and elaborated sections:\n",
+ "\n",
+ "# Table of Contents\n",
+ "\n",
+ "1. **Introduction to Transformer Architecture**\n",
+ "2. **Key Components of Transformer**\n",
+ "3. **Self-Attention Mechanism**\n",
+ "4. **Positional Encoding**\n",
+ "5. **Multi-Head Attention**\n",
+ "6. **Feed Forward Neural Networks**\n",
+ "7. **Layer Normalization and Residual Connections**\n",
+ "8. **Training Techniques and Optimizations**\n",
+ "9. **Applications of Transformer Architecture**\n",
+ "10. **Conclusion**\n",
+ "\n",
+ "---\n",
"\n",
- "assistant: The in-depth analysis of the Transformer architecture is divided into the following sections, each offering a comprehensive examination of various aspects of the model:\n",
+ "## 1. Introduction to Transformer Architecture\n",
+ "The Transformer architecture was introduced in the paper \"Attention is All You Need\" by Vaswani et al. in 2017. It revolutionized natural language processing by eliminating the need for recurrent neural networks (RNNs). Instead of processing input sequentially, Transformers allow for parallelization, leading to faster training times. The core innovation of the Transformer is the self-attention mechanism, which enables the model to weigh the importance of different words in a sentence relative to one another. This allows Transformers to capture long-range dependencies in data more effectively than previous models.\n",
"\n",
- "1. **Introduction**: The introduction highlights the transformative impact of the Transformer model on natural language processing. It emphasizes the model's advantages over traditional sequential models such as RNNs, particularly in terms of parallelization and efficiency in handling long-range dependencies. The Transformer model is credited as a foundational element for several state-of-the-art models like BERT, GPT, and T5, establishing its importance in modern AI research.\n",
+ "## 2. Key Components of Transformer\n",
+ "The Transformer architecture consists of two main parts: the encoder and the decoder. The encoder processes the input data and generates a continuous representation, while the decoder produces the output sequence based on the encoder’s output. Each encoder and decoder is made up of a stack of identical layers, with the encoder typically having six layers and the decoder having a similar structure. The key components of each layer include multi-head self-attention, feed-forward neural networks, layer normalization, and residual connections. This layered approach enables the model to learn complex patterns and relationships in the data.\n",
"\n",
- "2. **Background and Related Work**: This section explores the transition from sequential models like LSTMs and GRUs to the introduction of attention mechanisms, which led to the development of the Transformer architecture. It discusses the limitations of previous models, such as vanishing gradients and difficulty in capturing long-range dependencies, and how the Transformer, with its self-attention mechanisms, overcame these challenges.\n",
+ "## 3. Self-Attention Mechanism\n",
+ "Self-attention is a pivotal feature of the Transformer architecture that allows the model to evaluate the relevance of different words in a sentence to each other. In this mechanism, each word in the input sequence is represented as a query, key, and value. The attention score is calculated by taking the dot product of the query with all keys, followed by a softmax operation to obtain the weights. These weights are then used to compute a weighted sum of the values, producing a context-aware representation of each word. This process allows the model to focus on relevant parts of the input when making predictions.\n",
"\n",
- "3. **Architecture of the Transformer Model**: The Transformer model's architecture consists of encoders and decoders, with each layer incorporating multi-head self-attention and feed-forward networks. The lack of recurrence in the architecture allows for higher parallelization during training. The section elaborates on components like layer normalization, residual connections, and dropout, which are critical for model stability and optimization.\n",
+ "## 4. Positional Encoding\n",
+ "Since Transformers do not inherently understand the order of the input sequence due to their parallel processing nature, positional encoding is introduced to provide information about the relative position of words in a sentence. This encoding is added to the input embeddings to ensure that the model can differentiate between similar words in different contexts. Positional encodings can be generated using sinusoidal functions, where each dimension of the encoding corresponds to a different frequency. This method helps the model leverage the sequential information needed for tasks such as translation and summarization.\n",
"\n",
- "4. **Attention Mechanism in Transformers**: The core of the Transformer model lies in its attention mechanism, which assigns varying importance to different parts of the input. Self-attention enables the model to weigh the significance of words in a sequence relative to each other, while multi-head attention allows for simultaneous focus on diverse contextual relationships. This section delves into the mathematical underpinnings and implementation of attention within the model.\n",
+ "## 5. Multi-Head Attention\n",
+ "Multi-head attention enhances the self-attention mechanism by allowing the model to jointly attend to information from different representation subspaces at different positions. Instead of having a single set of attention weights, multi-head attention splits the input into multiple heads, each learning different aspects of the input. This is achieved by linearly transforming the input into multiple queries, keys, and values, which are then processed in parallel. The outputs from each head are concatenated and linearly transformed to produce the final output. This approach improves the model's ability to capture various relationships in the data.\n",
"\n",
- "5. **Positional Encoding**: Positional encoding compensates for the lack of inherent order in the Transformer model by adding unique vectors to input embeddings to encode positional information. This section examines how sine and cosine functions of varying frequencies are used to provide sequential information, enabling the model to differentiate words based on their position without recurrence.\n",
+ "## 6. Feed Forward Neural Networks\n",
+ "In addition to self-attention layers, each encoder and decoder layer contains a position-wise feed-forward neural network (FFN). This network consists of two linear transformations with a ReLU activation in between. The FFN processes each position independently and identically, allowing the model to learn complex representations at each layer. The use of non-linear activations in the FFN enables the model to capture more intricate patterns in the data. The dimensionality of the FFN is typically much larger than the input embeddings, allowing for richer feature extraction.\n",
"\n",
- "6. **Training and Optimization Techniques**: The training of Transformer models involves optimization techniques such as the Adam optimizer, learning rate scheduling, and dropout for regularization. The section details these strategies and discusses the necessity of distributed training across GPUs or TPUs due to the large-scale datasets used.\n",
+ "## 7. Layer Normalization and Residual Connections\n",
+ "Layer normalization is applied to stabilize the learning process and improve convergence during training. It normalizes the inputs across the features, helping to mitigate issues related to internal covariate shift. Residual connections are also employed, allowing gradients to flow through the network more easily, which is particularly beneficial in very deep networks. By adding the input of each sub-layer to its output, residual connections help prevent vanishing gradients and enable the model to learn more effectively. This combination of techniques enhances the overall performance and stability of the Transformer.\n",
"\n",
- "7. **Applications and Impact**: The versatility of Transformers extends beyond NLP, with applications in computer vision, protein structure prediction, and music generation. This section discusses the model's wide-ranging applications and its transformative impact on various fields.\n",
+ "## 8. Training Techniques and Optimizations\n",
+ "Training Transformers involves several techniques to optimize performance. One common approach is to use the Adam optimizer with a warm-up learning rate schedule, which gradually increases the learning rate before decaying it. Additionally, techniques such as dropout and label smoothing are implemented to prevent overfitting and improve generalization. Large-scale pre-training on vast amounts of data followed by fine-tuning on specific tasks has proven effective for Transformer models. These optimization strategies contribute significantly to the state-of-the-art performance of Transformers in various applications.\n",
"\n",
- "8. **Comparative Analysis with Contemporary Models**: This section provides a comparison of the Transformer model with other architectures like RNNs, LSTMs, and CNNs, considering factors such as computational efficiency, handling of long dependencies, and adaptability.\n",
+ "## 9. Applications of Transformer Architecture\n",
+ "Transformers have been successfully applied across numerous domains beyond natural language processing, including computer vision, speech recognition, and even music generation. In NLP, they power models like BERT, GPT, and T5, excelling in tasks such as machine translation, text summarization, and sentiment analysis. In computer vision, Vision Transformers (ViTs) have demonstrated competitive performance against convolutional neural networks (CNNs). The flexibility and scalability of the Transformer architecture make it a preferred choice for many machine learning tasks, leading to ongoing research and development in this area.\n",
"\n",
- "9. **Future Directions and Research Opportunities**: Despite its success, the Transformer model presents opportunities for further research, such as reducing computational complexity and enhancing interpretability. This section outlines potential research areas and challenges, including integration with technologies like reinforcement learning and quantum computing.\n",
+ "## 10. Conclusion\n",
+ "The Transformer architecture has fundamentally changed the landscape of machine learning and artificial intelligence. Its innovative use of self-attention, combined with a flexible structure of encoders and decoders, enables it to capture complex relationships within data efficiently. As research continues to evolve, the Transformer architecture is likely to inspire new models and techniques, further pushing the boundaries of what can be achieved in various fields. The impact of Transformers is profound, and their versatility ensures they will remain central to advancements in AI for years to come.\n",
"\n",
- "10. **Conclusion**: The conclusion summarizes the transformative role of the Transformer architecture in AI, highlighting its design centered on self-attention mechanisms. It anticipates continued advancements building on the Transformer's foundation, contributing to further breakthroughs in AI applications.\n",
+ "---\n",
"\n",
- "The detailed documents have been saved as \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". If you need further elaboration or charts, feel free to ask!\n",
+ "The above content has been saved in the file **transformer_architecture_analysis.md**. If you have further requests or modifications, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mChartGenerator\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "ChartGenerator\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " read_document (call_9K89hCAgUjbJDhXBpfmUJby7)\n",
+ " Call ID: call_9K89hCAgUjbJDhXBpfmUJby7\n",
+ " Args:\n",
+ " file_name: transformer_architecture_analysis.md\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: read_document\n",
+ "\n",
+ "1. Introduction to Transformer Architecture\n",
+ "\n",
+ "2. Key Components of Transformer\n",
+ "\n",
+ "3. Self-Attention Mechanism\n",
"\n",
- "user: The in-depth analysis of the Transformer architecture is divided into the following sections, each offering a comprehensive examination of various aspects of the model:\n",
+ "4. Positional Encoding\n",
"\n",
- "1. **Introduction**: The introduction highlights the transformative impact of the Transformer model on natural language processing. It emphasizes the model's advantages over traditional sequential models such as RNNs, particularly in terms of parallelization and efficiency in handling long-range dependencies. The Transformer model is credited as a foundational element for several state-of-the-art models like BERT, GPT, and T5, establishing its importance in modern AI research.\n",
+ "5. Multi-Head Attention\n",
"\n",
- "2. **Background and Related Work**: This section explores the transition from sequential models like LSTMs and GRUs to the introduction of attention mechanisms, which led to the development of the Transformer architecture. It discusses the limitations of previous models, such as vanishing gradients and difficulty in capturing long-range dependencies, and how the Transformer, with its self-attention mechanisms, overcame these challenges.\n",
+ "6. Feed Forward Neural Networks\n",
"\n",
- "3. **Architecture of the Transformer Model**: The Transformer model's architecture consists of encoders and decoders, with each layer incorporating multi-head self-attention and feed-forward networks. The lack of recurrence in the architecture allows for higher parallelization during training. The section elaborates on components like layer normalization, residual connections, and dropout, which are critical for model stability and optimization.\n",
+ "7. Layer Normalization and Residual Connections\n",
"\n",
- "4. **Attention Mechanism in Transformers**: The core of the Transformer model lies in its attention mechanism, which assigns varying importance to different parts of the input. Self-attention enables the model to weigh the significance of words in a sequence relative to each other, while multi-head attention allows for simultaneous focus on diverse contextual relationships. This section delves into the mathematical underpinnings and implementation of attention within the model.\n",
+ "8. Training Techniques and Optimizations\n",
"\n",
- "5. **Positional Encoding**: Positional encoding compensates for the lack of inherent order in the Transformer model by adding unique vectors to input embeddings to encode positional information. This section examines how sine and cosine functions of varying frequencies are used to provide sequential information, enabling the model to differentiate words based on their position without recurrence.\n",
+ "9. Applications of Transformer Architecture\n",
"\n",
- "6. **Training and Optimization Techniques**: The training of Transformer models involves optimization techniques such as the Adam optimizer, learning rate scheduling, and dropout for regularization. The section details these strategies and discusses the necessity of distributed training across GPUs or TPUs due to the large-scale datasets used.\n",
+ "10. Conclusion\n",
"\n",
- "7. **Applications and Impact**: The versatility of Transformers extends beyond NLP, with applications in computer vision, protein structure prediction, and music generation. This section discusses the model's wide-ranging applications and its transformative impact on various fields.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "\n",
+ "The in-depth analysis of the Transformer architecture has been successfully saved and includes the following content:\n",
"\n",
- "8. **Comparative Analysis with Contemporary Models**: This section provides a comparison of the Transformer model with other architectures like RNNs, LSTMs, and CNNs, considering factors such as computational efficiency, handling of long dependencies, and adaptability.\n",
+ "# Table of Contents\n",
+ "\n",
+ "1. **Introduction to Transformer Architecture**\n",
+ "2. **Key Components of Transformer**\n",
+ "3. **Self-Attention Mechanism**\n",
+ "4. **Positional Encoding**\n",
+ "5. **Multi-Head Attention**\n",
+ "6. **Feed Forward Neural Networks**\n",
+ "7. **Layer Normalization and Residual Connections**\n",
+ "8. **Training Techniques and Optimizations**\n",
+ "9. **Applications of Transformer Architecture**\n",
+ "10. **Conclusion**\n",
"\n",
- "9. **Future Directions and Research Opportunities**: Despite its success, the Transformer model presents opportunities for further research, such as reducing computational complexity and enhancing interpretability. This section outlines potential research areas and challenges, including integration with technologies like reinforcement learning and quantum computing.\n",
+ "If you need any modifications, additional sections, or specific charts for better visual explanations, feel free to ask!\n",
+ "==================================================\n",
"\n",
- "10. **Conclusion**: The conclusion summarizes the transformative role of the Transformer architecture in AI, highlighting its design centered on self-attention mechanisms. It anticipates continued advancements building on the Transformer's foundation, contributing to further breakthroughs in AI applications.\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mChartGenerator\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: ChartGenerator\n",
+ "\n",
+ "The in-depth analysis of the Transformer architecture has been successfully saved and includes the following content:\n",
+ "\n",
+ "# Table of Contents\n",
+ "\n",
+ "1. **Introduction to Transformer Architecture**\n",
+ "2. **Key Components of Transformer**\n",
+ "3. **Self-Attention Mechanism**\n",
+ "4. **Positional Encoding**\n",
+ "5. **Multi-Head Attention**\n",
+ "6. **Feed Forward Neural Networks**\n",
+ "7. **Layer Normalization and Residual Connections**\n",
+ "8. **Training Techniques and Optimizations**\n",
+ "9. **Applications of Transformer Architecture**\n",
+ "10. **Conclusion**\n",
"\n",
- "The detailed documents have been saved as \"transformer_architecture_analysis_1.txt\" and \"transformer_architecture_analysis_2.txt\". If you need further elaboration or charts, feel free to ask!\n",
+ "If you need any modifications, additional sections, or specific charts for better visual explanations, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "FINISH\n",
"==================================================\n"
]
}
@@ -1661,7 +1868,7 @@
},
{
"cell_type": "code",
- "execution_count": 21,
+ "execution_count": 48,
"metadata": {
"id": "LIaF15xaMfby"
},
@@ -1698,7 +1905,7 @@
},
{
"cell_type": "code",
- "execution_count": 22,
+ "execution_count": 49,
"metadata": {
"id": "i6qRY6ReMulX"
},
@@ -1743,7 +1950,7 @@
},
{
"cell_type": "code",
- "execution_count": 23,
+ "execution_count": 50,
"metadata": {
"id": "fnPVUMA6MyQZ"
},
@@ -1790,7 +1997,7 @@
},
{
"cell_type": "code",
- "execution_count": 24,
+ "execution_count": 51,
"metadata": {
"id": "GzvNFQ1-M0dR"
},
@@ -1813,7 +2020,7 @@
},
{
"cell_type": "code",
- "execution_count": 29,
+ "execution_count": null,
"metadata": {
"id": "nFts6LksM2H2"
},
@@ -1826,364 +2033,2086 @@
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "ResearchTeam\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "Searcher\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mSearcher\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " tavily_search_results_json (call_NPfH6l3J8Z2KMdLmfj4hN3dW)\n",
+ " Call ID: call_NPfH6l3J8Z2KMdLmfj4hN3dW\n",
+ " Args:\n",
+ " query: multi-agent architecture complex tasks report\n",
+ "==================================================\n",
"\n",
- "assistant: To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mSearcher\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: tavily_search_results_json\n",
"\n",
- "1. **Introduction**\n",
- "2. **Background and Related Work**\n",
- "3. **Multi-Agent Architecture Overview**\n",
- "4. **Techniques for Task Decomposition**\n",
- "5. **Coordination and Communication Among Agents**\n",
- "6. **Applications of Multi-Agent Systems**\n",
- "7. **Challenges and Future Directions**\n",
- "8. **Conclusion**\n",
- "9. **References**\n",
+ "[{\"url\": \"https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/\", \"content\": \"In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. Magentic-One uses a multi-agent architecture where a lead agent, the Orchestrator, plans, tracks progress, and re-plans to recover from errors. Throughout task execution, the Orchestrator also directs other specialized agents to perform\"}, {\"url\": \"https://www.researchgate.net/publication/385630524_Magentic-One_A_Generalist_Multi-Agent_System_for_Solving_Complex_Tasks\", \"content\": \"Magentic-One is a generalist multi-agen t system for autonomously completing complex tasks. The team's work is coordinated by an Orchestrator agen t, resp onsible for task decomposition\"}, {\"url\": \"https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/\", \"content\": \"Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks - Microsoft Research We are also releasing an open-source implementation of Magentic-One (opens in new tab) on Microsoft AutoGen, our popular open-source framework for developing multi-agent applications. We see valuable new directions in agentic, safety and Responsible AI research: In terms of anticipating new risks from agentic systems, it is possible that agents will be subject to the same phishing, social engineering, and misinformation attacks that target human web surfers when they are acting on the public web. To conclude, in this work we introduced Magentic-One, a generalist multi-agent system represents a significant development in agentic systems capable of solving open-ended tasks.\"}, {\"url\": \"https://arxiv.org/abs/2411.04468\", \"content\": \"[2411.04468] Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks Title:Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks View a PDF of the paper titled Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks, by Adam Fourney and 18 other authors In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. We provide an open-source implementation of Magentic-One, and we include AutoGenBench, a standalone tool for agentic evaluation. View a PDF of the paper titled Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks, by Adam Fourney and 18 other authors Bibliographic Explorer Toggle Connected Papers Toggle Litmaps Toggle scite.ai Toggle alphaXiv Toggle Links to Code Toggle DagsHub Toggle GotitPub Toggle Huggingface Toggle Links to Code Toggle ScienceCast Toggle Replicate Toggle\"}, {\"url\": \"https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132\", \"content\": \"To effectively address these challenges, we can design specific agents for tasks such as Market Basket Analysis, Clickstream Analysis, Inventory Analysis, Customer Analysis, and Product Recommendations. Working Mechanism: This agent uses the Assistant API to create custom SQL queries, extracting data from transaction logs and analyzing purchase correlations. Working Mechanism: This agent uses the Assistant API to pull data from CRM systems and customer feedback, analyzing user attributes and segmenting customers based on their behavior. Example: The agent collects data on popular products and customer preferences to recommend items that align with a user’s past behavior and interests. As last thoughts, the combination of RAG frameworks and Agentic AI can revolutionize user engagement by delivering personalization, improving communication, customer retention and ensuring quick problem resolution.\"}]\n",
+ "==================================================\n",
"\n",
- "Let's proceed with writing detailed content for each section.\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mSearcher\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "---\n",
+ "# Report on Multi-Agent Architecture for Complex Task Performance\n",
"\n",
- "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
"\n",
"## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
"\n",
- "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
"\n",
- "## 2. Background and Related Work\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
"\n",
- "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
"\n",
- "## 3. Multi-Agent Architecture Overview\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
"\n",
- "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
"\n",
- "## 4. Techniques for Task Decomposition\n",
+ "---\n",
"\n",
- "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "==================================================\n",
"\n",
- "## 5. Coordination and Communication Among Agents\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSearcher\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: Searcher\n",
"\n",
- "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
+ "# Report on Multi-Agent Architecture for Complex Task Performance\n",
"\n",
- "## 6. Applications of Multi-Agent Systems\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
"\n",
- "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
"\n",
- "## 7. Challenges and Future Directions\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
"\n",
- "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
"\n",
- "## 8. Conclusion\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
"\n",
- "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
"\n",
- "## 9. References\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
"\n",
- "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
- "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
- "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
- "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
- "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
"\n",
"---\n",
"\n",
- "This content will be saved as a Markdown (.md) file.\n",
+ "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mSearcher\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "WebScraper\n",
+ "==================================================\n",
"\n",
- "user: To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " scrape_webpages (call_HgRTXdKWxLszfiZCSYaFqjQg)\n",
+ " Call ID: call_HgRTXdKWxLszfiZCSYaFqjQg\n",
+ " Args:\n",
+ " urls: ['https://arxiv.org/', 'https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/', 'https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132']\n",
+ "==================================================\n",
"\n",
- "1. **Introduction**\n",
- "2. **Background and Related Work**\n",
- "3. **Multi-Agent Architecture Overview**\n",
- "4. **Techniques for Task Decomposition**\n",
- "5. **Coordination and Communication Among Agents**\n",
- "6. **Applications of Multi-Agent Systems**\n",
- "7. **Challenges and Future Directions**\n",
- "8. **Conclusion**\n",
- "9. **References**\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: scrape_webpages\n",
"\n",
- "Let's proceed with writing detailed content for each section.\n",
+ "\n",
"\n",
- "---\n",
"\n",
- "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ " arXiv.org e-Print archive\n",
"\n",
- "## 1. Introduction\n",
"\n",
- "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
"\n",
- "## 2. Background and Related Work\n",
"\n",
- "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
"\n",
- "## 3. Multi-Agent Architecture Overview\n",
"\n",
- "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
"\n",
- "## 4. Techniques for Task Decomposition\n",
"\n",
- "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
"\n",
- "## 5. Coordination and Communication Among Agents\n",
"\n",
- "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
"\n",
- "## 6. Applications of Multi-Agent Systems\n",
"\n",
- "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
"\n",
- "## 7. Challenges and Future Directions\n",
"\n",
- "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
"\n",
- "## 8. Conclusion\n",
"\n",
- "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "Skip to main content\n",
"\n",
- "## 9. References\n",
"\n",
- "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
- "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
- "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
- "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
- "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
"\n",
- "---\n",
"\n",
- "This content will be saved as a Markdown (.md) file.\n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================================\n",
+ "In just 3 minutes help us improve arXiv:\n",
+ "Annual Global Survey\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "We gratefully acknowledge support from the Simons Foundation, member institutions, and all contributors.\n",
+ "Donate\n",
"\n",
- "user: To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
"\n",
- "1. **Introduction**\n",
- "2. **Background and Related Work**\n",
- "3. **Multi-Agent Architecture Overview**\n",
- "4. **Techniques for Task Decomposition**\n",
- "5. **Coordination and Communication Among Agents**\n",
- "6. **Applications of Multi-Agent Systems**\n",
- "7. **Challenges and Future Directions**\n",
- "8. **Conclusion**\n",
- "9. **References**\n",
"\n",
- "Let's proceed with writing detailed content for each section.\n",
"\n",
- "---\n",
"\n",
- "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "Login\n",
+ " \n",
"\n",
- "## 1. Introduction\n",
"\n",
- "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
"\n",
- "## 2. Background and Related Work\n",
"\n",
- "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "Help | Advanced Search\n",
"\n",
- "## 3. Multi-Agent Architecture Overview\n",
"\n",
- "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
"\n",
- "## 4. Techniques for Task Decomposition\n",
"\n",
- "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
+ "All fields\n",
+ "Title\n",
+ "Author\n",
+ "Abstract\n",
+ "Comments\n",
+ "Journal reference\n",
+ "ACM classification\n",
+ "MSC classification\n",
+ "Report number\n",
+ "arXiv identifier\n",
+ "DOI\n",
+ "ORCID\n",
+ "arXiv author ID\n",
+ "Help pages\n",
+ "Full text\n",
"\n",
- "## 5. Coordination and Communication Among Agents\n",
"\n",
- "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
"\n",
- "## 6. Applications of Multi-Agent Systems\n",
"\n",
- "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "Search\n",
"\n",
- "## 7. Challenges and Future Directions\n",
"\n",
- "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
"\n",
- "## 8. Conclusion\n",
"\n",
- "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
"\n",
- "## 9. References\n",
"\n",
- "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
- "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
- "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
- "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
- "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
"\n",
- "---\n",
"\n",
- "This content will be saved as a Markdown (.md) file.\n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "To create a report in Arxiv paper format on the topic of \"How to Perform Complex Tasks Using Multi-Agent Architecture,\" we will follow a structured approach. Here's an outline for the report:\n",
"\n",
- "1. **Introduction**\n",
- "2. **Background and Related Work**\n",
- "3. **Multi-Agent Architecture Overview**\n",
- "4. **Techniques for Task Decomposition**\n",
- "5. **Coordination and Communication Among Agents**\n",
- "6. **Applications of Multi-Agent Systems**\n",
- "7. **Challenges and Future Directions**\n",
- "8. **Conclusion**\n",
- "9. **References**\n",
"\n",
- "Let's proceed with writing detailed content for each section.\n",
"\n",
- "---\n",
"\n",
- "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
+ "open search\n",
"\n",
- "## 1. Introduction\n",
"\n",
- "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
"\n",
- "## 2. Background and Related Work\n",
"\n",
- "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
"\n",
- "## 3. Multi-Agent Architecture Overview\n",
"\n",
- "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
+ "GO\n",
"\n",
- "## 4. Techniques for Task Decomposition\n",
"\n",
- "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
"\n",
- "## 5. Coordination and Communication Among Agents\n",
+ "open navigation menu\n",
"\n",
- "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
"\n",
- "## 6. Applications of Multi-Agent Systems\n",
+ "quick links\n",
"\n",
- "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "Login\n",
+ "Help Pages\n",
+ "About\n",
"\n",
- "## 7. Challenges and Future Directions\n",
"\n",
- "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
"\n",
- "## 8. Conclusion\n",
"\n",
- "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
"\n",
- "## 9. References\n",
"\n",
- "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
- "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
- "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
- "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
- "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons.\n",
"\n",
- "---\n",
"\n",
- "This content will be saved as a Markdown (.md) file.\n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================================\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "arXiv is a free distribution service and an open-access archive for nearly 2.4 million\n",
+ " scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics.\n",
+ " Materials on this site are not peer-reviewed by arXiv.\n",
+ " \n",
"\n",
- "assistant: \n",
- "==================================================\n",
+ "Subject search and browse:\n",
+ " \n",
+ " Physics\n",
+ " \n",
+ " Mathematics\n",
+ " \n",
+ " Quantitative Biology\n",
+ " \n",
+ " Computer Science\n",
+ " \n",
+ " Quantitative Finance\n",
+ " \n",
+ " Statistics\n",
+ " \n",
+ " Electrical Engineering and Systems Science\n",
+ " \n",
+ " Economics\n",
+ " \n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"\n",
- "assistant: Document saved to Multi_Agent_Architecture_Report.md\n",
- "==================================================\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " Physics\n",
+ "\n",
+ "Astrophysics\n",
+ " (astro-ph\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ "Astrophysics of Galaxies; Cosmology and Nongalactic Astrophysics; Earth and Planetary Astrophysics; High Energy Astrophysical Phenomena; Instrumentation and Methods for Astrophysics; Solar and Stellar Astrophysics \n",
+ "\n",
+ "Condensed Matter\n",
+ " (cond-mat\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ "Disordered Systems and Neural Networks; Materials Science; Mesoscale and Nanoscale Physics; Other Condensed Matter; Quantum Gases; Soft Condensed Matter; Statistical Mechanics; Strongly Correlated Electrons; Superconductivity \n",
+ "\n",
+ "General Relativity and Quantum Cosmology\n",
+ " (gr-qc\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "High Energy Physics - Experiment\n",
+ " (hep-ex\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "High Energy Physics - Lattice\n",
+ " (hep-lat\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "High Energy Physics - Phenomenology\n",
+ " (hep-ph\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "High Energy Physics - Theory\n",
+ " (hep-th\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "Mathematical Physics\n",
+ " (math-ph\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "Nonlinear Sciences\n",
+ " (nlin\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes:\n",
+ "\n",
+ "Adaptation and Self-Organizing Systems; Cellular Automata and Lattice Gases; Chaotic Dynamics; Exactly Solvable and Integrable Systems; Pattern Formation and Solitons \n",
+ "\n",
+ "Nuclear Experiment\n",
+ " (nucl-ex\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "Nuclear Theory\n",
+ " (nucl-th\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "Physics\n",
+ " (physics\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes:\n",
+ "\n",
+ "Accelerator Physics; Applied Physics; Atmospheric and Oceanic Physics; Atomic and Molecular Clusters; Atomic Physics; Biological Physics; Chemical Physics; Classical Physics; Computational Physics; Data Analysis, Statistics and Probability; Fluid Dynamics; General Physics; Geophysics; History and Philosophy of Physics; Instrumentation and Detectors; Medical Physics; Optics; Physics and Society; Physics Education; Plasma Physics; Popular Physics; Space Physics \n",
+ "\n",
+ "Quantum Physics\n",
+ " (quant-ph\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ "\n",
+ " \n",
+ "\n",
+ "Mathematics\n",
+ "\n",
+ "Mathematics\n",
+ " (math\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Algebraic Geometry; Algebraic Topology; Analysis of PDEs; Category Theory; Classical Analysis and ODEs; Combinatorics; Commutative Algebra; Complex Variables; Differential Geometry; Dynamical Systems; Functional Analysis; General Mathematics; General Topology; Geometric Topology; Group Theory; History and Overview; Information Theory; K-Theory and Homology; Logic; Mathematical Physics; Metric Geometry; Number Theory; Numerical Analysis; Operator Algebras; Optimization and Control; Probability; Quantum Algebra; Representation Theory; Rings and Algebras; Spectral Theory; Statistics Theory; Symplectic Geometry \n",
+ "\n",
+ "Computer Science\n",
+ "\n",
+ "Computing Research Repository\n",
+ " (CoRR\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Artificial Intelligence; Computation and Language; Computational Complexity; Computational Engineering, Finance, and Science; Computational Geometry; Computer Science and Game Theory; Computer Vision and Pattern Recognition; Computers and Society; Cryptography and Security; Data Structures and Algorithms; Databases; Digital Libraries; Discrete Mathematics; Distributed, Parallel, and Cluster Computing; Emerging Technologies; Formal Languages and Automata Theory; General Literature; Graphics; Hardware Architecture; Human-Computer Interaction; Information Retrieval; Information Theory; Logic in Computer Science; Machine Learning; Mathematical Software; Multiagent Systems; Multimedia; Networking and Internet Architecture; Neural and Evolutionary Computing; Numerical Analysis; Operating Systems; Other Computer Science; Performance; Programming Languages; Robotics; Social and Information Networks; Software Engineering; Sound; Symbolic Computation; Systems and Control \n",
+ "\n",
+ "Quantitative Biology\n",
+ "\n",
+ "Quantitative Biology\n",
+ " (q-bio\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Biomolecules; Cell Behavior; Genomics; Molecular Networks; Neurons and Cognition; Other Quantitative Biology; Populations and Evolution; Quantitative Methods; Subcellular Processes; Tissues and Organs \n",
+ "\n",
+ "Quantitative Finance\n",
+ "\n",
+ "Quantitative Finance\n",
+ " (q-fin\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Computational Finance; Economics; General Finance; Mathematical Finance; Portfolio Management; Pricing of Securities; Risk Management; Statistical Finance; Trading and Market Microstructure \n",
+ "\n",
+ "Statistics\n",
+ "\n",
+ "Statistics\n",
+ " (stat\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Applications; Computation; Machine Learning; Methodology; Other Statistics; Statistics Theory \n",
+ "\n",
+ "Electrical Engineering and Systems Science\n",
+ "\n",
+ "Electrical Engineering and Systems Science\n",
+ " (eess\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Audio and Speech Processing; Image and Video Processing; Signal Processing; Systems and Control \n",
+ "\n",
+ "Economics\n",
+ "\n",
+ "Economics\n",
+ " (econ\n",
+ "new,\n",
+ " recent,\n",
+ " search)\n",
+ " includes: (see detailed description):\n",
+ "\n",
+ "Econometrics; General Economics; Theoretical Economics \n",
+ "\n",
+ "\n",
+ "About arXiv\n",
+ "\n",
+ "General information\n",
+ "How to Submit to arXiv\n",
+ "Membership & Giving\n",
+ "Who We Are\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "About\n",
+ "Help\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "contact arXivClick here to contact arXiv\n",
+ " Contact\n",
+ "\n",
+ "\n",
+ "subscribe to arXiv mailingsClick here to subscribe\n",
+ " Subscribe\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Copyright\n",
+ "Privacy Policy\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Web Accessibility Assistance\n",
+ "\n",
+ "\n",
+ "arXiv Operational Status \n",
+ " Get status notifications via\n",
+ " email\n",
+ " or slack\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks - Microsoft Research\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Skip to main content\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Microsoft\n",
+ "\n",
+ "\n",
+ "\n",
+ "Research\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Research\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " Research\n",
+ " \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " Home \n",
+ "\n",
+ "\n",
+ "\n",
+ "Our research\n",
+ "\n",
+ "\n",
+ "Resources\n",
+ "Resources\n",
+ "\n",
+ "\n",
+ "Publications\n",
+ "\n",
+ "\n",
+ "Code & data\n",
+ "\n",
+ "\n",
+ "People\n",
+ "\n",
+ "\n",
+ "Microsoft Research blog\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Research areas: Intelligence\n",
+ "Research areas: Intelligence\n",
+ "\n",
+ "\n",
+ "Artificial intelligence\n",
+ "\n",
+ "\n",
+ "Audio & acoustics\n",
+ "\n",
+ "\n",
+ "Computer vision\n",
+ "\n",
+ "\n",
+ "Graphics & multimedia\n",
+ "\n",
+ "\n",
+ "Human-computer interaction\n",
+ "\n",
+ "\n",
+ "Human language technologies\n",
+ "\n",
+ "\n",
+ "Search & information retrieval\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Research areas: Systems\n",
+ "Research areas: Systems\n",
+ "\n",
+ "\n",
+ "Data platforms and analytics\n",
+ "\n",
+ "\n",
+ "Hardware & devices\n",
+ "\n",
+ "\n",
+ "Programming languages & software engineering\n",
+ "\n",
+ "\n",
+ "Quantum computing\n",
+ "\n",
+ "\n",
+ "Security, privacy & cryptography\n",
+ "\n",
+ "\n",
+ "Systems & networking\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Research areas: Theory\n",
+ "Research areas: Theory\n",
+ "\n",
+ "\n",
+ "Algorithms\n",
+ "\n",
+ "\n",
+ "Mathematics\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Research areas: Other Sciences\n",
+ "Research areas: Other Sciences\n",
+ "\n",
+ "\n",
+ "Ecology & environment\n",
+ "\n",
+ "\n",
+ "Economics\n",
+ "\n",
+ "\n",
+ "Medical, health & genomics\n",
+ "\n",
+ "\n",
+ "Social sciences\n",
+ "\n",
+ "\n",
+ "Technology for emerging markets\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "Programs & events\n",
+ "\n",
+ "\n",
+ "Academic programs\n",
+ "\n",
+ "\n",
+ "Events & academic conferences\n",
+ "\n",
+ "\n",
+ "Microsoft Research Forum\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "Connect & learn\n",
+ "\n",
+ "\n",
+ "Behind the Tech podcast\n",
+ "\n",
+ "\n",
+ "Microsoft Research blog\n",
+ "\n",
+ "\n",
+ "Microsoft Research Forum\n",
+ "\n",
+ "\n",
+ "Microsoft Research podcast\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "About\n",
+ "\n",
+ "\n",
+ "People & news\n",
+ "People & news\n",
+ "\n",
+ "\n",
+ "About Microsoft Research\n",
+ "\n",
+ "\n",
+ "Careers & internships\n",
+ "\n",
+ "\n",
+ "People\n",
+ "\n",
+ "\n",
+ "Emeritus program\n",
+ "\n",
+ "\n",
+ "News & awards\n",
+ "\n",
+ "\n",
+ "Microsoft Research newsletter\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Microsoft Research Labs\n",
+ "Microsoft Research Labs\n",
+ "\n",
+ "\n",
+ "Africa\n",
+ "\n",
+ "\n",
+ "AI for Science\n",
+ "\n",
+ "\n",
+ "AI Frontiers\n",
+ "\n",
+ "\n",
+ "Asia-Pacific\n",
+ "\n",
+ "\n",
+ "Cambridge\n",
+ "\n",
+ "\n",
+ "Health Futures\n",
+ "\n",
+ "\n",
+ "India\n",
+ "\n",
+ "\n",
+ "Montreal\n",
+ "\n",
+ "\n",
+ "New England\n",
+ "\n",
+ "\n",
+ "New York City\n",
+ "\n",
+ "\n",
+ "Redmond\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Other labs\n",
+ "Other labs\n",
+ "\n",
+ "\n",
+ "Applied Sciences\n",
+ "\n",
+ "\n",
+ "Mixed Reality & AI - Cambridge\n",
+ "\n",
+ "\n",
+ "Mixed Reality & AI - Zurich\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "More\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Register: Research Forum\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "\n",
+ "\n",
+ " All Microsoft\n",
+ "\n",
+ "\n",
+ "Global\n",
+ "\n",
+ "\n",
+ "Microsoft Security\n",
+ "\n",
+ "\n",
+ "Azure\n",
+ "\n",
+ "\n",
+ "Dynamics 365\n",
+ "\n",
+ "\n",
+ "Microsoft 365\n",
+ "\n",
+ "\n",
+ "Microsoft Teams\n",
+ "\n",
+ "\n",
+ "Windows 365\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Tech & innovation\n",
+ "Tech & innovation\n",
+ "\n",
+ "\n",
+ "Microsoft Cloud\n",
+ "\n",
+ "\n",
+ "AI\n",
+ "\n",
+ "\n",
+ "Azure Space\n",
+ "\n",
+ "\n",
+ "Mixed reality\n",
+ "\n",
+ "\n",
+ "Microsoft HoloLens\n",
+ "\n",
+ "\n",
+ "Microsoft Viva\n",
+ "\n",
+ "\n",
+ "Quantum computing\n",
+ "\n",
+ "\n",
+ "Sustainability\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Industries\n",
+ "Industries\n",
+ "\n",
+ "\n",
+ "Education\n",
+ "\n",
+ "\n",
+ "Automotive\n",
+ "\n",
+ "\n",
+ "Financial services\n",
+ "\n",
+ "\n",
+ "Government\n",
+ "\n",
+ "\n",
+ "Healthcare\n",
+ "\n",
+ "\n",
+ "Manufacturing\n",
+ "\n",
+ "\n",
+ "Retail\n",
+ "\n",
+ "\n",
+ "All industries\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Partners\n",
+ "Partners\n",
+ "\n",
+ "\n",
+ "Find a partner\n",
+ "\n",
+ "\n",
+ "Become a partner\n",
+ "\n",
+ "\n",
+ "Partner Network\n",
+ "\n",
+ "\n",
+ "Azure Marketplace\n",
+ "\n",
+ "\n",
+ "AppSource\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Resources\n",
+ "Resources\n",
+ "\n",
+ "\n",
+ "Blog\n",
+ "\n",
+ "\n",
+ "Microsoft Advertising\n",
+ "\n",
+ "\n",
+ "Developer Center\n",
+ "\n",
+ "\n",
+ "Documentation\n",
+ "\n",
+ "\n",
+ "Events\n",
+ "\n",
+ "\n",
+ "Licensing\n",
+ "\n",
+ "\n",
+ "Microsoft Learn\n",
+ "\n",
+ "\n",
+ "Microsoft Research\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "View Sitemap\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Search\n",
+ "Search Microsoft Research\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " No results\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Cancel\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "\n",
+ "\n",
+ "Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Adam Fourney\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Gagan Bansal\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Hussein Mozannar\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Cheng Tan\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Eduardo Salinas\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Erkang (Eric) Zhu\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Friederike Niedtner\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "Grace Proebsting\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Griffin Bassman\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Jack Gerrits\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Jacob Alber\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Peter Chang\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Ricky Loynd\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Robert West\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Victor Dibia\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Ahmed Awadallah\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Ece Kamar\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Rafah Hosn\n",
+ "\n",
+ "\t\t\t\t\t\t,\t\t\n",
+ "\n",
+ "\n",
+ "Saleema Amershi\n",
+ "\n",
+ "\n",
+ "\n",
+ "MSR-TR-2024-47 | November 2024\n",
+ "Published by Microsoft\n",
+ "\n",
+ "\n",
+ "\n",
+ "Download BibTex\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Modern AI agents, driven by advances in large foundation models, promise to enhance our productivity and transform our lives by augmenting our knowledge and capabilities. To achieve this vision, AI agents must effectively plan, perform multi-step reasoning and actions, respond to novel observations, and recover from errors, to successfully complete complex tasks across a wide range of scenarios. In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. Magentic-One uses a multi-agent architecture where a lead agent, the Orchestrator, plans, tracks progress, and re-plans to recover from errors. Throughout task execution, the Orchestrator also directs other specialized agents to perform tasks as needed, such as operating a web browser, navigating local files, or writing and executing Python code. Our experiments show that Magentic-One achieves statistically competitive performance to the state-of-the-art on three diverse and challenging agentic benchmarks: GAIA, AssistantBench, and WebArena. Notably, Magentic-One achieves these results without modification to core agent capabilities or to how they collaborate, demonstrating progress towards the vision of generalist agentic systems. Moreover, Magentic-One’s modular design allows agents to be added or removed from the team without additional prompt tuning or training, easing development and making it extensible to future scenarios. We provide an open-source implementation of Magentic-One and AutoGenBench, a standalone agentic evaluation tool. AutoGenBench provides built-in controls for repetition and isolation to run agentic benchmarks where actions may produce side-effects, in a rigorous and contained way. Magentic-One, AutoGenBench and detailed empirical performance evaluations of Magentic-One, including ablations and error analysis are available at https://aka.ms/magentic-one (opens in new tab).\n",
+ "Opens in a new tab \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\tPreprint\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\tGroups\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\t\t\t\t\tHAX Team\t\t\t\t\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\tResearch Areas\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\t\t\t\t\tArtificial intelligence\t\t\t\t\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\tResearch Labs\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\t\t\t\t\tAI Frontiers\t\t\t\t\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\tFollow us:\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "Follow on X\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Like on Facebook\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Follow on LinkedIn\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Subscribe on Youtube\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Follow on Instagram\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Subscribe to our RSS feed\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\t\t\t\t\t\t\tShare this page:\t\t\t\t\t\t\n",
+ "\n",
+ "\n",
+ "\n",
+ "Share on X\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Share on Facebook\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Share on LinkedIn\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Share on Reddit\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "What's new\n",
+ "\n",
+ "\n",
+ "Surface Pro\n",
+ "\n",
+ "\n",
+ "Surface Laptop\n",
+ "\n",
+ "\n",
+ "Surface Laptop Studio 2\n",
+ "\n",
+ "\n",
+ "Surface Laptop Go 3\n",
+ "\n",
+ "\n",
+ "Microsoft Copilot\n",
+ "\n",
+ "\n",
+ "AI in Windows\n",
+ "\n",
+ "\n",
+ "Explore Microsoft products\n",
+ "\n",
+ "\n",
+ "Windows 11 apps\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Microsoft Store\n",
+ "\n",
+ "\n",
+ "Account profile\n",
+ "\n",
+ "\n",
+ "Download Center\n",
+ "\n",
+ "\n",
+ "Microsoft Store support\n",
+ "\n",
+ "\n",
+ "Returns\n",
+ "\n",
+ "\n",
+ "Order tracking\n",
+ "\n",
+ "\n",
+ "Certified Refurbished\n",
+ "\n",
+ "\n",
+ "Microsoft Store Promise\n",
+ "\n",
+ "\n",
+ "Flexible Payments\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Education\n",
+ "\n",
+ "\n",
+ "Microsoft in education\n",
+ "\n",
+ "\n",
+ "Devices for education\n",
+ "\n",
+ "\n",
+ "Microsoft Teams for Education\n",
+ "\n",
+ "\n",
+ "Microsoft 365 Education\n",
+ "\n",
+ "\n",
+ "How to buy for your school\n",
+ "\n",
+ "\n",
+ "Educator training and development\n",
+ "\n",
+ "\n",
+ "Deals for students and parents\n",
+ "\n",
+ "\n",
+ "Azure for students\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Business\n",
+ "\n",
+ "\n",
+ "Microsoft Cloud\n",
+ "\n",
+ "\n",
+ "Microsoft Security\n",
+ "\n",
+ "\n",
+ "Dynamics 365\n",
+ "\n",
+ "\n",
+ "Microsoft 365\n",
+ "\n",
+ "\n",
+ "Microsoft Power Platform\n",
+ "\n",
+ "\n",
+ "Microsoft Teams\n",
+ "\n",
+ "\n",
+ "Microsoft 365 Copilot\n",
+ "\n",
+ "\n",
+ "Small Business\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Developer & IT\n",
+ "\n",
+ "\n",
+ "Azure\n",
+ "\n",
+ "\n",
+ "Microsoft Developer\n",
+ "\n",
+ "\n",
+ "Documentation\n",
+ "\n",
+ "\n",
+ "Microsoft Learn\n",
+ "\n",
+ "\n",
+ "Microsoft Tech Community\n",
+ "\n",
+ "\n",
+ "Azure Marketplace\n",
+ "\n",
+ "\n",
+ "AppSource\n",
+ "\n",
+ "\n",
+ "Visual Studio\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Company\n",
+ "\n",
+ "\n",
+ "Careers\n",
+ "\n",
+ "\n",
+ "About Microsoft\n",
+ "\n",
+ "\n",
+ "Company news\n",
+ "\n",
+ "\n",
+ "Privacy at Microsoft\n",
+ "\n",
+ "\n",
+ "Investors\n",
+ "\n",
+ "\n",
+ "Diversity and inclusion\n",
+ "\n",
+ "\n",
+ "Accessibility\n",
+ "\n",
+ "\n",
+ "Sustainability\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Your Privacy Choices Opt-Out Icon\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Your Privacy Choices\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Your Privacy Choices Opt-Out Icon\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Your Privacy Choices\n",
+ "\n",
+ "\n",
+ "\n",
+ "Consumer Health Privacy\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "Sitemap\n",
+ "\n",
+ "\n",
+ "Contact Microsoft\n",
+ "\n",
+ "\n",
+ "Privacy \n",
+ "\n",
+ "\n",
+ "Manage cookies\n",
+ "\n",
+ "\n",
+ "Terms of use\n",
+ "\n",
+ "\n",
+ "Trademarks\n",
+ "\n",
+ "\n",
+ "Safety & eco\n",
+ "\n",
+ "\n",
+ "Recycling\n",
+ "\n",
+ "\n",
+ "About our ads\n",
+ "\n",
+ "© Microsoft 2025\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " \n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ "RAGENTIC: RAG-Enhanced Multi-Agent Architecture | Microsoft Community HubSkip to contentTech CommunityCommunity HubsProductsTopicsBlogsEventsMicrosoft LearnLoungeRegisterSign InMicrosoft Community HubCommunitiesTopicsArtificial Intelligence and Machine LearningAI - Azure AI services BlogBlog PostAI - Azure AI services Blog 6 MIN READRAGENTIC: RAG-Enhanced Multi-Agent Architectureanilarora1983MicrosoftNov 06, 2024Artificial Intelligence (AI) has come a long way from its early days of simple algorithms and rule-based systems. Today, AI systems are not only capable of learning from data but also making decisions and taking actions autonomously. This evolution has given birth to a new paradigm in AI known as \"Agentic AI Systems.\" These systems are designed to act as agents, capable of perceiving their environment, making decisions, and executing actions to achieve specific goals. In this blog, we will explore the concept of Agentic AI Systems, their applications and we will run through a real-life use case that can be solved with RAGENTIC \n",
+ " \n",
+ "Let's first try to get an understanding of Agentic System and before we delve deeper into the real-life use case. Agentic AI is an AI system that is specifically designed to understand complex workflows and achieve goals autonomously, with little to no human interference. Some key characteristics of agentic AI which include autonomy, Adaptive learning and Reasoning, Language understanding, Workflow Optimization and Multi-agent and System Conversation. The building blocks of each agentic AI includes four main perspectives, Planning, Memory, Tool, and Action. Some of these stages can utilize multiple prompt engineering techniques to solve for a particular stage of a problem like COT, ReACT, Autoplan etc. These agents can also have access to multiple tools, plugins, web search etc. , internally stored documents through RAGs framework to supplement the intrinsic knowledge of the models. This tool usage extends their functionality and enables them to perform a wide range of actions \n",
+ " \n",
+ "There are several existing frameworks for building Agent AI System, once such is AutoGen. It is an open-source programming framework, developed by Microsoft, for building AI agents and facilitating cooperation among multiple agents to solve tasks. AutoGen aims to streamline the development and research of agentic AI, much like PyTorch does for Deep Learning. It offers features such as agents capable of interacting with each other, facilitates the use of various large language models (LLMs) and tool use support, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns. \n",
+ " \n",
+ "\n",
+ "To substantiate the power of multiagent systems, let's look at a real-life use case when RAGs frameworks which once enhanced with AutoGen framework can result in revolutionize this technology even further. \n",
+ "Use case Demonstration: In today's digital age, mobile applications have become a cornerstone for many organizations seeking to offer a seamless shopping experience. However, while attracting customers to download and use an app is a significant milestone, retaining those customers poses a myriad of challenges. The challenges range from \n",
+ "\n",
+ "Lack of Personalization: Customers expect personalized experiences based on their preferences and past behaviour. Generic recommendations or irrelevant promotions can make users feel undervalued. \n",
+ "\n",
+ "\n",
+ "Data Privacy Concerns: Striking a balance between personalization and privacy is delicate. Overstepping boundaries can lead to mistrust and potential churn. \n",
+ "\n",
+ "\n",
+ "Inconsistent Communication: Regular, relevant communication is key. However, over-communication can overwhelm users, while under-communication can make them feel neglected. \n",
+ "\n",
+ "\n",
+ "Numerous Alternatives: With countless apps available, customers can easily switch if they're dissatisfied. Brands must continuously innovate to stay ahead of competitors. \n",
+ "\n",
+ "\n",
+ "Self-Service Options: Modern customers prefer solving issues on their own. Providing robust self-service options within the app can enhance user satisfaction \n",
+ "\n",
+ "\n",
+ "Ignoring User Feedback: Actively seeking and acting on user feedback can help identify pain points and areas for improvement. Ignoring feedback can lead to unresolved issues and customer churn. \n",
+ "\n",
+ "The Solution: Agentic AI Systems \n",
+ "Agentic AI systems offer a robust way to address customer retention challenges in the digital marketplace. By improving user experience, enhancing personalization, streamlining communication, and ensuring data security, these systems can boost customer satisfaction and loyalty. \n",
+ "RAGENTIC, in particular, revolutionizes information processing by seamlessly integrating RAG (Retrieval-Augmented Generation) with autonomous agent networks. At its core lies a Master Agent that orchestrates specialized agents, each enhanced with RAG capabilities for contextual decision-making. The architecture enables efficient data retrieval, processing, and autonomous task execution while maintaining security through its Assistant API layer. This integration produces a robust system that can handle complex workflows while staying grounded in accurate, retrieved information.\n",
+ "Key Benefits:\n",
+ "\n",
+ "Enhanced accuracy through RAG-powered contextual understanding\n",
+ "Autonomous operation with minimal human intervention\n",
+ "Seamless scalability through modular agent architecture\n",
+ "Improved error handling and recovery mechanisms\n",
+ "Real-time data processing and decision making\n",
+ "\n",
+ "Agentic AI System Design :\n",
+ "To effectively address these challenges, we can design specific agents for tasks such as Market Basket Analysis, Clickstream Analysis, Inventory Analysis, Customer Analysis, and Product Recommendations. These agents analyze data from various sources, with a Master Agent coordinating the interaction among them and providing a personalized recommendation to the user. \n",
+ "Each specialized agent in the system is designed to handle a specific task, utilizing the Assistant API to perform database operations, retrieve data, and process results. Here’s an overview of how these agent's function: \n",
+ "1. Market Basket Analysis Agent \n",
+ "\n",
+ "Task: Analyzes transactional data to identify patterns in items frequently purchased together. \n",
+ "\n",
+ "\n",
+ "Working Mechanism: This agent uses the Assistant API to create custom SQL queries, extracting data from transaction logs and analyzing purchase correlations. \n",
+ "\n",
+ "\n",
+ "Example: The agent queries the database for customer purchase history, applies pattern recognition algorithms, and outputs frequently bundled product sets. \n",
+ "\n",
+ "2. Clickstream Analysis Agent \n",
+ "\n",
+ "Task: Examines user navigation paths to track behavior and product engagement. \n",
+ "\n",
+ "\n",
+ "Working Mechanism: This agent leverages the Assistant API to access and parse user session data, constructing insights like heatmaps and user journey paths. \n",
+ "\n",
+ "\n",
+ "Example: Using SQL, the agent pulls detailed clickstream data, which helps identify high-traffic areas and popular product pages. \n",
+ "\n",
+ "3. Inventory Analysis Agent \n",
+ "\n",
+ "Task: Monitors stock levels, predicts potential shortages, and optimizes restocking plans. \n",
+ "\n",
+ "\n",
+ "Working Mechanism: The Assistant API enables the agent to access inventory databases and sales records, allowing it to forecast future inventory needs. \n",
+ "\n",
+ "\n",
+ "Example: The agent queries current stock levels and combines this data with historical sales trends to make restocking recommendations. \n",
+ "\n",
+ "4. Customer Analysis Agent \n",
+ "\n",
+ "Task: Profiles customer demographics and purchasing preferences to understand user behavior. \n",
+ "\n",
+ "\n",
+ "Working Mechanism: This agent uses the Assistant API to pull data from CRM systems and customer feedback, analyzing user attributes and segmenting customers based on their behavior. \n",
+ "\n",
+ "\n",
+ "Example: By querying customer profiles and purchase records, the agent identifies high-value customer segments for targeted marketing. \n",
+ "\n",
+ "5. Product Recommendation Agent \n",
+ "\n",
+ "Task: Generates personalized product recommendations by synthesizing outputs from other agents. \n",
+ "\n",
+ "\n",
+ "Working Mechanism: The agent consolidates insights from the Market Basket Analysis, Clickstream Analysis, and Customer Analysis agents through the Assistant API, generating tailored suggestions. \n",
+ "\n",
+ "\n",
+ "Example: The agent collects data on popular products and customer preferences to recommend items that align with a user’s past behavior and interests. \n",
+ "\n",
+ " We also have a Master Agent which is orchestrating the whole agentic system \n",
+ "\n",
+ "Role: The Master Agent manages the coordination between specialized agents, consolidates their outputs, and delivers a unified response to the user. \n",
+ "\n",
+ "\n",
+ "Workflow: \n",
+ "\n",
+ "Receives user input or system-triggered requests. \n",
+ "Assigns tasks to the relevant specialized agents. \n",
+ "Collects and integrates the results from each agent. \n",
+ "Processes the combined data to create a personalized recommendation. \n",
+ "Returns the response to the user. \n",
+ "\n",
+ "\n",
+ "\n",
+ "Below is an illustrative Python code snippet demonstrating how these agents could be coordinated using AutoGen: \n",
+ " \n",
+ "\n",
+ "We are also utilizing Assistant API which serves as a bridge between agents and the data they need to access. By leveraging this API, agents can: \n",
+ "\n",
+ "Write Custom SQL Queries: Agents dynamically construct and execute SQL queries to extract and process specific data based on their tasks. \n",
+ "\n",
+ "\n",
+ "Ensure Data Security: The API includes authentication protocols to safeguard data access, ensuring that only authorized agents can interact with databases. \n",
+ "\n",
+ "\n",
+ "Streamline Data Processing: The API simplifies the connection between agents and various database types, facilitating quick and accurate data retrieval. \n",
+ "\n",
+ "\n",
+ "Enhance Collaboration: Results generated by one agent can be shared with other agents through API calls, promoting collaboration and cohesive decision-making. \n",
+ "\n",
+ " \n",
+ "Now let's look at an example of how the User Journey on the application is personalized with the above agentic system and how the specialized agents work behind the scenes: \n",
+ "Scenario: Jane Doe, a frequent shopper, logs into her favorite e-commerce app. She browses through a few categories but is unsure about what to purchase.\n",
+ "\n",
+ "Agent Interaction: \n",
+ "\n",
+ "The Clickstream Analysis Agent tracks her navigation path and identifies her interest in specific categories. \n",
+ "The Market Basket Analysis Agent runs MBA machine learning engine to checks for products she previously bought together and suggests related items. \n",
+ "The Customer Analysis Agent leverages her purchase history to identify trends and preferred product types. \n",
+ "The Inventory Analysis Agent ensures that the recommended items are in stock and notifies her of limited availability. \n",
+ "The Product Recommendation Agent, with input from the other agents, provides tailored suggestions, highlighting items that fit her taste and are in stock. \n",
+ "\n",
+ "\n",
+ "\n",
+ " In the below screenshot the graph shows what agents are invoked by the Master Agent to provide hyper personalized recommendation in ZMOT and the recommendation generated for the user.\n",
+ "\n",
+ " As last thoughts, the combination of RAG frameworks and Agentic AI can revolutionize user engagement by delivering personalization, improving communication, customer retention and ensuring quick problem resolution. This integration results in better customer satisfaction, loyalty, and ultimately business growth. \n",
+ " Updated Nov 06, 2024Version 1.0azure ai servicesazure ai studioazure openai serviceCommentCommentanilarora1983MicrosoftJoined July 24, 2024Send MessageView ProfileAI - Azure AI services Blog Follow this blog board to get notified when there's new activityShare What's new Surface Pro 9 Surface Laptop 5 Surface Studio 2+ Surface Laptop Go 2 Surface Laptop Studio Surface Duo 2 Microsoft 365 Windows 11 apps Microsoft Store Account profile Download Center Microsoft Store support Returns Order tracking Virtual workshops and training Microsoft Store Promise Flexible Payments Education Microsoft in education Devices for education Microsoft Teams for Education Microsoft 365 Education Education consultation appointment Educator training and development Deals for students and parents Azure for students Business Microsoft Cloud Microsoft Security Dynamics 365 Microsoft 365 Microsoft Power Platform Microsoft Teams Microsoft Industry Small Business Developer & IT Azure Developer Center Documentation Microsoft Learn Microsoft Tech Community Azure Marketplace AppSource Visual Studio Company Careers About Microsoft Company news Privacy at Microsoft Investors Diversity and inclusion Accessibility Sustainability California Consumer Privacy Act (CCPA) Opt-Out Icon Your Privacy Choices Sitemap Contact Microsoft Privacy Manage cookies Terms of use Trademarks Safety & eco About our ads © Microsoft 2024 \n",
+ "\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "\n",
+ "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "\n",
+ "## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "\n",
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "\n",
+ "---\n",
+ "\n",
+ "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mWebScraper\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: WebScraper\n",
+ "\n",
+ "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "\n",
+ "## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "\n",
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "\n",
+ "---\n",
+ "\n",
+ "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "FINISH\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: WebScraper\n",
+ "\n",
+ "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "\n",
+ "## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "\n",
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "\n",
+ "---\n",
+ "\n",
+ "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "PaperWritingTeam\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "DocWriter\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " write_document (call_YHdDjHG0twLnx5i9eXuPOPbK)\n",
+ " Call ID: call_YHdDjHG0twLnx5i9eXuPOPbK\n",
+ " Args:\n",
+ " content: # Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "\n",
+ "## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "\n",
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ " file_name: multi_agent_architecture_report.md\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: write_document\n",
+ "\n",
+ "Document saved to multi_agent_architecture_report.md\n",
+ "==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "assistant: The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as a markdown file titled **multi_agent_architecture_report.md**. If you need any further modifications or additional content, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mDocWriter\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: DocWriter\n",
"\n",
- "user: The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as a markdown file titled **multi_agent_architecture_report.md**. If you need any further modifications or additional content, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "NoteTaker\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mPaperWritingTeam\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " create_outline (call_hZRhn0zbVWUiqWMslllM325i)\n",
+ " Call ID: call_hZRhn0zbVWUiqWMslllM325i\n",
+ " Args:\n",
+ " points: ['Abstract', '1. Introduction', '2. Background', ' 2.1 Definition of Multi-Agent Systems', ' 2.2 Key Theories Underpinning MAS', ' 2.3 Evolution of MAS', '3. Methodology', ' 3.1 Task Decomposition', ' 3.2 Agent Communication', ' 3.3 Learning Algorithms', ' 3.4 Evaluation Metrics', '4. Applications', ' 4.1 Robotics', ' 4.2 Healthcare', ' 4.3 Financial Markets', ' 4.4 Smart Cities', '5. Challenges', ' 5.1 Agent Coordination', ' 5.2 Ethical Considerations', ' 5.3 Security Concerns', ' 5.4 Integration with Existing Technologies', '6. Conclusions', 'References']\n",
+ " file_name: multi_agent_architecture_outline.md\n",
+ "==================================================\n",
"\n",
- "user: The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
"==================================================\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
+ "Name: create_outline\n",
"\n",
+ "Outline saved to multi_agent_architecture_outline.md\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "\n",
+ "The outline for the report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as **multi_agent_architecture_outline.md**. If you need any further modifications or additional content, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mNoteTaker\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: NoteTaker\n",
+ "\n",
+ "The outline for the report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as **multi_agent_architecture_outline.md**. If you need any further modifications or additional content, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "ChartGenerator\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "\n",
+ "It seems that both the report and the outline for \"Multi-Agent Architecture for Complex Task Performance\" have been successfully created and saved. If you need any further modifications, additional content, or assistance with anything else, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mChartGenerator\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: ChartGenerator\n",
+ "\n",
+ "It seems that both the report and the outline for \"Multi-Agent Architecture for Complex Task Performance\" have been successfully created and saved. If you need any further modifications, additional content, or assistance with anything else, please let me know!\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!\n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "FINISH\n",
+ "==================================================\n",
+ "\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mPaperWritingTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: ChartGenerator\n",
+ "\n",
+ "It seems that both the report and the outline for \"Multi-Agent Architecture for Complex Task Performance\" have been successfully created and saved. If you need any further modifications, additional content, or assistance with anything else, please let me know!\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "FINISH\n",
"==================================================\n"
]
}
@@ -2194,12 +4123,15 @@
" \"\"\"Topic: How to perform complex tasks using multi-agent architecture\n",
"\n",
"Detailed guidelines:\n",
- "- Generate a report in Arxiv paper format on the topic\n",
- "- Create an outline\n",
- "- Write at least 5 sentences for each outline section\n",
- "- Create and add charts if needed when writing detailed content\n",
- "- Write references in APA format\n",
- "- Save final result as .md file\"\"\",\n",
+ "- Generate a report in Arxiv paper format on the topic.\n",
+ "- Create a comprehensive outline that covers all major aspects of the topic, such as introduction, background, methodology, applications, challenges, and conclusions.\n",
+ "- For each section of the outline, write at least 5 detailed sentences that explain the key concepts, theories, and practical applications involved.\n",
+ "- Ensure that for sections where applicable, you create and add charts or diagrams that help clarify complex ideas, such as relationships between agents, tasks, and processes.\n",
+ "- Provide detailed explanations on how multi-agent architecture can be used to solve real-world complex tasks, and include relevant examples and case studies where possible.\n",
+ "- Cite academic papers, articles, and other reliable sources in APA format throughout the content.\n",
+ "- Ensure each section is written in full (not just the outline) and the final document contains substantial content in line with the requested guidelines.\n",
+ "- Save the final result as a .md file with all the content fully populated, including the references section in APA format at the end.\n",
+ "\"\"\",\n",
" recursive_limit=150,\n",
")"
]
@@ -2215,89 +4147,39 @@
},
{
"cell_type": "code",
- "execution_count": 36,
- "metadata": {
- "id": "ceNsQ57_M6YC"
- },
- "outputs": [
- {
- "data": {
- "text/markdown": [
- "The report titled \"How to Perform Complex Tasks Using Multi-Agent Architecture\" has been successfully created and saved as a Markdown file named \"Multi_Agent_Architecture_Report.md\". If you need further modifications or additional content, please let me know!"
- ],
- "text/plain": [
- ""
- ]
- },
- "metadata": {},
- "output_type": "display_data"
- }
- ],
- "source": [
- "from IPython.display import Markdown\n",
- "\n",
- "if hasattr(output[\"messages\"][-1], \"content\"):\n",
- " display(Markdown(output[\"messages\"][-1].content))\n",
- "else:\n",
- " display(Markdown(output[\"messages\"][-1]))"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you haven't seen the final result in Markdown format with the code above, please run the following code to check the results."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 39,
+ "execution_count": 63,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
- "# How to Perform Complex Tasks Using Multi-Agent Architecture\n",
- "\n",
- "## 1. Introduction\n",
- "\n",
- "The introduction of multi-agent systems (MAS) has revolutionized the way complex tasks are approached in computational environments. These systems consist of multiple interacting agents, each capable of autonomous action to achieve specific goals. The primary advantage of MAS lies in their ability to handle tasks that are too complex for a single agent or system to manage. This paper explores the methodologies and architectures that enable multi-agent systems to perform complex tasks efficiently. We will delve into the principles of task decomposition, coordination, and communication that are essential for the successful implementation of MAS.\n",
- "\n",
- "## 2. Background and Related Work\n",
- "\n",
- "The concept of multi-agent systems has its roots in distributed artificial intelligence (DAI), which emerged in the 1980s. Early research focused on the development of autonomous agents capable of decision-making and problem-solving. Over the years, the field has expanded to include various architectures and frameworks that facilitate agent interaction. Notable contributions include the Contract Net Protocol, which outlines a negotiation-based approach for task allocation among agents. Recent advancements have seen the integration of machine learning techniques to enhance agent capabilities and adaptability.\n",
+ "# Report on Multi-Agent Architecture for Complex Task Performance\n",
"\n",
- "## 3. Multi-Agent Architecture Overview\n",
+ "## Abstract\n",
+ "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
"\n",
- "Multi-agent architecture refers to the structural design that defines how agents are organized and interact within a system. There are several types of architectures, including hierarchical, flat, and hybrid models. Hierarchical architectures involve a top-down approach where a central agent coordinates the activities of subordinate agents. In contrast, flat architectures promote equal status among agents, allowing for decentralized decision-making. Hybrid models combine elements of both, offering flexibility in task execution. The choice of architecture depends on the complexity of the task and the desired level of agent autonomy.\n",
- "\n",
- "## 4. Techniques for Task Decomposition\n",
- "\n",
- "Task decomposition is a critical step in enabling multi-agent systems to handle complex tasks. It involves breaking down a large task into smaller, manageable sub-tasks that can be distributed among agents. Techniques such as goal-oriented decomposition and role-based allocation are commonly used. Goal-oriented decomposition focuses on dividing tasks based on the desired outcomes, while role-based allocation assigns tasks based on the specific capabilities of each agent. Effective task decomposition ensures that agents can work concurrently, reducing the overall time required for task completion.\n",
- "\n",
- "## 5. Coordination and Communication Among Agents\n",
- "\n",
- "Coordination and communication are vital components of multi-agent systems, ensuring that agents work harmoniously towards a common goal. Coordination involves the synchronization of agent activities to prevent conflicts and redundancies. Communication protocols, such as message passing and blackboard systems, facilitate information exchange among agents. These protocols enable agents to share knowledge, negotiate roles, and make collective decisions. The efficiency of coordination and communication mechanisms directly impacts the performance of the multi-agent system.\n",
- "\n",
- "## 6. Applications of Multi-Agent Systems\n",
- "\n",
- "Multi-agent systems have found applications in various domains, including robotics, logistics, and healthcare. In robotics, MAS are used to coordinate the actions of multiple robots in tasks such as search and rescue operations. In logistics, they optimize supply chain management by coordinating the movement of goods and resources. In healthcare, MAS assist in patient monitoring and personalized treatment planning. The versatility of multi-agent systems makes them suitable for any scenario that requires distributed problem-solving and decision-making.\n",
+ "## 1. Introduction\n",
+ "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
"\n",
- "## 7. Challenges and Future Directions\n",
+ "## 2. Background\n",
+ "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
"\n",
- "Despite their advantages, multi-agent systems face several challenges, including scalability, security, and interoperability. As the number of agents increases, so does the complexity of coordination and communication. Ensuring the security of agent interactions is crucial, especially in sensitive applications. Interoperability between different MAS frameworks remains a significant hurdle. Future research is focused on addressing these challenges through the development of more robust architectures and protocols. The integration of advanced AI techniques, such as deep learning, holds promise for enhancing agent intelligence and adaptability.\n",
+ "## 3. Methodology\n",
+ "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
"\n",
- "## 8. Conclusion\n",
+ "## 4. Applications\n",
+ "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
"\n",
- "Multi-agent systems offer a powerful approach to tackling complex tasks by leveraging the collective capabilities of autonomous agents. Through effective task decomposition, coordination, and communication, MAS can achieve results that are beyond the reach of individual agents. While challenges remain, ongoing research and technological advancements continue to push the boundaries of what is possible with multi-agent systems. As these systems evolve, they will undoubtedly play an increasingly important role in various fields, driving innovation and efficiency.\n",
+ "## 5. Challenges\n",
+ "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
"\n",
- "## 9. References\n",
+ "## 6. Conclusions\n",
+ "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
"\n",
- "- Jennings, N. R., & Wooldridge, M. (1998). Applications of intelligent agents. In Agent technology (pp. 3-28). Springer, Berlin, Heidelberg.\n",
- "- Weiss, G. (Ed.). (1999). Multiagent systems: a modern approach to distributed artificial intelligence. MIT press.\n",
- "- Sandholm, T. (1999). Distributed rational decision making. In Multiagent systems (pp. 201-258). MIT press.\n",
- "- Stone, P., & Veloso, M. (2000). Multiagent systems: A survey from a machine learning perspective. Autonomous Robots, 8(3), 345-383.\n",
- "- Wooldridge, M. (2009). An introduction to multiagent systems. John Wiley & Sons."
+ "## References\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
+ "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)"
],
"text/plain": [
""
@@ -2308,22 +4190,15 @@
}
],
"source": [
- "import os\n",
- "import glob\n",
- "\n",
- "md_files = glob.glob(\"tmp/*.md\")\n",
- "latest_md = max(md_files, key=os.path.getmtime)\n",
- "with open(latest_md, \"r\", encoding=\"utf-8\") as f:\n",
- " content = f.read()\n",
- " display(Markdown(content))"
+ "# Check the filename generated by the execution in the directory, and update the `md_file` variable below accordingly.\n",
+ "from IPython.display import Markdown\n",
+ "\n",
+ "md_file = (\n",
+ " \"tmp/multi_agent_architecture_report.md\" # Update the filename here if necessary.\n",
+ ")\n",
+ "with open(md_file, \"r\", encoding=\"utf-8\") as f:\n",
+ " display(Markdown(f.read()))"
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
}
],
"metadata": {
From 8ac69d8995efc43e96fe8eaba095e463d6d8784d Mon Sep 17 00:00:00 2001
From: Harheem Kim <49297157+harheem@users.noreply.github.com>
Date: Sat, 1 Feb 2025 12:34:37 +0900
Subject: [PATCH 3/6] fix: Remove unnecessary newlines in scraped content
---
.../08-Hierarchical-Multi-Agent-Teams.ipynb | 11 +++++++++--
1 file changed, 9 insertions(+), 2 deletions(-)
diff --git a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
index 354dc7305..142ea85e3 100644
--- a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
+++ b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
@@ -205,7 +205,9 @@
},
"outputs": [],
"source": [
+ "import re\n",
"from typing import List\n",
+ "from bs4 import BeautifulSoup\n",
"from langchain_community.document_loaders import WebBaseLoader\n",
"from langchain_community.tools.tavily_search import TavilySearchResults\n",
"from langchain_core.tools import tool\n",
@@ -227,11 +229,16 @@
" },\n",
" )\n",
" docs = loader.load()\n",
- "\n",
+ " \n",
+ " def clean_text(html: str) -> str:\n",
+ " soup = BeautifulSoup(html, \"html.parser\")\n",
+ " text = soup.get_text(separator=\" \").strip()\n",
+ " return re.sub(r'\\s+', ' ', text) # Remove excessive whitespace\n",
+ " \n",
" # Create a string containing titles and content of loaded documents\n",
" return \"\\n\\n\".join(\n",
" [\n",
- " f'\\n{doc.page_content}\\n'\n",
+ " f'\\n{clean_text(doc.page_content)}\\n'\n",
" for doc in docs\n",
" ]\n",
" )"
From 22c0c669228025b8f05385b6244629452bd87067 Mon Sep 17 00:00:00 2001
From: Harheem Kim <49297157+harheem@users.noreply.github.com>
Date: Tue, 4 Feb 2025 09:34:24 +0900
Subject: [PATCH 4/6] docs: reflect suggestions
---
...11-Convex-Combination-Ensemble-Retriever.ipynb | 15 ++++++++-------
1 file changed, 8 insertions(+), 7 deletions(-)
diff --git a/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb b/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb
index 87d0d994e..45a734572 100644
--- a/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb
+++ b/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb
@@ -16,7 +16,8 @@
"\n",
"## Overview\n",
"\n",
- "This tutorial focuses on implementing and comparing different ensemble retrieval methods in LangChain. While LangChain's built-in EnsembleRetriever uses the Reciprocal Rank Fusion (RRF) method, we'll explore an additional approach by implementing the **Convex Combination (CC)** method. The tutorial guides you through creating custom implementations of both RRF and CC methods, allowing for a direct performance comparison between these ensemble techniques.\n",
+ "This tutorial focuses on implementing and comparing different ensemble retrieval methods in LangChain. While LangChain's built-in `EnsembleRetriever` uses the **Reciprocal Rank Fusion (`RRF`)** method, we'll explore an additional approach by implementing the **Convex Combination (`CC`)** method.
\n",
+ "The tutorial guides you through creating custom implementations of both **`RRF` and `CC` methods** , allowing for a direct performance comparison between these ensemble techniques.\n",
"\n",
"### Table of Contents\n",
"\n",
@@ -29,7 +30,7 @@
"\n",
"### References\n",
"\n",
- "- [LangChain Python API Reference > langchain: 0.3.14 > retrievers > EnsembleRetriever](https://python.langchain.com/api_reference/langchain/retrievers/langchain.retrievers.ensemble.EnsembleRetriever.html)\n",
+ "- [EnsembleRetriever](https://python.langchain.com/api_reference/langchain/retrievers/langchain.retrievers.ensemble.EnsembleRetriever.html)\n",
"----"
]
},
@@ -116,7 +117,7 @@
"id": "690a9ae0",
"metadata": {},
"source": [
- "You can alternatively set `OPENAI_API_KEY` in `.env` file and load it.\n",
+ "You can alternatively set `OPENAI_API_KEY` in **.env** file and load it.\n",
"\n",
"[Note] This is not necessary if you've already set `OPENAI_API_KEY` in previous steps."
]
@@ -183,7 +184,7 @@
"source": [
"## Initialize Retrievers\n",
"\n",
- "This section initializes retrievers to implement two different search approaches. We create embeddings using OpenAI's text-embedding-3-small model and set up `FAISS` vector search based on these embeddings. \n",
+ "This section initializes retrievers to implement two different search approaches. We create embeddings using OpenAI's `text-embedding-3-small` model and set up `FAISS` vector search based on these embeddings. \n",
"\n",
"Additionally, we configure a `BM25` retriever for keyword-based search, with both retrievers set to return the top 5 most relevant results."
]
@@ -219,9 +220,9 @@
"source": [
"## Implement Ensemble Retrievers\n",
"\n",
- "This section introduces a custom retriever implementing two ensemble search methods, designed to compare performance against LangChain's built-in `EnsembleRetriever`. \n",
+ "This section introduces a custom retriever implementing two ensemble search methods, designed to compare performance against LangChain's built-in `EnsembleRetriever` . \n",
"\n",
- "We implement both Reciprocal Rank Fusion (RRF), which combines results based on document rankings, and **Convex Combination (CC)**, which utilizes normalized scores. \n",
+ "We implement both **Reciprocal Rank Fusion (`RRF`)** , which combines results based on document rankings, and **Convex Combination (`CC`)** , which utilizes normalized scores. \n",
"\n",
"Both methods integrate results from `FAISS` and `BM25` retrievers to provide more accurate and diverse search results, allowing users to select the most suitable ensemble approach for their needs."
]
@@ -364,7 +365,7 @@
"\n",
"This section presents a test function for comparing ensemble retrieval results. \n",
"\n",
- "While the 'RRF' method, which follows LangChain's default implementation, produces identical results to 'Original', the 'CC' method utilizing normalized scores and weights offers different search patterns. \n",
+ "While the **`RRF` method** , which follows LangChain's default implementation, produces identical results to **Original** , the **`CC` method** utilizing normalized scores and weights offers different search patterns. \n",
"\n",
"By testing with real queries and comparing these approaches, we can identify which ensemble method better suits our project requirements."
]
From d6e7a26ba969ae7ce45848ebb099c593e43c8a33 Mon Sep 17 00:00:00 2001
From: Harheem Kim <49297157+harheem@users.noreply.github.com>
Date: Tue, 4 Feb 2025 10:16:37 +0900
Subject: [PATCH 5/6] revert commit
---
...11-Convex-Combination-Ensemble-Retriever.ipynb | 15 +++++++--------
1 file changed, 7 insertions(+), 8 deletions(-)
diff --git a/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb b/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb
index 45a734572..87d0d994e 100644
--- a/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb
+++ b/10-Retriever/11-Convex-Combination-Ensemble-Retriever.ipynb
@@ -16,8 +16,7 @@
"\n",
"## Overview\n",
"\n",
- "This tutorial focuses on implementing and comparing different ensemble retrieval methods in LangChain. While LangChain's built-in `EnsembleRetriever` uses the **Reciprocal Rank Fusion (`RRF`)** method, we'll explore an additional approach by implementing the **Convex Combination (`CC`)** method.
\n",
- "The tutorial guides you through creating custom implementations of both **`RRF` and `CC` methods** , allowing for a direct performance comparison between these ensemble techniques.\n",
+ "This tutorial focuses on implementing and comparing different ensemble retrieval methods in LangChain. While LangChain's built-in EnsembleRetriever uses the Reciprocal Rank Fusion (RRF) method, we'll explore an additional approach by implementing the **Convex Combination (CC)** method. The tutorial guides you through creating custom implementations of both RRF and CC methods, allowing for a direct performance comparison between these ensemble techniques.\n",
"\n",
"### Table of Contents\n",
"\n",
@@ -30,7 +29,7 @@
"\n",
"### References\n",
"\n",
- "- [EnsembleRetriever](https://python.langchain.com/api_reference/langchain/retrievers/langchain.retrievers.ensemble.EnsembleRetriever.html)\n",
+ "- [LangChain Python API Reference > langchain: 0.3.14 > retrievers > EnsembleRetriever](https://python.langchain.com/api_reference/langchain/retrievers/langchain.retrievers.ensemble.EnsembleRetriever.html)\n",
"----"
]
},
@@ -117,7 +116,7 @@
"id": "690a9ae0",
"metadata": {},
"source": [
- "You can alternatively set `OPENAI_API_KEY` in **.env** file and load it.\n",
+ "You can alternatively set `OPENAI_API_KEY` in `.env` file and load it.\n",
"\n",
"[Note] This is not necessary if you've already set `OPENAI_API_KEY` in previous steps."
]
@@ -184,7 +183,7 @@
"source": [
"## Initialize Retrievers\n",
"\n",
- "This section initializes retrievers to implement two different search approaches. We create embeddings using OpenAI's `text-embedding-3-small` model and set up `FAISS` vector search based on these embeddings. \n",
+ "This section initializes retrievers to implement two different search approaches. We create embeddings using OpenAI's text-embedding-3-small model and set up `FAISS` vector search based on these embeddings. \n",
"\n",
"Additionally, we configure a `BM25` retriever for keyword-based search, with both retrievers set to return the top 5 most relevant results."
]
@@ -220,9 +219,9 @@
"source": [
"## Implement Ensemble Retrievers\n",
"\n",
- "This section introduces a custom retriever implementing two ensemble search methods, designed to compare performance against LangChain's built-in `EnsembleRetriever` . \n",
+ "This section introduces a custom retriever implementing two ensemble search methods, designed to compare performance against LangChain's built-in `EnsembleRetriever`. \n",
"\n",
- "We implement both **Reciprocal Rank Fusion (`RRF`)** , which combines results based on document rankings, and **Convex Combination (`CC`)** , which utilizes normalized scores. \n",
+ "We implement both Reciprocal Rank Fusion (RRF), which combines results based on document rankings, and **Convex Combination (CC)**, which utilizes normalized scores. \n",
"\n",
"Both methods integrate results from `FAISS` and `BM25` retrievers to provide more accurate and diverse search results, allowing users to select the most suitable ensemble approach for their needs."
]
@@ -365,7 +364,7 @@
"\n",
"This section presents a test function for comparing ensemble retrieval results. \n",
"\n",
- "While the **`RRF` method** , which follows LangChain's default implementation, produces identical results to **Original** , the **`CC` method** utilizing normalized scores and weights offers different search patterns. \n",
+ "While the 'RRF' method, which follows LangChain's default implementation, produces identical results to 'Original', the 'CC' method utilizing normalized scores and weights offers different search patterns. \n",
"\n",
"By testing with real queries and comparing these approaches, we can identify which ensemble method better suits our project requirements."
]
From c4a2377cfcda07f39d630376494eeea44b20e1a8 Mon Sep 17 00:00:00 2001
From: Harheem Kim <49297157+harheem@users.noreply.github.com>
Date: Tue, 4 Feb 2025 10:24:25 +0900
Subject: [PATCH 6/6] docs: rerun all notebook cells
---
.../08-Hierarchical-Multi-Agent-Teams.ipynb | 3265 ++++-------------
1 file changed, 707 insertions(+), 2558 deletions(-)
diff --git a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
index 142ea85e3..651eb3513 100644
--- a/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
+++ b/17-LangGraph/03-Use-Cases/08-Hierarchical-Multi-Agent-Teams.ipynb
@@ -66,7 +66,7 @@
},
{
"cell_type": "code",
- "execution_count": 16,
+ "execution_count": 1,
"metadata": {
"id": "dFiZS1vM3GB_"
},
@@ -78,7 +78,7 @@
},
{
"cell_type": "code",
- "execution_count": 17,
+ "execution_count": 2,
"metadata": {
"id": "X5qTujC_3Hbs"
},
@@ -116,9 +116,13 @@
},
{
"cell_type": "code",
- "execution_count": 18,
+ "execution_count": 3,
"metadata": {
- "id": "cgmm0kl61s0l"
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "cgmm0kl61s0l",
+ "outputId": "43d49049-86a5-4715-bc89-6fb452d111f3"
},
"outputs": [
{
@@ -158,22 +162,11 @@
},
{
"cell_type": "code",
- "execution_count": 19,
+ "execution_count": null,
"metadata": {
"id": "ML63b6Tu5QUO"
},
- "outputs": [
- {
- "data": {
- "text/plain": [
- "False"
- ]
- },
- "execution_count": 19,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
+ "outputs": [],
"source": [
"from dotenv import load_dotenv\n",
"\n",
@@ -199,7 +192,7 @@
},
{
"cell_type": "code",
- "execution_count": 20,
+ "execution_count": null,
"metadata": {
"id": "4vT-Mlb45XxH"
},
@@ -258,7 +251,7 @@
},
{
"cell_type": "code",
- "execution_count": 21,
+ "execution_count": 6,
"metadata": {
"id": "vzJ8v99U5gb3"
},
@@ -356,7 +349,7 @@
},
{
"cell_type": "code",
- "execution_count": 22,
+ "execution_count": 7,
"metadata": {
"id": "LJ15Z5Lh5ktg"
},
@@ -385,7 +378,7 @@
},
{
"cell_type": "code",
- "execution_count": 27,
+ "execution_count": 8,
"metadata": {
"id": "m-YR2KhBAR8Y"
},
@@ -433,7 +426,7 @@
},
{
"cell_type": "code",
- "execution_count": 28,
+ "execution_count": 9,
"metadata": {
"id": "f1UeOGUMAjPz"
},
@@ -458,7 +451,7 @@
},
{
"cell_type": "code",
- "execution_count": 29,
+ "execution_count": 10,
"metadata": {
"id": "_XlLBl1HAmSQ"
},
@@ -523,7 +516,7 @@
},
{
"cell_type": "code",
- "execution_count": 30,
+ "execution_count": 11,
"metadata": {
"id": "sUHIgF8ZAt4n"
},
@@ -581,7 +574,7 @@
},
{
"cell_type": "code",
- "execution_count": 31,
+ "execution_count": 12,
"metadata": {
"id": "e-JnX3jvA5Us"
},
@@ -604,8 +597,10 @@
},
{
"cell_type": "code",
- "execution_count": 40,
- "metadata": {},
+ "execution_count": 13,
+ "metadata": {
+ "id": "HoCCKZVSdU2a"
+ },
"outputs": [],
"source": [
"from langchain_opentutorial.graphs import visualize_graph\n",
@@ -615,9 +610,14 @@
},
{
"cell_type": "code",
- "execution_count": 36,
+ "execution_count": 14,
"metadata": {
- "id": "qcwMe2OoKwc3"
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 538
+ },
+ "id": "qcwMe2OoKwc3",
+ "outputId": "fb6a4ad7-fb39-42cf-aa96-59c2ad55b44e"
},
"outputs": [
{
@@ -672,8 +672,10 @@
},
{
"cell_type": "code",
- "execution_count": 41,
- "metadata": {},
+ "execution_count": 15,
+ "metadata": {
+ "id": "C5H2mHrjdU2a"
+ },
"outputs": [],
"source": [
"from langchain_core.runnables import RunnableConfig\n",
@@ -699,9 +701,13 @@
},
{
"cell_type": "code",
- "execution_count": 42,
+ "execution_count": 16,
"metadata": {
- "id": "pUIWVFjOLQoC"
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "pUIWVFjOLQoC",
+ "outputId": "bea38acb-6e07-48ac-e398-4a4f1c65af59"
},
"outputs": [
{
@@ -721,8 +727,8 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"Tool Calls:\n",
- " scrape_webpages (call_1Z7X9foecra9LF4NABoCmYw9)\n",
- " Call ID: call_1Z7X9foecra9LF4NABoCmYw9\n",
+ " scrape_webpages (call_ASFYUuM6HWrcZu0sqXoBDoCI)\n",
+ " Call ID: call_ASFYUuM6HWrcZu0sqXoBDoCI\n",
" Args:\n",
" urls: ['https://finance.yahoo.com/']\n",
"==================================================\n",
@@ -734,110 +740,7 @@
"Name: scrape_webpages\n",
"\n",
"\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Yahoo Finance - Stock Market Live, Quotes, Business & Finance News \n",
- "\n",
- "\n",
- "\n",
- " News Today's news US Politics World Tech Reviews and deals Audio Computing Gaming Health Home Phones Science TVs Climate change Health Science Originals The 360 Newsletters Life Health COVID-19 Fall allergies Health news Mental health Relax Sexual health Studies The Unwind Parenting Family health So mini ways Style and beauty It Figures Unapologetically Horoscopes Shopping Buying guides Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to Watch Interviews Videos Finance My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth ETF Report Options 101 Davos 2025 Streaming Now Sports Fantasy News Fantasy football Best Ball Pro Pick 'Em College Pick 'Em Fantasy baseball Fantasy hockey Fantasy basketball Download the app Daily fantasy NFL News Scores and schedules Standings Stats Teams Players Drafts Injuries Odds Super Bowl GameChannel Videos MLB News Scores and schedules Standings Stats Teams Players Odds Videos World Baseball Classic NBA News Draft Scores and schedules Standings Stats Teams Players Injuries Videos Odds Playoffs NHL News Scores and schedules Standings Stats Teams Players Odds Playoffs Soccer News Scores and schedules Premier League MLS NWSL Liga MX CONCACAF League Champions League La Liga Serie A Bundesliga Ligue 1 World Cup College football News Scores and schedules Standings Rankings Stats Teams Show all MMA WNBA Sportsbook NCAAF Tennis Golf NASCAR NCAAB NCAAW Boxing USFL Cycling Motorsports Olympics Horse racing GameChannel Rivals Newsletters Podcasts Videos RSS Jobs Help World Cup More news New on Yahoo Creators Games Tech Local Services Terms Privacy Privacy Dashboard Feedback © 2025 All rights reserved. About our ads Advertising Careers Yahoo Finance Yahoo Finance Search query Select edition USEnglish US y LATAMEspañol AustraliaEnglish CanadaEnglish CanadaFrançais DeutschlandDeutsch FranceFrançais 香港繁中 MalaysiaEnglish New ZealandEnglish SingaporeEnglish 台灣繁中 UKEnglish News Finance Sports More News Today's news US Politics World Weather Climate change Health Science Originals Newsletters Life Health Parenting Style and beauty Horoscopes Shopping Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to watch Interviews Videos Finance My portfolio Watchlists Markets News Videos Screeners Personal finance Crypto Industries Sports Fantasy NFL NBA MLB NHL College football College basketball Soccer MMA Yahoo Sports AM New on Yahoo Creators Games Tech Local Services Selected edition USEnglish Mail Sign in My Portfolio News Latest Stock Market Originals The Morning Brief Economies Earnings Tech Housing Crypto Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth ETF Report Options 101 Davos 2025 Streaming Now … Trump's 'one very big power over China' could seal the fate of TikTok President Donald Trump indicated that imposing tariffs on China could be an outcome of a TikTok deal falling through. Yahoo Finance • 8 hours ago ^DJI -0.32% AI-driven power boom will drive demand 38% higher on top US grid Bloomberg • 2 hours ago Banks prepare to offload as much as $3 billion of X buyout debt Bloomberg • 2 hours ago Stocks stall but cap winning week as Trump's remarks move markets Yahoo Finance • 4 hours ago ^IXIC -0.50% ES=F -0.33% Ray Dalio: It's time for society to think about alternative money Yahoo Finance • 6 hours ago DX=F -0.56% BofA, JPMorgan to lobby DC after conservative criticism Reuters • 2 hours ago BAC +0.28% Target ending DEI initiatives amid Trump order on diversity programs Reuters • 2 hours ago TGT +0.47% Apple tests key technical level in worst start since 2008 Bloomberg • 4 hours ago AAPL -0.39% ^GSPC -0.29% Bankrupt Purdue buys time to advance $7.4 billion opioid deal Reuters • 4 hours ago Tesla launches refreshed Model Y in bid to boost sales Yahoo Finance • 10 hours ago TSLA -1.41% View More Latest In motion to dismiss, chatbot platform Character AI claims it is protected by the First Amendment TechCrunch • 1 hour ago Meta reveals how it plans to make money on Threads, its X competitor with 300 million users Business Insider • 1 hour ago Banks Prepare to Sell as Much as $3 Billion of X Buyout Debt Bloomberg • 2 hours ago AI-Driven Power Boom Will Drive Demand 38% Higher on Top US Grid Bloomberg • 2 hours ago Did Trump's Crypto Order Deliver On His Promise Of A Bitcoin Strategic Reserve? Investopedia • 2 hours ago Meta Plans to Spend as Much as $65B in 'Defining Year for AI' Says Zuckerberg Investopedia • 2 hours ago META +1.73% Existing Home Sales Hit a Multidecade Low. Why Are Economists Upbeat About the Housing Market? Investopedia • 2 hours ago Texas Instruments Stock Leads Analog Chip Shares Lower Investopedia • 2 hours ago TXN -7.52% ADI -4.81% Popular Tesla Just Raised Prices in a Key Market. What It Means for the Stock. Barrons.com • yesterday TSLA -1.41% Lightspeed’s $2 Billion Anthropic Megadeal Cements VC Firm’s AI Ambitions Bloomberg • 5 hours ago As Apple Earnings Near, More Investors Think the iPhone Maker’s Stock Is Primed to Fall Barrons.com • yesterday AAPL -0.39% American Express Earnings Show Its Card Members Keep Spending. Why the Stock Is Falling. Barrons.com • 11 hours ago AXP -1.39% Crypto's charms leave some investors in Davos cold Reuters • yesterday Analyst Report: Regions Financial Corporation Morningstar Research • 4 hours ago RF +0.87% RF-PE -0.49% MicroStrategy Suddenly Has a Tax Problem, and Needs Help From Trump’s IRS The Wall Street Journal • 15 hours ago BRK-B +0.73% MSTR -5.21% Big Tech earnings, Fed rate decision: What to Watch Next Week Market Domination Overtime hosts Julie Hyman and Josh Lipton detail the top market events on the horizon for the upcoming week. A slew of Big Tech earnings is scheduled for release, with industry giants Tesla (TSLA), Microsoft (MSFT), Meta Platforms (META), and Apple (AAPL) preparing to unveil their quarterly results. Artificial intelligence will be a central theme in these earnings reports. Additionally, the Federal Reserve's January Federal Open Market Committee (FOMC) meeting will commence on Tuesday, with the central bank's interest rate decision set to be announced on Wednesday at 2 p.m. ET. To watch more expert insights and analysis on the latest market action, check out more Market Domination Overtime here. This post was written by Angel Smith 2h ago META +1.73% TSLA -1.41% 01:01 President Trump, Davos 2025, Netflix's Q4 earnings: ICYMI Yahoo Finance Video • 2h ago 05:40 Fed will do 'whatever it takes' to hit dual mandate Yahoo Finance Video • 2h ago ^GSPC -0.29% ^DJI -0.32% 03:08 Strong US dollar is sending Americans to Europe. Here's why. Yahoo Finance Video • 2h ago DX=F -0.56% DX-Y.NYB -0.54% 03:14 Stargate & OpenAI announcement is 'net positive' for Microsoft Yahoo Finance Video • 3h ago MSFT -0.59% View More Investment Ideas Build Your Wealth View More More News Copyright © 2025 Yahoo. All rights reserved. POPULAR QUOTES Dow Jones S&P 500 DAX Index Nvidia Tesla DJT EXPLORE MORE Mortgages Credit Cards Sectors Crypto Heatmap Financial News ABOUT Data Disclaimer Help Feedback Sitemap Licensing What's New About Our Ads Terms and Privacy Policy Privacy Dashboard U.S. markets closed US Europe Asia Rates Commodities Currencies Cryptocurrencies S&P 500 6,101.24 -17.47 (-0.29%) Dow 30 44,424.25 -140.82 (-0.32%) Nasdaq 19,954.30 -99.38 (-0.50%) Russell 2000 2,307.74 -6.86 (-0.30%) Crude Oil 74.60 -0.02 (-0.03%) Gold 2,777.40 +12.40 (+0.45%) My Portfolios My Portfolios Sign in to access your portfolio Sign in Top Gainers TWLO Twilio Inc. 136.23 +22.83 (+20.13%) MIDD The Middleby Corporation 169.72 +24.07 (+16.53%) SGHC Super Group (SGHC) Limited 7.90 +1.09 (+16.01%) DAN Dana Incorporated 15.20 +1.97 (+14.89%) KC Kingsoft Cloud Holdings Limited 10.79 +1.25 (+13.10%) Top Losers ERIC Telefonaktiebolaget LM Ericsson (publ) 7.62 -1.28 (-14.38%) FNMA Federal National Mortgage Association 5.73 -0.77 (-11.85%) FTAI FTAI Aviation Ltd. 90.98 -10.52 (-10.36%) FMCC Federal Home Loan Mortgage Corporation 5.24 -0.44 (-7.75%) TXN Texas Instruments Incorporated 185.52 -15.09 (-7.52%) Most Active RGTI Rigetti Computing, Inc. 13.20 -0.27 (-2.00%) NVDA NVIDIA Corporation 142.62 -4.60 (-3.12%) LCID Lucid Group, Inc. 2.7900 +0.0700 (+2.57%) PLTR Palantir Technologies Inc. 78.98 0.00 (0.00%) BBD Banco Bradesco S.A. 1.9300 -0.0100 (-0.52%) Trending Tickers NVNI Nvni Group Limited 4.4000 +2.9100 (+195.30%) GOEV Canoo Inc. 0.4750 +0.1450 (+43.94%) NVO Novo Nordisk A/S 87.97 +6.87 (+8.47%) VG Vonage Holdings Corp. 24.00 -1.00 (-4.00%) MSTR MicroStrategy Incorporated 353.67 -19.45 (-5.21%) Top Economic Events Terms and Privacy Policy Privacy Dashboard Ad Terms Feedback \n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
+ "Yahoo Finance - Stock Market Live, Quotes, Business & Finance News News Today's news US Politics World Tech Reviews and deals Audio Computing Gaming Health Home Phones Science TVs Climate change Health Science Originals The 360 Newsletters Life Health COVID-19 Fall allergies Health news Mental health Relax Sexual health Studies The Unwind Parenting Family health So mini ways Style and beauty It Figures Unapologetically Horoscopes Shopping Buying guides Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to Watch Interviews Videos Finance My Portfolio News Latest Stock Market Originals The Morning Brief Economies Tariff Updates Premium News Earnings Tech Housing Crypto Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Investment Ideas Research Reports Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth ETF Report Options 101 Davos 2025 Streaming Now Sports Fantasy News Fantasy football Best Ball Pro Pick 'Em College Pick 'Em Fantasy baseball Fantasy hockey Fantasy basketball Download the app Daily fantasy NFL News Scores and schedules Standings Stats Teams Players Drafts Injuries Odds Super Bowl GameChannel Videos MLB News Scores and schedules Standings Stats Teams Players Odds Videos World Baseball Classic NBA News Draft Scores and schedules Standings Stats Teams Players Injuries Videos Odds Playoffs NHL News Scores and schedules Standings Stats Teams Players Odds Playoffs Soccer News Scores and schedules Premier League MLS NWSL Liga MX CONCACAF League Champions League La Liga Serie A Bundesliga Ligue 1 World Cup College football News Scores and schedules Standings Rankings Stats Teams Show all MMA WNBA Sportsbook NCAAF Tennis Golf NASCAR NCAAB NCAAW Boxing USFL Cycling Motorsports Olympics Horse racing GameChannel Rivals Newsletters Podcasts Videos RSS Jobs Help World Cup More news New on Yahoo Creators Games Tech Local Services Terms Privacy Privacy Dashboard Feedback © 2025 All rights reserved. About our ads Advertising Careers Yahoo Finance Yahoo Finance Search query Select edition USEnglish US y LATAMEspañol AustraliaEnglish CanadaEnglish CanadaFrançais DeutschlandDeutsch FranceFrançais 香港繁中 MalaysiaEnglish New ZealandEnglish SingaporeEnglish 台灣繁中 UKEnglish News Finance Sports More News Today's news US Politics World Weather Climate change Health Science Originals Newsletters Life Health Parenting Style and beauty Horoscopes Shopping Food Travel Autos Gift ideas Buying guides Entertainment Celebrity TV Movies Music How to watch Interviews Videos Finance My portfolio Watchlists Markets News Videos Screeners Personal finance Crypto Industries Sports Fantasy NFL NBA MLB NHL College football College basketball Soccer MMA Yahoo Sports AM New on Yahoo Creators Games Tech Local Services Selected edition USEnglish Mail Sign in My Portfolio News Latest Stock Market Originals The Morning Brief Economies Tariff Updates Premium News Earnings Tech Housing Crypto Markets Stocks: Most Actives Stocks: Gainers Stocks: Losers Trending Tickers Futures World Indices US Treasury Bonds Rates Currencies Crypto Top ETFs Top Mutual Funds Options: Highest Open Interest Options: Highest Implied Volatility Sectors Basic Materials Communication Services Consumer Cyclical Consumer Defensive Energy Financial Services Healthcare Industrials Real Estate Technology Utilities Research Screeners Calendar Stock Comparison Advanced Chart Currency Converter Investment Ideas Research Reports Personal Finance Credit Cards Banking Best HYSA Student Loans Personal Loans Insurance Mortgages Mortgage Calculator Taxes Videos Latest Editor's Picks Investing Insights Trending Stocks All Shows Morning Brief Opening Bid Wealth ETF Report Options 101 Davos 2025 Streaming Now … Upgrade to Premium Trump agrees to delay tariffs on Canada and Mexico The US neighbors committed to sending more resources to their borders. Tariffs on China are still set to take effect Tuesday. LIVE Yahoo Finance • 2 hours ago Trump's 10% tariffs on China could hit Big Tech hard Yahoo Finance • 3 hours ago AMD -1.45% QCOM -1.56% Palantir surges after upbeat revenue forecast on AI strength Reuters • 1 hour ago PLTR +1.52% Dow, S&P 500, Nasdaq futures rise as Trump pauses tariffs on Canada, Mexico LIVE Yahoo Finance • 7 minutes ago TSLA -5.17% NQ=F +0.71% Goolsbee says Trump policies could slow Fed's rate cutting Bloomberg • 1 hour ago US Treasury chief takes over CFPB, freezes all activity Reuters • 2 hours ago Target is sued for defrauding shareholders about DEI Reuters • 2 hours ago TGT -2.72% Canada's dollar, Mexican peso rebound as US tariffs delayed Bloomberg • 2 hours ago Trump says sovereign wealth fund could play role in TikTok rescue Yahoo Finance • 5 hours ago NVDA -2.84% AMZN -0.11% Tesla stock sinks, leading auto stocks down Yahoo Finance • 3 hours ago TM -2.79% TOYOF +0.69% View More Latest Toyota likely to post second straight profit drop as sales growth cools Reuters • 4 minutes ago TM -2.79% China May Loosen Grip on Yuan as Trump Reignites Trade War Bloomberg • 25 minutes ago ANZGF -0.06% Salesforce to cut 1,000 roles, Bloomberg News reports Reuters • 38 minutes ago CRM -0.72% Asian Shares Advance After Trump Delays Tariffs: Markets Wrap Bloomberg • 47 minutes ago ^GSPC -0.76% Oil Falls After Trump Delays Canada, Mexico Tariffs by a Month Bloomberg • 48 minutes ago Salesforce Cutting 1,000 Roles While Hiring Salespeople for AI Bloomberg • 51 minutes ago CRM -0.72% Fuji Media’s Crisis Entices Investors Betting on Changes Bloomberg • 1 hour ago Palantir Surges on Outlook Fueled by ‘Untamed’ AI Demand Bloomberg • 1 hour ago PLTR +1.52% Popular When will I get my W-2? Deadline for employers to send was Jan. 31 USA TODAY • yesterday Trump has confused the C-suite Yahoo Finance • yesterday IBM +1.97% GM -3.15% Another Thing Musk Hates About Germany: Absentee Workers at His Tesla Plant The Wall Street Journal • 9 hours ago TSLA -5.17% Wall Street girds for market impact of Trump tariffs Reuters • yesterday Australia shoppers lured by discounts, shore up economy in 4Q Reuters • 23 hours ago Analyst Report: W.W. Grainger, Inc. Morningstar Research • 2 hours ago GWW -1.50% Disney Stock Has a Lot to Prove This Week Motley Fool • 12 hours ago DIS +0.83% Market forecast: What February historically means for equities January was a strong month for stocks (^GSPC, ^IXIC, ^DJI), but history suggests a rocky February ahead for equities. Yahoo Finance markets and data editor Jared Blikre appears on Catalysts to explain that February typically sees a market rise early in the month, peaking around mid-February and then followed by a decline. He notes potential bearish signals, like rising tariffs and a spike in the volatility index (^VIX), and which sectors are facing the highest wave of volatility. To watch more expert insights and analysis on the latest market action, check out more Catalysts here. This post was written by Josh Lynch 7h ago ^IXIC -1.20% XLV +0.39% 05:06 This generation is leading AI use at work, and it's not Gen Z Yahoo Finance Video • 4h ago 04:37 The 'three-fold' applications of Trump's tariff strategy Yahoo Finance Video • 3h ago 05:49 Palantir's commercial sales 'momentum' may continue in 2025 Yahoo Finance Video • 3h ago PLTR +1.52% 02:28 Companies with tariff-prepared CEOs will 'be rewarded' Yahoo Finance Video • 6h ago META +1.20% AMZN -0.11% View More Investment Ideas Build Your Wealth View More More News Copyright © 2025 Yahoo. All rights reserved. POPULAR QUOTES Dow Jones S&P 500 DAX Index Nvidia Tesla DJT Tariffs EXPLORE MORE Mortgages Credit Cards Sectors Crypto Heatmap Financial News ABOUT Data Disclaimer Help Feedback Sitemap Licensing What's New About Our Ads Premium Plans Terms and Privacy Policy Privacy Dashboard U.S. markets closed US Europe Asia Rates Commodities Currencies Cryptocurrencies S&P 500 5,994.57 -45.96 (-0.76%) Dow 30 44,421.91 -122.75 (-0.28%) Nasdaq 19,391.96 -235.49 (-1.20%) Russell 2000 2,258.42 -29.28 (-1.28%) Crude Oil 72.30 -0.86 (-1.18%) Gold 2,849.40 -7.70 (-0.27%) My Portfolios My Portfolios Sign in to access your portfolio Sign in Top Gainers KC Kingsoft Cloud Holdings Limited 15.14 +2.57 (+20.45%) GDS GDS Holdings Limited 25.68 +3.97 (+18.29%) IDXX IDEXX Laboratories, Inc. 469.04 +46.99 (+11.13%) OKLO Oklo Inc. 45.93 +4.32 (+10.38%) PTCT PTC Therapeutics, Inc. 50.01 +4.13 (+9.00%) Top Losers BRZE Braze, Inc. 41.52 -4.46 (-9.70%) BOOT Boot Barn Holdings, Inc. 145.52 -15.33 (-9.53%) BBIO BridgeBio Pharma, Inc. 31.03 -3.18 (-9.30%) PII Polaris Inc. 43.70 -4.00 (-8.39%) JBLU JetBlue Airways Corporation 6.04 -0.54 (-8.21%) Most Active NVDA NVIDIA Corporation 116.66 -3.41 (-2.84%) RGTI Rigetti Computing, Inc. 13.47 +0.30 (+2.28%) F Ford Motor Company 9.89 -0.19 (-1.88%) TSLA Tesla, Inc. 383.68 -20.92 (-5.17%) LCID Lucid Group, Inc. 2.8000 +0.0400 (+1.45%) Trending Tickers PLTR Palantir Technologies Inc. 83.74 +1.25 (+1.52%) SMCI Super Micro Computer, Inc. 26.85 -1.67 (-5.86%) COST Costco Wholesale Corporation 1,005.83 +25.95 (+2.65%) TSLA Tesla, Inc. 383.68 -20.92 (-5.17%) OKLO Oklo Inc. 45.93 +4.32 (+10.38%) Top Economic Events Terms and Privacy Policy Privacy Dashboard Ad Terms Feedback\n",
"\n",
"==================================================\n",
"\n",
@@ -848,25 +751,21 @@
"\n",
"Here are the main news highlights from Yahoo Finance:\n",
"\n",
- "1. **Trump's Influence on TikTok**: Former President Donald Trump suggested that imposing tariffs on China could be a consequence if a deal regarding TikTok fails. [Source](https://finance.yahoo.com/)\n",
+ "1. **Trump Delays Tariffs on Canada and Mexico**: President Trump has agreed to delay the implementation of tariffs on Canada and Mexico, as both countries committed to sending more resources to their borders. However, tariffs on China are still set to take effect soon. [Source](https://finance.yahoo.com)\n",
"\n",
- "2. **AI-Driven Power Demand**: A report indicates that demand for power on the US grid is expected to rise by 38% due to advancements in AI technology. [Source](https://finance.yahoo.com/)\n",
+ "2. **Impact of Tariffs on Big Tech**: Analysts warn that Trump's 10% tariffs on China could significantly impact major technology companies. [Source](https://finance.yahoo.com)\n",
"\n",
- "3. **Bank Debt Offloading**: Banks are preparing to sell up to $3 billion in debt related to the acquisition of X, the social media platform. [Source](https://finance.yahoo.com/)\n",
+ "3. **Palantir's Revenue Forecast**: Palantir Technologies saw a surge in its stock price following an optimistic revenue forecast driven by strong demand for AI solutions. [Source](https://finance.yahoo.com)\n",
"\n",
- "4. **Market Reactions to Trump's Remarks**: Stock markets experienced a stall but capped a winning week following Trump's comments, which influenced market movements. [Source](https://finance.yahoo.com/)\n",
+ "4. **Market Reactions**: Futures for the Dow, S&P 500, and Nasdaq rose after the announcement of the tariff delays. [Source](https://finance.yahoo.com)\n",
"\n",
- "5. **Ray Dalio on Alternative Money**: Investor Ray Dalio emphasized the need for society to consider alternative forms of money. [Source](https://finance.yahoo.com/)\n",
+ "5. **Target Faces Lawsuit**: Target is being sued for allegedly defrauding shareholders regarding its diversity, equity, and inclusion (DEI) initiatives. [Source](https://finance.yahoo.com)\n",
"\n",
- "6. **Target's Diversity Initiatives**: Target is set to end its diversity, equity, and inclusion initiatives in response to an executive order from Trump. [Source](https://finance.yahoo.com/)\n",
+ "6. **Salesforce Job Cuts**: Salesforce is reportedly cutting 1,000 jobs while simultaneously hiring for roles related to AI. [Source](https://finance.yahoo.com)\n",
"\n",
- "7. **Apple's Stock Performance**: Apple is testing a critical technical level as it faces its worst stock performance start since 2008. [Source](https://finance.yahoo.com/)\n",
+ "7. **Market Forecast for February**: Historical trends suggest that February may be a rocky month for equities, despite a strong January performance. [Source](https://finance.yahoo.com)\n",
"\n",
- "8. **Purdue Pharma's Bankruptcy**: Purdue Pharma is seeking to advance a $7.4 billion settlement related to its opioid crisis bankruptcy. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "9. **Tesla's Model Y Refresh**: Tesla has launched a refreshed version of its Model Y in an effort to boost sales. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "For more details, you can visit [Yahoo Finance](https://finance.yahoo.com/).\n",
+ "For more detailed information, you can visit [Yahoo Finance](https://finance.yahoo.com).\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -877,25 +776,21 @@
"\n",
"Here are the main news highlights from Yahoo Finance:\n",
"\n",
- "1. **Trump's Influence on TikTok**: Former President Donald Trump suggested that imposing tariffs on China could be a consequence if a deal regarding TikTok fails. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "2. **AI-Driven Power Demand**: A report indicates that demand for power on the US grid is expected to rise by 38% due to advancements in AI technology. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "3. **Bank Debt Offloading**: Banks are preparing to sell up to $3 billion in debt related to the acquisition of X, the social media platform. [Source](https://finance.yahoo.com/)\n",
+ "1. **Trump Delays Tariffs on Canada and Mexico**: President Trump has agreed to delay the implementation of tariffs on Canada and Mexico, as both countries committed to sending more resources to their borders. However, tariffs on China are still set to take effect soon. [Source](https://finance.yahoo.com)\n",
"\n",
- "4. **Market Reactions to Trump's Remarks**: Stock markets experienced a stall but capped a winning week following Trump's comments, which influenced market movements. [Source](https://finance.yahoo.com/)\n",
+ "2. **Impact of Tariffs on Big Tech**: Analysts warn that Trump's 10% tariffs on China could significantly impact major technology companies. [Source](https://finance.yahoo.com)\n",
"\n",
- "5. **Ray Dalio on Alternative Money**: Investor Ray Dalio emphasized the need for society to consider alternative forms of money. [Source](https://finance.yahoo.com/)\n",
+ "3. **Palantir's Revenue Forecast**: Palantir Technologies saw a surge in its stock price following an optimistic revenue forecast driven by strong demand for AI solutions. [Source](https://finance.yahoo.com)\n",
"\n",
- "6. **Target's Diversity Initiatives**: Target is set to end its diversity, equity, and inclusion initiatives in response to an executive order from Trump. [Source](https://finance.yahoo.com/)\n",
+ "4. **Market Reactions**: Futures for the Dow, S&P 500, and Nasdaq rose after the announcement of the tariff delays. [Source](https://finance.yahoo.com)\n",
"\n",
- "7. **Apple's Stock Performance**: Apple is testing a critical technical level as it faces its worst stock performance start since 2008. [Source](https://finance.yahoo.com/)\n",
+ "5. **Target Faces Lawsuit**: Target is being sued for allegedly defrauding shareholders regarding its diversity, equity, and inclusion (DEI) initiatives. [Source](https://finance.yahoo.com)\n",
"\n",
- "8. **Purdue Pharma's Bankruptcy**: Purdue Pharma is seeking to advance a $7.4 billion settlement related to its opioid crisis bankruptcy. [Source](https://finance.yahoo.com/)\n",
+ "6. **Salesforce Job Cuts**: Salesforce is reportedly cutting 1,000 jobs while simultaneously hiring for roles related to AI. [Source](https://finance.yahoo.com)\n",
"\n",
- "9. **Tesla's Model Y Refresh**: Tesla has launched a refreshed version of its Model Y in an effort to boost sales. [Source](https://finance.yahoo.com/)\n",
+ "7. **Market Forecast for February**: Historical trends suggest that February may be a rocky month for equities, despite a strong January performance. [Source](https://finance.yahoo.com)\n",
"\n",
- "For more details, you can visit [Yahoo Finance](https://finance.yahoo.com/).\n",
+ "For more detailed information, you can visit [Yahoo Finance](https://finance.yahoo.com).\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -916,9 +811,13 @@
},
{
"cell_type": "code",
- "execution_count": 43,
+ "execution_count": 17,
"metadata": {
- "id": "z6fQqA6VLT0Y"
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "z6fQqA6VLT0Y",
+ "outputId": "0ec2325e-5084-4ce7-bf4b-62b14d71e309"
},
"outputs": [
{
@@ -927,25 +826,21 @@
"text": [
"Here are the main news highlights from Yahoo Finance:\n",
"\n",
- "1. **Trump's Influence on TikTok**: Former President Donald Trump suggested that imposing tariffs on China could be a consequence if a deal regarding TikTok fails. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "2. **AI-Driven Power Demand**: A report indicates that demand for power on the US grid is expected to rise by 38% due to advancements in AI technology. [Source](https://finance.yahoo.com/)\n",
+ "1. **Trump Delays Tariffs on Canada and Mexico**: President Trump has agreed to delay the implementation of tariffs on Canada and Mexico, as both countries committed to sending more resources to their borders. However, tariffs on China are still set to take effect soon. [Source](https://finance.yahoo.com)\n",
"\n",
- "3. **Bank Debt Offloading**: Banks are preparing to sell up to $3 billion in debt related to the acquisition of X, the social media platform. [Source](https://finance.yahoo.com/)\n",
+ "2. **Impact of Tariffs on Big Tech**: Analysts warn that Trump's 10% tariffs on China could significantly impact major technology companies. [Source](https://finance.yahoo.com)\n",
"\n",
- "4. **Market Reactions to Trump's Remarks**: Stock markets experienced a stall but capped a winning week following Trump's comments, which influenced market movements. [Source](https://finance.yahoo.com/)\n",
+ "3. **Palantir's Revenue Forecast**: Palantir Technologies saw a surge in its stock price following an optimistic revenue forecast driven by strong demand for AI solutions. [Source](https://finance.yahoo.com)\n",
"\n",
- "5. **Ray Dalio on Alternative Money**: Investor Ray Dalio emphasized the need for society to consider alternative forms of money. [Source](https://finance.yahoo.com/)\n",
+ "4. **Market Reactions**: Futures for the Dow, S&P 500, and Nasdaq rose after the announcement of the tariff delays. [Source](https://finance.yahoo.com)\n",
"\n",
- "6. **Target's Diversity Initiatives**: Target is set to end its diversity, equity, and inclusion initiatives in response to an executive order from Trump. [Source](https://finance.yahoo.com/)\n",
+ "5. **Target Faces Lawsuit**: Target is being sued for allegedly defrauding shareholders regarding its diversity, equity, and inclusion (DEI) initiatives. [Source](https://finance.yahoo.com)\n",
"\n",
- "7. **Apple's Stock Performance**: Apple is testing a critical technical level as it faces its worst stock performance start since 2008. [Source](https://finance.yahoo.com/)\n",
+ "6. **Salesforce Job Cuts**: Salesforce is reportedly cutting 1,000 jobs while simultaneously hiring for roles related to AI. [Source](https://finance.yahoo.com)\n",
"\n",
- "8. **Purdue Pharma's Bankruptcy**: Purdue Pharma is seeking to advance a $7.4 billion settlement related to its opioid crisis bankruptcy. [Source](https://finance.yahoo.com/)\n",
+ "7. **Market Forecast for February**: Historical trends suggest that February may be a rocky month for equities, despite a strong January performance. [Source](https://finance.yahoo.com)\n",
"\n",
- "9. **Tesla's Model Y Refresh**: Tesla has launched a refreshed version of its Model Y in an effort to boost sales. [Source](https://finance.yahoo.com/)\n",
- "\n",
- "For more details, you can visit [Yahoo Finance](https://finance.yahoo.com/).\n"
+ "For more detailed information, you can visit [Yahoo Finance](https://finance.yahoo.com).\n"
]
}
],
@@ -966,7 +861,7 @@
},
{
"cell_type": "code",
- "execution_count": 44,
+ "execution_count": 18,
"metadata": {
"id": "NIWgweiPLg0j"
},
@@ -1073,7 +968,7 @@
},
{
"cell_type": "code",
- "execution_count": 45,
+ "execution_count": 19,
"metadata": {
"id": "EVdwBalILuGx"
},
@@ -1123,9 +1018,14 @@
},
{
"cell_type": "code",
- "execution_count": 46,
+ "execution_count": 20,
"metadata": {
- "id": "oax44hlALyP_"
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 538
+ },
+ "id": "oax44hlALyP_",
+ "outputId": "21945734-3e9e-4486-a06a-1cf3f529dfe6"
},
"outputs": [
{
@@ -1155,9 +1055,13 @@
},
{
"cell_type": "code",
- "execution_count": 47,
+ "execution_count": 21,
"metadata": {
- "id": "uL-dS1MhL1Ak"
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "uL-dS1MhL1Ak",
+ "outputId": "1b9a0060-c5c2-4b06-9274-0aa7a7fbbe12"
},
"outputs": [
{
@@ -1177,82 +1081,68 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"Tool Calls:\n",
- " write_document (call_o7N3RtIZk6sYIyGbr1tahvOi)\n",
- " Call ID: call_o7N3RtIZk6sYIyGbr1tahvOi\n",
+ " write_document (call_Gzcvkmtplu3XA7U4O17i5u5F)\n",
+ " Call ID: call_Gzcvkmtplu3XA7U4O17i5u5F\n",
" Args:\n",
" content: # In-Depth Analysis of the Transformer Architecture\n",
"\n",
"## Table of Contents\n",
"1. Introduction \n",
"2. Background \n",
- " 2.1 Sequence-to-Sequence Models \n",
- " 2.2 Attention Mechanism \n",
- "3. Transformer Architecture \n",
- " 3.1 Overview \n",
- " 3.2 Encoder-Decoder Structure \n",
- " 3.3 Multi-Head Attention \n",
- " 3.4 Positional Encoding \n",
+ "3. Transformer's Architecture \n",
+ " 3.1. Multi-Head Attention \n",
+ " 3.2. Position-wise Feed-Forward Networks \n",
+ " 3.3. Positional Encoding \n",
"4. Training the Transformer \n",
- " 4.1 Loss Functions \n",
- " 4.2 Optimization Techniques \n",
- "5. Applications of Transformers \n",
- " 5.1 Natural Language Processing \n",
- " 5.2 Vision Transformers \n",
- "6. Advantages and Limitations \n",
- " 6.1 Advantages \n",
- " 6.2 Limitations \n",
- "7. Future Directions \n",
- "8. Conclusion \n",
+ " 4.1. Loss Functions \n",
+ " 4.2. Optimization Techniques \n",
+ "5. Applications of Transformer Architecture \n",
+ " 5.1. Natural Language Processing \n",
+ " 5.2. Computer Vision \n",
+ " 5.3. Speech Recognition \n",
+ "6. Conclusion \n",
"\n",
"## 1. Introduction \n",
- "The Transformer architecture has revolutionized the field of machine learning, particularly in natural language processing (NLP). Introduced by Vaswani et al. in 2017, the Transformer model leverages self-attention mechanisms to effectively process sequential data. Unlike recurrent neural networks (RNNs) that process data sequentially, Transformers allow for parallelization, significantly improving training times and efficiency. This paper provides an in-depth analysis of the Transformer architecture, detailing its components, training methodologies, applications, and future potential. The goal is to provide a comprehensive understanding of how Transformers function and their impact on various domains.\n",
+ "The Transformer architecture has revolutionized the field of machine learning, particularly in natural language processing (NLP). Introduced in the paper \"Attention is All You Need\" by Vaswani et al., the Transformer model is unique in its reliance on self-attention mechanisms instead of recurrent or convolutional layers. This allows for better parallelization during training and greater capability to handle long-range dependencies in data. The architecture has paved the way for state-of-the-art models like BERT, GPT, and T5. This paper aims to provide an in-depth analysis of the architecture, its components, and its various applications.\n",
"\n",
"## 2. Background \n",
- "### 2.1 Sequence-to-Sequence Models \n",
- "Sequence-to-sequence (seq2seq) models are designed to transform an input sequence into an output sequence. Traditionally, these models utilized RNNs to encode the input and decode the output. While effective, RNNs suffer from limitations such as difficulty in capturing long-range dependencies and slower training due to sequential processing. The introduction of the Transformer model addresses these challenges by eliminating recurrence and employing self-attention mechanisms. This enables the model to consider the entire input sequence simultaneously, allowing for better performance on tasks like translation and summarization.\n",
- "\n",
- "### 2.2 Attention Mechanism \n",
- "The attention mechanism is a critical component of the Transformer architecture, enabling the model to focus on specific parts of the input sequence. It assigns different weights to different words in the input based on their relevance to the current decoding step. This allows the model to capture contextual relationships more effectively than traditional methods. Self-attention, a variant of the attention mechanism, calculates attention scores by comparing all elements of the input sequence against each other. This approach facilitates the modeling of dependencies across long distances, making it particularly powerful for language tasks.\n",
+ "Before the advent of the Transformer model, traditional neural networks utilized recurrent architectures to process sequential data. Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks were the go-to choices for tasks involving sequences due to their ability to maintain hidden states across time steps. However, these methods faced challenges such as long training times and difficulties in handling long-range dependencies. With the introduction of self-attention mechanisms in the Transformer, these challenges became less pronounced, leading to significant improvements in performance and efficiency. Understanding the limitations of previous architectures sets the stage for appreciating the innovations brought forth by the Transformer.\n",
"\n",
- "## 3. Transformer Architecture \n",
- "### 3.1 Overview \n",
- "The Transformer architecture consists of an encoder-decoder structure, with each part containing multiple layers of self-attention and feed-forward neural networks. The encoder processes the input sequence and generates a set of continuous representations, which are then passed to the decoder to produce the output sequence. Each encoder layer includes a self-attention mechanism followed by a feed-forward network, while the decoder layers include additional mechanisms for cross-attention, allowing them to attend to the encoder's output. This design results in a highly parallelizable architecture that can handle large datasets efficiently.\n",
+ "## 3. Transformer's Architecture \n",
+ "The Transformer architecture consists of an encoder-decoder structure, where both components are built from identical layers. Each encoder layer contains two main sub-layers: a multi-head self-attention mechanism and a position-wise feed-forward network. The decoder, similarly, has these two sub-layers but includes an additional multi-head attention over the encoder output, allowing it to attend to the input sequence while generating the output. Importantly, residual connections and layer normalization are used around each sub-layer to facilitate training stability and speed up convergence. The unique architecture enables highly efficient parallelization, a crucial factor in its rapid adoption in large-scale applications.\n",
"\n",
- "### 3.2 Encoder-Decoder Structure \n",
- "The encoder-decoder structure of the Transformer is fundamental to its operation. The encoder receives the input sequence and transforms it into a continuous representation through multiple layers of self-attention and feed-forward networks. Each encoder layer refines the representation, capturing both local and global dependencies. The decoder, on the other hand, takes this representation and generates the output sequence step-by-step, using masked self-attention to prevent future information from influencing the current prediction. This structured approach allows for effective sequence generation and understanding in various tasks.\n",
+ "### 3.1. Multi-Head Attention \n",
+ "The multi-head attention mechanism is one of the core innovations of the Transformer architecture. It allows the model to jointly attend to information from different representation subspaces at different positions. By projecting the input into multiple heads independently, the model can learn a range of attention patterns and selectively focus on relevant parts of the input. Each head computes attention scores using queries, keys, and values, and the results are concatenated and linearly transformed into the output. This mechanism enhances the model’s ability to capture relationships and dependencies, significantly improving performance on various tasks.\n",
"\n",
- "### 3.3 Multi-Head Attention \n",
- "Multi-head attention is a vital feature of the Transformer architecture, enhancing its ability to focus on different parts of the input simultaneously. By using multiple attention heads, the model can capture various aspects of relationships within the input sequence. Each head operates independently and learns different representations, which are then concatenated and linearly transformed to produce the final output. This mechanism allows the Transformer to aggregate information from multiple perspectives, resulting in richer contextual representations. As a result, multi-head attention significantly improves the model's performance across a wide range of tasks.\n",
+ "### 3.2. Position-wise Feed-Forward Networks \n",
+ "Position-wise feed-forward networks (FFNs) are essential components of the Transformer model that enhance its representational capacity. Each position in the sequence is processed independently through a feed-forward neural network where the same weights are applied across all positions. Usually, this involves a two-layer network with a ReLU activation function, allowing the model to capture intricate patterns in the input data. The use of FFNs contributes to the overall expressiveness of the model, facilitating complex transformations of the input representations at each layer. This enables the Transformer to learn high-level abstractions in the data, improving its performance on tasks such as translation and summarization.\n",
"\n",
- "### 3.4 Positional Encoding \n",
- "Since the Transformer architecture does not inherently understand the order of sequences, positional encoding is introduced to provide information about the position of each word in the sequence. This encoding is added to the input embeddings, allowing the model to distinguish between different positions. Various strategies exist for positional encoding, including sine and cosine functions, which provide a continuous representation of position. By incorporating positional information, the Transformer can effectively model the sequential nature of language, improving its performance on tasks that require understanding of word order.\n",
+ "### 3.3. Positional Encoding \n",
+ "Positional encoding is a critical aspect of the Transformer architecture, compensating for the lack of inherent sequential order in the input data. Since the self-attention mechanism treats all input tokens equally, positional encodings are added to the input embeddings to provide information about token positions. This encoding can be learned or, more commonly, computed using sinusoidal functions. The sinusoidal approach allows the model to leverage the periodic nature of the encoding, enabling effective learning of relative positions among tokens. By integrating positional encodings, the Transformer retains the capacity to understand order and sequence, crucial for tasks involving sequential data.\n",
"\n",
"## 4. Training the Transformer \n",
- "### 4.1 Loss Functions \n",
- "Training the Transformer model involves optimizing a loss function that measures the difference between the predicted outputs and the actual target outputs. Commonly used loss functions for this purpose include cross-entropy loss, which is suitable for classification tasks. The loss is calculated for each predicted token in the output sequence, and the model is trained to minimize this loss using gradient descent techniques. The choice of loss function significantly impacts the model's learning dynamics and final performance, making it a crucial aspect of the training process. Additionally, regularization techniques may be applied to prevent overfitting and enhance generalization.\n",
+ "Training the Transformer architecture presents unique challenges and considerations. Unlike traditional architectures, the Transformer employs parallelization, allowing for faster training times. Success during training often relies on efficient loss functions that guide the learning process, with the commonly used cross-entropy loss being particularly effective for NLP tasks. Additionally, optimization techniques like learning rate schedules and transformers-specific optimizers such as Adam have been designed to improve convergence and handling of variances across multiple heads. Moreover, techniques such as dropout and early stopping help prevent overfitting and improve generalization during training.\n",
"\n",
- "### 4.2 Optimization Techniques \n",
- "Optimizing the Transformer architecture requires advanced techniques to handle its complexity and large parameter space. One popular optimization algorithm is Adam, which combines the benefits of adaptive learning rates and momentum. Learning rate scheduling is also commonly employed to adjust the learning rate dynamically during training, allowing for faster convergence. Techniques such as gradient clipping can be applied to prevent exploding gradients, which can hinder the training process. Overall, effective optimization strategies play a vital role in achieving optimal performance for Transformer models across various tasks.\n",
+ "### 4.1. Loss Functions \n",
+ "The choice of loss function is paramount to the success of training the Transformer architecture. Cross-entropy loss is the standard choice for tasks involving classification and sequence generation, as it measures the performance of a classification model whose output is a probability value between 0 and 1. In the context of NLP, this often entails measuring how well the model predicts the next word in a sentence given the previous context. Recently, alternatives such as label smoothing have also been introduced to enhance model performance by mitigating overconfidence in predictions. The selection and implementation of loss functions directly influence model performance, shaping how it learns from data throughout training.\n",
"\n",
- "## 5. Applications of Transformers \n",
- "### 5.1 Natural Language Processing \n",
- "Transformers have become the backbone of many state-of-the-art NLP models, including BERT, GPT, and T5. Their ability to handle long-range dependencies and parallelize training has led to significant advancements in tasks such as machine translation, sentiment analysis, and question answering. These models leverage the self-attention mechanism to understand context better and generate more coherent and contextually relevant outputs. Additionally, Transformers have been successfully fine-tuned for specific tasks, allowing them to achieve remarkable performance with relatively small amounts of task-specific data. As a result, Transformers have largely replaced traditional RNN-based architectures in the NLP domain.\n",
+ "### 4.2. Optimization Techniques \n",
+ "Optimization techniques used in training the Transformer architecture are pivotal to achieving high performance efficiently. Adam, a popular gradient-based optimization algorithm, has shown impressive results due to its adaptive learning rate capabilities. Additionally, techniques like learning rate warmup have become commonplace to stabilize training rates in the early stages. Regularization methods such as dropout and layer normalization further assist in managing overfitting and promote better convergence properties. The correct deployment of these optimization strategies is essential to harness the full potential of the Transformer model, especially when training on large datasets.\n",
"\n",
- "### 5.2 Vision Transformers \n",
- "Recently, the Transformer architecture has also been adapted for computer vision tasks, leading to the development of Vision Transformers (ViTs). These models apply the principles of the Transformer architecture to image data, treating image patches as sequences to be processed. By leveraging self-attention, ViTs can capture global representations of images, improving performance on tasks such as image classification and object detection. The success of Vision Transformers highlights the versatility of the architecture, demonstrating its applicability beyond text-based tasks. As a result, Transformers are increasingly being explored in the field of computer vision, opening new avenues for research and application.\n",
+ "## 5. Applications of Transformer Architecture \n",
+ "The Transformer architecture's flexibility and power have led to its deployment across a wide array of applications. In natural language processing, models like BERT and GPT leverage its capabilities for tasks such as sentiment analysis, text generation, and translation. Beyond NLP, the architecture's ability to capture important features has been successfully applied in computer vision, where models like Vision Transformers (ViTs) utilize the architecture to achieve state-of-the-art performance. Further, the architecture's advantages have extended into speech recognition, enhancing models designed to translate spoken language into written text. The breadth of applications underscores the transformative impact of the Transformer architecture across various fields.\n",
"\n",
- "## 6. Advantages and Limitations \n",
- "### 6.1 Advantages \n",
- "The Transformer architecture offers several advantages over traditional models, particularly in terms of efficiency and performance. Its ability to process data in parallel significantly speeds up training times, making it suitable for large-scale datasets. The self-attention mechanism enables the model to capture long-range dependencies effectively, leading to improved performance across various tasks. Additionally, the modular structure of Transformers allows for easy scalability and adaptability to different applications. These strengths have contributed to the widespread adoption of Transformers in machine learning research and industry applications.\n",
+ "### 5.1. Natural Language Processing \n",
+ "Natural language processing (NLP) has experienced a paradigm shift with the introduction of the Transformer architecture. Tasks such as machine translation, summarization, and sentiment classification have benefitted significantly from the self-attention mechanism and the model’s ability to process long-range dependencies. Models like BERT have redefined state-of-the-art performance benchmarks, tackling various NLP tasks efficiently by leveraging unsupervised learning on vast text corpora. Additionally, the flexibility of the Transformer architecture allows seamless adaptation to various NLP tasks, making it the foundation for many subsequent models and techniques. The advancements achieved in NLP as a result of the Transformer architecture continue to push the boundaries of what is possible in understanding and generating human language.\n",
"\n",
- "### 6.2 Limitations \n",
- "Despite their many advantages, Transformers also have certain limitations. Their large parameter size can lead to increased computational costs and memory requirements, making them less accessible for smaller organizations or applications with limited resources. Additionally, Transformers may struggle with tasks that require extensive fine-tuning or adaptation to specific domains, as their generalized training can sometimes lead to suboptimal performance. Furthermore, the lack of inherent inductive biases present in traditional convolutional or recurrent models can make Transformers less effective for certain structured data types. Addressing these limitations remains an active area of research.\n",
+ "### 5.2. Computer Vision \n",
+ "The Transformer architecture has also made significant inroads into the field of computer vision. Traditional convolutional neural networks (CNNs) dominated image classification tasks, but Vision Transformers (ViTs) have emerged as formidable competitors. By treating image patches as sequences, ViTs apply the Transformer’s self-attention mechanism to capture spatial relationships and patterns across the entire image. This approach has led to state-of-the-art results on image classification benchmarks, demonstrating the model's ability to generalize well from limited labeled data. As research in vision transformers continues to evolve, further innovations and enhancements promise to reshape the landscape of image analysis and understanding.\n",
"\n",
- "## 7. Future Directions \n",
- "The future of the Transformer architecture holds great promise, with ongoing research focusing on improving its efficiency and applicability. One area of interest is the development of more compact and efficient variants of Transformers, such as Sparse Transformers and Long-Short Transformers, which aim to reduce computational costs while maintaining performance. Additionally, researchers are exploring ways to integrate Transformers with other architectures, such as CNNs or RNNs, to leverage their strengths in hybrid models. Furthermore, the potential for Transformers in areas such as reinforcement learning and multimodal tasks continues to be an exciting avenue for exploration. As the field evolves, Transformers are likely to remain a central focus in advancing machine learning technologies.\n",
+ "### 5.3. Speech Recognition \n",
+ "Speech recognition technologies have similarly benefited from the application of Transformer architectures. The ability of Transformers to effectively process sequential data has rendered them invaluable in translating spoken language into written text. Hybrid models that combine traditional temporal convolutional networks with Transformer components have shown marked improvements in accuracy. Additionally, the self-attention mechanism allows these models to focus on relevant segments of audio input, crucial for decoding nuanced speech patterns. As attention-based mechanisms gain traction in this domain, the performance of speech recognition systems is expected to improve dramatically, expanding their applicability.\n",
"\n",
- "## 8. Conclusion \n",
- "In summary, the Transformer architecture has transformed the landscape of machine learning, particularly in natural language processing and computer vision. Its innovative use of self-attention mechanisms and encoder-decoder structures has enabled significant advancements in various tasks, paving the way for state-of-the-art models. While the architecture presents certain challenges, its advantages and versatility make it a powerful tool in the machine learning toolkit. As research continues to advance, the future of Transformers looks bright, with potential applications extending beyond current domains. This paper has provided a comprehensive overview of the Transformer architecture, highlighting its importance in modern machine learning.\n",
+ "## 6. Conclusion \n",
+ "In conclusion, the Transformer architecture represents a groundbreaking leap in deep learning paradigms, providing a robust and efficient framework for handling various tasks across multiple domains. Its self-attention mechanism, combined with components conducive to parallelization, allows for significant performance improvements over previous architectures. As research continues to explore the potential of Transformers, new architectures and enhancements build upon the principles established by the original model. The wide array of applications, ranging from natural language processing to computer vision and speech recognition, underscores its versatility and significance in the contemporary machine learning landscape. The future of deep learning remains bright, with the Transformer architecture at the forefront of innovation and development.\n",
" file_name: transformer_architecture_analysis.md\n",
"==================================================\n",
"\n",
@@ -1270,7 +1160,7 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "The in-depth analysis of the Transformer architecture has been completed and saved as \"transformer_architecture_analysis.md\". If you need further modifications or additional sections, feel free to ask!\n",
+ "The in-depth analysis of the Transformer architecture has been successfully compiled and saved in a document titled \"transformer_architecture_analysis.md\". The document includes a comprehensive table of contents and elaborates on various aspects of the Transformer architecture as requested. If you need any further modifications or additional sections, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -1279,570 +1169,98 @@
"================================\u001b[1m Human Message \u001b[0m=================================\n",
"Name: DocWriter\n",
"\n",
- "The in-depth analysis of the Transformer architecture has been completed and saved as \"transformer_architecture_analysis.md\". If you need further modifications or additional sections, feel free to ask!\n",
+ "The in-depth analysis of the Transformer architecture has been successfully compiled and saved in a document titled \"transformer_architecture_analysis.md\". The document includes a comprehensive table of contents and elaborates on various aspects of the Transformer architecture as requested. If you need any further modifications or additional sections, feel free to ask!\n",
"==================================================\n",
"\n",
"==================================================\n",
"🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"\u001b[1;32mnext\u001b[0m:\n",
- "NoteTaker\n",
- "==================================================\n",
+ "ChartGenerator\n",
+ "==================================================\n"
+ ]
+ },
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "WARNING:langchain_experimental.utilities.python:Python REPL can execute arbitrary code. Use with caution.\n"
+ ]
+ },
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"Tool Calls:\n",
- " create_outline (call_DBmJAAoZMUQoOGlgvDhmYbbm)\n",
- " Call ID: call_DBmJAAoZMUQoOGlgvDhmYbbm\n",
+ " Python_REPL (call_efCveTCHpS0U6p3XhW7X5Egy)\n",
+ " Call ID: call_efCveTCHpS0U6p3XhW7X5Egy\n",
" Args:\n",
- " points: ['Introduction to Transformer Architecture', 'Key Components of Transformer', 'Self-Attention Mechanism', 'Positional Encoding', 'Multi-Head Attention', 'Feed Forward Neural Networks', 'Layer Normalization and Residual Connections', 'Training Techniques and Optimizations', 'Applications of Transformer Architecture', 'Conclusion']\n",
- " file_name: transformer_architecture_analysis.md\n",
+ " query: import datetime\n",
+ "\n",
+ "datetime.datetime.now().isoformat()\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: create_outline\n",
+ "Name: Python_REPL\n",
"\n",
- "Outline saved to transformer_architecture_analysis.md\n",
- "==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "Tool Calls:\n",
- " read_document (call_uqxDLfkQy5x3ZuVQkrDgmLus)\n",
- " Call ID: call_uqxDLfkQy5x3ZuVQkrDgmLus\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 0\n",
- " end: 20\n",
- " read_document (call_KXWVlmvtpuobQNoDtvb4gpxx)\n",
- " Call ID: call_KXWVlmvtpuobQNoDtvb4gpxx\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 20\n",
- " end: 40\n",
- " read_document (call_EEVbcNq3TjyKpJiJwSBXIiUN)\n",
- " Call ID: call_EEVbcNq3TjyKpJiJwSBXIiUN\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 40\n",
- " end: 60\n",
- " read_document (call_Y6NskEoYFBpO7o33TvI8BXOe)\n",
- " Call ID: call_Y6NskEoYFBpO7o33TvI8BXOe\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 60\n",
- " end: 80\n",
- " read_document (call_VAEEQ2U9S9NqOWsb2BRzpEf1)\n",
- " Call ID: call_VAEEQ2U9S9NqOWsb2BRzpEf1\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 80\n",
- " end: 100\n",
- "==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
+ "The analysis of the Transformer architecture has been completed successfully. Here is the table of contents included in the document:\n",
"\n",
+ "### Table of Contents\n",
"1. Introduction to Transformer Architecture\n",
- "\n",
"2. Key Components of Transformer\n",
+ " - 2.1 Multi-Head Self-Attention\n",
+ " - 2.2 Position-wise Feed-Forward Networks\n",
+ " - 2.3 Positional Encoding\n",
+ "3. The Encoder-Decoder Structure\n",
+ "4. Training Strategies for Transformers\n",
+ "5. Applications of Transformer Architecture\n",
+ "6. Advantages and Limitations\n",
+ "7. Conclusion\n",
"\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
+ "If you need to review the contents of any specific section or make further modifications, please let me know!\n",
+ "==================================================\n",
"\n",
- "10. Conclusion\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mChartGenerator\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: ChartGenerator\n",
"\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
+ "The analysis of the Transformer architecture has been completed successfully. Here is the table of contents included in the document:\n",
"\n",
+ "### Table of Contents\n",
"1. Introduction to Transformer Architecture\n",
- "\n",
"2. Key Components of Transformer\n",
+ " - 2.1 Multi-Head Self-Attention\n",
+ " - 2.2 Position-wise Feed-Forward Networks\n",
+ " - 2.3 Positional Encoding\n",
+ "3. The Encoder-Decoder Structure\n",
+ "4. Training Strategies for Transformers\n",
+ "5. Applications of Transformer Architecture\n",
+ "6. Advantages and Limitations\n",
+ "7. Conclusion\n",
"\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
+ "If you need to review the contents of any specific section or make further modifications, please let me know!\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "Tool Calls:\n",
- " create_outline (call_3r6OKoqIN4EAp3CqEa3vqymy)\n",
- " Call ID: call_3r6OKoqIN4EAp3CqEa3vqymy\n",
- " Args:\n",
- " points: ['Introduction to Transformer Architecture', 'Key Components of Transformer', 'Self-Attention Mechanism', 'Positional Encoding', 'Multi-Head Attention', 'Feed Forward Neural Networks', 'Layer Normalization and Residual Connections', 'Training Techniques and Optimizations', 'Applications of Transformer Architecture', 'Conclusion']\n",
- " file_name: transformer_architecture_analysis.md\n",
- " read_document (call_wea1e6cro0FYmeZ3FOydQ8mV)\n",
- " Call ID: call_wea1e6cro0FYmeZ3FOydQ8mV\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 0\n",
- " end: 20\n",
- " read_document (call_olQWRblHGKkCEnxBnBTaHKUe)\n",
- " Call ID: call_olQWRblHGKkCEnxBnBTaHKUe\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 20\n",
- " end: 40\n",
- " read_document (call_GNwrLowNkMHuEgr2tW8lqsmg)\n",
- " Call ID: call_GNwrLowNkMHuEgr2tW8lqsmg\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 40\n",
- " end: 60\n",
- " read_document (call_Qq2bR8cZjnxDWBjVXRTfMbmS)\n",
- " Call ID: call_Qq2bR8cZjnxDWBjVXRTfMbmS\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 60\n",
- " end: 80\n",
- " read_document (call_0ZtxqE3dhsqrxr3OFGNcSTb2)\n",
- " Call ID: call_0ZtxqE3dhsqrxr3OFGNcSTb2\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- " start: 80\n",
- " end: 100\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: create_outline\n",
- "\n",
- "Outline saved to transformer_architecture_analysis.md\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mNoteTaker\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "\n",
- "Here's an in-depth analysis of the Transformer architecture, structured with a table of contents and elaborated sections:\n",
- "\n",
- "# Table of Contents\n",
- "\n",
- "1. **Introduction to Transformer Architecture**\n",
- "2. **Key Components of Transformer**\n",
- "3. **Self-Attention Mechanism**\n",
- "4. **Positional Encoding**\n",
- "5. **Multi-Head Attention**\n",
- "6. **Feed Forward Neural Networks**\n",
- "7. **Layer Normalization and Residual Connections**\n",
- "8. **Training Techniques and Optimizations**\n",
- "9. **Applications of Transformer Architecture**\n",
- "10. **Conclusion**\n",
- "\n",
- "---\n",
- "\n",
- "## 1. Introduction to Transformer Architecture\n",
- "The Transformer architecture was introduced in the paper \"Attention is All You Need\" by Vaswani et al. in 2017. It revolutionized natural language processing by eliminating the need for recurrent neural networks (RNNs). Instead of processing input sequentially, Transformers allow for parallelization, leading to faster training times. The core innovation of the Transformer is the self-attention mechanism, which enables the model to weigh the importance of different words in a sentence relative to one another. This allows Transformers to capture long-range dependencies in data more effectively than previous models.\n",
- "\n",
- "## 2. Key Components of Transformer\n",
- "The Transformer architecture consists of two main parts: the encoder and the decoder. The encoder processes the input data and generates a continuous representation, while the decoder produces the output sequence based on the encoder’s output. Each encoder and decoder is made up of a stack of identical layers, with the encoder typically having six layers and the decoder having a similar structure. The key components of each layer include multi-head self-attention, feed-forward neural networks, layer normalization, and residual connections. This layered approach enables the model to learn complex patterns and relationships in the data.\n",
- "\n",
- "## 3. Self-Attention Mechanism\n",
- "Self-attention is a pivotal feature of the Transformer architecture that allows the model to evaluate the relevance of different words in a sentence to each other. In this mechanism, each word in the input sequence is represented as a query, key, and value. The attention score is calculated by taking the dot product of the query with all keys, followed by a softmax operation to obtain the weights. These weights are then used to compute a weighted sum of the values, producing a context-aware representation of each word. This process allows the model to focus on relevant parts of the input when making predictions.\n",
- "\n",
- "## 4. Positional Encoding\n",
- "Since Transformers do not inherently understand the order of the input sequence due to their parallel processing nature, positional encoding is introduced to provide information about the relative position of words in a sentence. This encoding is added to the input embeddings to ensure that the model can differentiate between similar words in different contexts. Positional encodings can be generated using sinusoidal functions, where each dimension of the encoding corresponds to a different frequency. This method helps the model leverage the sequential information needed for tasks such as translation and summarization.\n",
- "\n",
- "## 5. Multi-Head Attention\n",
- "Multi-head attention enhances the self-attention mechanism by allowing the model to jointly attend to information from different representation subspaces at different positions. Instead of having a single set of attention weights, multi-head attention splits the input into multiple heads, each learning different aspects of the input. This is achieved by linearly transforming the input into multiple queries, keys, and values, which are then processed in parallel. The outputs from each head are concatenated and linearly transformed to produce the final output. This approach improves the model's ability to capture various relationships in the data.\n",
- "\n",
- "## 6. Feed Forward Neural Networks\n",
- "In addition to self-attention layers, each encoder and decoder layer contains a position-wise feed-forward neural network (FFN). This network consists of two linear transformations with a ReLU activation in between. The FFN processes each position independently and identically, allowing the model to learn complex representations at each layer. The use of non-linear activations in the FFN enables the model to capture more intricate patterns in the data. The dimensionality of the FFN is typically much larger than the input embeddings, allowing for richer feature extraction.\n",
- "\n",
- "## 7. Layer Normalization and Residual Connections\n",
- "Layer normalization is applied to stabilize the learning process and improve convergence during training. It normalizes the inputs across the features, helping to mitigate issues related to internal covariate shift. Residual connections are also employed, allowing gradients to flow through the network more easily, which is particularly beneficial in very deep networks. By adding the input of each sub-layer to its output, residual connections help prevent vanishing gradients and enable the model to learn more effectively. This combination of techniques enhances the overall performance and stability of the Transformer.\n",
- "\n",
- "## 8. Training Techniques and Optimizations\n",
- "Training Transformers involves several techniques to optimize performance. One common approach is to use the Adam optimizer with a warm-up learning rate schedule, which gradually increases the learning rate before decaying it. Additionally, techniques such as dropout and label smoothing are implemented to prevent overfitting and improve generalization. Large-scale pre-training on vast amounts of data followed by fine-tuning on specific tasks has proven effective for Transformer models. These optimization strategies contribute significantly to the state-of-the-art performance of Transformers in various applications.\n",
- "\n",
- "## 9. Applications of Transformer Architecture\n",
- "Transformers have been successfully applied across numerous domains beyond natural language processing, including computer vision, speech recognition, and even music generation. In NLP, they power models like BERT, GPT, and T5, excelling in tasks such as machine translation, text summarization, and sentiment analysis. In computer vision, Vision Transformers (ViTs) have demonstrated competitive performance against convolutional neural networks (CNNs). The flexibility and scalability of the Transformer architecture make it a preferred choice for many machine learning tasks, leading to ongoing research and development in this area.\n",
- "\n",
- "## 10. Conclusion\n",
- "The Transformer architecture has fundamentally changed the landscape of machine learning and artificial intelligence. Its innovative use of self-attention, combined with a flexible structure of encoders and decoders, enables it to capture complex relationships within data efficiently. As research continues to evolve, the Transformer architecture is likely to inspire new models and techniques, further pushing the boundaries of what can be achieved in various fields. The impact of Transformers is profound, and their versatility ensures they will remain central to advancements in AI for years to come.\n",
- "\n",
- "---\n",
- "\n",
- "The above content has been saved in the file **transformer_architecture_analysis.md**. If you have further requests or modifications, feel free to ask!\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mNoteTaker\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "================================\u001b[1m Human Message \u001b[0m=================================\n",
- "Name: NoteTaker\n",
- "\n",
- "Here's an in-depth analysis of the Transformer architecture, structured with a table of contents and elaborated sections:\n",
- "\n",
- "# Table of Contents\n",
- "\n",
- "1. **Introduction to Transformer Architecture**\n",
- "2. **Key Components of Transformer**\n",
- "3. **Self-Attention Mechanism**\n",
- "4. **Positional Encoding**\n",
- "5. **Multi-Head Attention**\n",
- "6. **Feed Forward Neural Networks**\n",
- "7. **Layer Normalization and Residual Connections**\n",
- "8. **Training Techniques and Optimizations**\n",
- "9. **Applications of Transformer Architecture**\n",
- "10. **Conclusion**\n",
- "\n",
- "---\n",
- "\n",
- "## 1. Introduction to Transformer Architecture\n",
- "The Transformer architecture was introduced in the paper \"Attention is All You Need\" by Vaswani et al. in 2017. It revolutionized natural language processing by eliminating the need for recurrent neural networks (RNNs). Instead of processing input sequentially, Transformers allow for parallelization, leading to faster training times. The core innovation of the Transformer is the self-attention mechanism, which enables the model to weigh the importance of different words in a sentence relative to one another. This allows Transformers to capture long-range dependencies in data more effectively than previous models.\n",
- "\n",
- "## 2. Key Components of Transformer\n",
- "The Transformer architecture consists of two main parts: the encoder and the decoder. The encoder processes the input data and generates a continuous representation, while the decoder produces the output sequence based on the encoder’s output. Each encoder and decoder is made up of a stack of identical layers, with the encoder typically having six layers and the decoder having a similar structure. The key components of each layer include multi-head self-attention, feed-forward neural networks, layer normalization, and residual connections. This layered approach enables the model to learn complex patterns and relationships in the data.\n",
- "\n",
- "## 3. Self-Attention Mechanism\n",
- "Self-attention is a pivotal feature of the Transformer architecture that allows the model to evaluate the relevance of different words in a sentence to each other. In this mechanism, each word in the input sequence is represented as a query, key, and value. The attention score is calculated by taking the dot product of the query with all keys, followed by a softmax operation to obtain the weights. These weights are then used to compute a weighted sum of the values, producing a context-aware representation of each word. This process allows the model to focus on relevant parts of the input when making predictions.\n",
- "\n",
- "## 4. Positional Encoding\n",
- "Since Transformers do not inherently understand the order of the input sequence due to their parallel processing nature, positional encoding is introduced to provide information about the relative position of words in a sentence. This encoding is added to the input embeddings to ensure that the model can differentiate between similar words in different contexts. Positional encodings can be generated using sinusoidal functions, where each dimension of the encoding corresponds to a different frequency. This method helps the model leverage the sequential information needed for tasks such as translation and summarization.\n",
- "\n",
- "## 5. Multi-Head Attention\n",
- "Multi-head attention enhances the self-attention mechanism by allowing the model to jointly attend to information from different representation subspaces at different positions. Instead of having a single set of attention weights, multi-head attention splits the input into multiple heads, each learning different aspects of the input. This is achieved by linearly transforming the input into multiple queries, keys, and values, which are then processed in parallel. The outputs from each head are concatenated and linearly transformed to produce the final output. This approach improves the model's ability to capture various relationships in the data.\n",
- "\n",
- "## 6. Feed Forward Neural Networks\n",
- "In addition to self-attention layers, each encoder and decoder layer contains a position-wise feed-forward neural network (FFN). This network consists of two linear transformations with a ReLU activation in between. The FFN processes each position independently and identically, allowing the model to learn complex representations at each layer. The use of non-linear activations in the FFN enables the model to capture more intricate patterns in the data. The dimensionality of the FFN is typically much larger than the input embeddings, allowing for richer feature extraction.\n",
- "\n",
- "## 7. Layer Normalization and Residual Connections\n",
- "Layer normalization is applied to stabilize the learning process and improve convergence during training. It normalizes the inputs across the features, helping to mitigate issues related to internal covariate shift. Residual connections are also employed, allowing gradients to flow through the network more easily, which is particularly beneficial in very deep networks. By adding the input of each sub-layer to its output, residual connections help prevent vanishing gradients and enable the model to learn more effectively. This combination of techniques enhances the overall performance and stability of the Transformer.\n",
- "\n",
- "## 8. Training Techniques and Optimizations\n",
- "Training Transformers involves several techniques to optimize performance. One common approach is to use the Adam optimizer with a warm-up learning rate schedule, which gradually increases the learning rate before decaying it. Additionally, techniques such as dropout and label smoothing are implemented to prevent overfitting and improve generalization. Large-scale pre-training on vast amounts of data followed by fine-tuning on specific tasks has proven effective for Transformer models. These optimization strategies contribute significantly to the state-of-the-art performance of Transformers in various applications.\n",
- "\n",
- "## 9. Applications of Transformer Architecture\n",
- "Transformers have been successfully applied across numerous domains beyond natural language processing, including computer vision, speech recognition, and even music generation. In NLP, they power models like BERT, GPT, and T5, excelling in tasks such as machine translation, text summarization, and sentiment analysis. In computer vision, Vision Transformers (ViTs) have demonstrated competitive performance against convolutional neural networks (CNNs). The flexibility and scalability of the Transformer architecture make it a preferred choice for many machine learning tasks, leading to ongoing research and development in this area.\n",
- "\n",
- "## 10. Conclusion\n",
- "The Transformer architecture has fundamentally changed the landscape of machine learning and artificial intelligence. Its innovative use of self-attention, combined with a flexible structure of encoders and decoders, enables it to capture complex relationships within data efficiently. As research continues to evolve, the Transformer architecture is likely to inspire new models and techniques, further pushing the boundaries of what can be achieved in various fields. The impact of Transformers is profound, and their versatility ensures they will remain central to advancements in AI for years to come.\n",
- "\n",
- "---\n",
- "\n",
- "The above content has been saved in the file **transformer_architecture_analysis.md**. If you have further requests or modifications, feel free to ask!\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\u001b[1;32mnext\u001b[0m:\n",
- "ChartGenerator\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "Tool Calls:\n",
- " read_document (call_9K89hCAgUjbJDhXBpfmUJby7)\n",
- " Call ID: call_9K89hCAgUjbJDhXBpfmUJby7\n",
- " Args:\n",
- " file_name: transformer_architecture_analysis.md\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mtools\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "=================================\u001b[1m Tool Message \u001b[0m=================================\n",
- "Name: read_document\n",
- "\n",
- "1. Introduction to Transformer Architecture\n",
- "\n",
- "2. Key Components of Transformer\n",
- "\n",
- "3. Self-Attention Mechanism\n",
- "\n",
- "4. Positional Encoding\n",
- "\n",
- "5. Multi-Head Attention\n",
- "\n",
- "6. Feed Forward Neural Networks\n",
- "\n",
- "7. Layer Normalization and Residual Connections\n",
- "\n",
- "8. Training Techniques and Optimizations\n",
- "\n",
- "9. Applications of Transformer Architecture\n",
- "\n",
- "10. Conclusion\n",
- "\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mChartGenerator\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "\n",
- "The in-depth analysis of the Transformer architecture has been successfully saved and includes the following content:\n",
- "\n",
- "# Table of Contents\n",
- "\n",
- "1. **Introduction to Transformer Architecture**\n",
- "2. **Key Components of Transformer**\n",
- "3. **Self-Attention Mechanism**\n",
- "4. **Positional Encoding**\n",
- "5. **Multi-Head Attention**\n",
- "6. **Feed Forward Neural Networks**\n",
- "7. **Layer Normalization and Residual Connections**\n",
- "8. **Training Techniques and Optimizations**\n",
- "9. **Applications of Transformer Architecture**\n",
- "10. **Conclusion**\n",
- "\n",
- "If you need any modifications, additional sections, or specific charts for better visual explanations, feel free to ask!\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mChartGenerator\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "================================\u001b[1m Human Message \u001b[0m=================================\n",
- "Name: ChartGenerator\n",
- "\n",
- "The in-depth analysis of the Transformer architecture has been successfully saved and includes the following content:\n",
- "\n",
- "# Table of Contents\n",
- "\n",
- "1. **Introduction to Transformer Architecture**\n",
- "2. **Key Components of Transformer**\n",
- "3. **Self-Attention Mechanism**\n",
- "4. **Positional Encoding**\n",
- "5. **Multi-Head Attention**\n",
- "6. **Feed Forward Neural Networks**\n",
- "7. **Layer Normalization and Residual Connections**\n",
- "8. **Training Techniques and Optimizations**\n",
- "9. **Applications of Transformer Architecture**\n",
- "10. **Conclusion**\n",
- "\n",
- "If you need any modifications, additional sections, or specific charts for better visual explanations, feel free to ask!\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"\u001b[1;32mnext\u001b[0m:\n",
"FINISH\n",
@@ -1875,7 +1293,7 @@
},
{
"cell_type": "code",
- "execution_count": 48,
+ "execution_count": 22,
"metadata": {
"id": "LIaF15xaMfby"
},
@@ -1912,7 +1330,7 @@
},
{
"cell_type": "code",
- "execution_count": 49,
+ "execution_count": 23,
"metadata": {
"id": "i6qRY6ReMulX"
},
@@ -1957,7 +1375,7 @@
},
{
"cell_type": "code",
- "execution_count": 50,
+ "execution_count": 24,
"metadata": {
"id": "fnPVUMA6MyQZ"
},
@@ -2004,9 +1422,14 @@
},
{
"cell_type": "code",
- "execution_count": 51,
+ "execution_count": 25,
"metadata": {
- "id": "GzvNFQ1-M0dR"
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 522
+ },
+ "id": "GzvNFQ1-M0dR",
+ "outputId": "abe38a1e-8026-4f3c-e6e5-03e7b557277a"
},
"outputs": [
{
@@ -2027,9 +1450,13 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 27,
"metadata": {
- "id": "nFts6LksM2H2"
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "nFts6LksM2H2",
+ "outputId": "d94fa7da-9bdf-4895-fc6a-6c6333d33fc4"
},
"outputs": [
{
@@ -2056,8 +1483,8 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"Tool Calls:\n",
- " tavily_search_results_json (call_NPfH6l3J8Z2KMdLmfj4hN3dW)\n",
- " Call ID: call_NPfH6l3J8Z2KMdLmfj4hN3dW\n",
+ " tavily_search_results_json (call_QoPI1KdwQJeLRDkdDDwAUqef)\n",
+ " Call ID: call_QoPI1KdwQJeLRDkdDDwAUqef\n",
" Args:\n",
" query: multi-agent architecture complex tasks report\n",
"==================================================\n",
@@ -2068,7 +1495,7 @@
"=================================\u001b[1m Tool Message \u001b[0m=================================\n",
"Name: tavily_search_results_json\n",
"\n",
- "[{\"url\": \"https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/\", \"content\": \"In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. Magentic-One uses a multi-agent architecture where a lead agent, the Orchestrator, plans, tracks progress, and re-plans to recover from errors. Throughout task execution, the Orchestrator also directs other specialized agents to perform\"}, {\"url\": \"https://www.researchgate.net/publication/385630524_Magentic-One_A_Generalist_Multi-Agent_System_for_Solving_Complex_Tasks\", \"content\": \"Magentic-One is a generalist multi-agen t system for autonomously completing complex tasks. The team's work is coordinated by an Orchestrator agen t, resp onsible for task decomposition\"}, {\"url\": \"https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/\", \"content\": \"Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks - Microsoft Research We are also releasing an open-source implementation of Magentic-One (opens in new tab) on Microsoft AutoGen, our popular open-source framework for developing multi-agent applications. We see valuable new directions in agentic, safety and Responsible AI research: In terms of anticipating new risks from agentic systems, it is possible that agents will be subject to the same phishing, social engineering, and misinformation attacks that target human web surfers when they are acting on the public web. To conclude, in this work we introduced Magentic-One, a generalist multi-agent system represents a significant development in agentic systems capable of solving open-ended tasks.\"}, {\"url\": \"https://arxiv.org/abs/2411.04468\", \"content\": \"[2411.04468] Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks Title:Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks View a PDF of the paper titled Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks, by Adam Fourney and 18 other authors In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. We provide an open-source implementation of Magentic-One, and we include AutoGenBench, a standalone tool for agentic evaluation. View a PDF of the paper titled Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks, by Adam Fourney and 18 other authors Bibliographic Explorer Toggle Connected Papers Toggle Litmaps Toggle scite.ai Toggle alphaXiv Toggle Links to Code Toggle DagsHub Toggle GotitPub Toggle Huggingface Toggle Links to Code Toggle ScienceCast Toggle Replicate Toggle\"}, {\"url\": \"https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132\", \"content\": \"To effectively address these challenges, we can design specific agents for tasks such as Market Basket Analysis, Clickstream Analysis, Inventory Analysis, Customer Analysis, and Product Recommendations. Working Mechanism: This agent uses the Assistant API to create custom SQL queries, extracting data from transaction logs and analyzing purchase correlations. Working Mechanism: This agent uses the Assistant API to pull data from CRM systems and customer feedback, analyzing user attributes and segmenting customers based on their behavior. Example: The agent collects data on popular products and customer preferences to recommend items that align with a user’s past behavior and interests. As last thoughts, the combination of RAG frameworks and Agentic AI can revolutionize user engagement by delivering personalization, improving communication, customer retention and ensuring quick problem resolution.\"}]\n",
+ "[{\"url\": \"https://blog.dragonscale.ai/architectures-for-ai-agents/\", \"content\": \"AI agents are systems capable of reasoning, planning, and executing tasks autonomously. MAS involve multiple agents, each potentially equipped with different language models and tools, working collaboratively to solve complex tasks. An MAS architecture can segment task execution into distinct phases, such as recruitment, decision-making, agent execution, and evaluation. The publish-subscribe mechanism further streamlines communication by allowing agents to subscribe only to the information pertinent to their tasks. Effective AI agents must possess robust reasoning abilities to interact with complex environments, make informed decisions, and adapt to new information dynamically. MAS excel at managing parallel tasks, allowing different agents to work on separate subproblems simultaneously. While basic agents are well-suited for straightforward tasks with clearly defined tools, they often fall short in more complex and dynamic environments.\"}, {\"url\": \"https://www.researchgate.net/publication/385630524_Magentic-One_A_Generalist_Multi-Agent_System_for_Solving_Complex_Tasks\", \"content\": \"Magentic-One is a generalist multi-agen t system for autonomously completing complex tasks. The team's work is coordinated by an Orchestrator agen t, resp onsible for task decomposition\"}, {\"url\": \"https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb\", \"content\": \"A simple framework for multi-agent systems allowing specialized agents to communicate and collaborate for multi-step tasks. ... leverage the strengths of individual agents, each with specialized roles and capabilities, to collaboratively achieve complex tasks. This article delves into the potential of LLM agents to drive the next wave of\"}, {\"url\": \"https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/\", \"content\": \"Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks - Microsoft Research We are also releasing an open-source implementation of Magentic-One (opens in new tab) on Microsoft AutoGen, our popular open-source framework for developing multi-agent applications. We see valuable new directions in agentic, safety and Responsible AI research: In terms of anticipating new risks from agentic systems, it is possible that agents will be subject to the same phishing, social engineering, and misinformation attacks that target human web surfers when they are acting on the public web. To conclude, in this work we introduced Magentic-One, a generalist multi-agent system represents a significant development in agentic systems capable of solving open-ended tasks.\"}, {\"url\": \"https://arxiv.org/abs/2411.04468\", \"content\": \"[2411.04468] Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks Title:Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks View a PDF of the paper titled Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks, by Adam Fourney and 18 other authors In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. We provide an open-source implementation of Magentic-One, and we include AutoGenBench, a standalone tool for agentic evaluation. View a PDF of the paper titled Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks, by Adam Fourney and 18 other authors Bibliographic Explorer Toggle Connected Papers Toggle Litmaps Toggle scite.ai Toggle alphaXiv Toggle Links to Code Toggle DagsHub Toggle GotitPub Toggle Huggingface Toggle Links to Code Toggle ScienceCast Toggle Replicate Toggle\"}]\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -2076,37 +1503,87 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "# Report on Multi-Agent Architecture for Complex Task Execution\n",
+ "\n",
+ "## Outline\n",
+ "\n",
+ "1. **Introduction**\n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks.\n",
+ " - Overview of the evolution of MAS and their applications in various fields.\n",
+ " - Importance of collaboration among agents in achieving task objectives.\n",
+ " - Brief mention of the structure of the report and what each section will cover.\n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
+ "\n",
+ "2. **Background**\n",
+ " - Historical context of multi-agent systems and their development.\n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation.\n",
+ " - Overview of different types of agents and their roles in MAS.\n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems.\n",
+ " - Summary of existing literature and research on MAS applications.\n",
+ "\n",
+ "3. **Methodology**\n",
+ " - Description of the design and implementation of a multi-agent architecture.\n",
+ " - Explanation of task decomposition and agent specialization.\n",
+ " - Overview of communication protocols and mechanisms used in MAS.\n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS.\n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
+ "\n",
+ "4. **Applications**\n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities.\n",
+ " - Detailed examples of successful MAS implementations in industry and research.\n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution.\n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT.\n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
+ "\n",
+ "5. **Challenges**\n",
+ " - Identification of common challenges faced in the development and deployment of MAS.\n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution.\n",
+ " - Examination of ethical considerations and safety concerns in MAS.\n",
+ " - Overview of technical limitations and scalability issues.\n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "6. **Conclusions**\n",
+ " - Summary of key findings from the report.\n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks.\n",
+ " - Recommendations for future research directions in MAS.\n",
+ " - Final thoughts on the potential impact of MAS on society and technology.\n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
"\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "## Detailed Content\n",
"\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "### 1. Introduction\n",
"\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
"\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "### 2. Background\n",
"\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
"\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "### 3. Methodology\n",
+ "\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
+ "\n",
+ "### 4. Applications\n",
+ "\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
+ "\n",
+ "### 5. Challenges\n",
+ "\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
+ "\n",
+ "### 6. Conclusions\n",
+ "\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
"## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Building a Multi-Agent System to Accomplish Complex Tasks. (2023). *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Architectures for AI Agents. (2023). *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)\n",
"\n",
"---\n",
"\n",
- "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "The final document will be saved as a `.md` file with all the content fully populated, including the references section in APA format at the end.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -2115,37 +1592,87 @@
"================================\u001b[1m Human Message \u001b[0m=================================\n",
"Name: Searcher\n",
"\n",
- "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "# Report on Multi-Agent Architecture for Complex Task Execution\n",
"\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "## Outline\n",
"\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "1. **Introduction**\n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks.\n",
+ " - Overview of the evolution of MAS and their applications in various fields.\n",
+ " - Importance of collaboration among agents in achieving task objectives.\n",
+ " - Brief mention of the structure of the report and what each section will cover.\n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
"\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "2. **Background**\n",
+ " - Historical context of multi-agent systems and their development.\n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation.\n",
+ " - Overview of different types of agents and their roles in MAS.\n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems.\n",
+ " - Summary of existing literature and research on MAS applications.\n",
"\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "3. **Methodology**\n",
+ " - Description of the design and implementation of a multi-agent architecture.\n",
+ " - Explanation of task decomposition and agent specialization.\n",
+ " - Overview of communication protocols and mechanisms used in MAS.\n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS.\n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
"\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "4. **Applications**\n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities.\n",
+ " - Detailed examples of successful MAS implementations in industry and research.\n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution.\n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT.\n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
"\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "5. **Challenges**\n",
+ " - Identification of common challenges faced in the development and deployment of MAS.\n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution.\n",
+ " - Examination of ethical considerations and safety concerns in MAS.\n",
+ " - Overview of technical limitations and scalability issues.\n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "6. **Conclusions**\n",
+ " - Summary of key findings from the report.\n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks.\n",
+ " - Recommendations for future research directions in MAS.\n",
+ " - Final thoughts on the potential impact of MAS on society and technology.\n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
+ "\n",
+ "## Detailed Content\n",
+ "\n",
+ "### 1. Introduction\n",
+ "\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
+ "\n",
+ "### 2. Background\n",
+ "\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
+ "\n",
+ "### 3. Methodology\n",
+ "\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
+ "\n",
+ "### 4. Applications\n",
+ "\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
+ "\n",
+ "### 5. Challenges\n",
+ "\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
+ "\n",
+ "### 6. Conclusions\n",
+ "\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
"## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "\n",
+ "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Building a Multi-Agent System to Accomplish Complex Tasks. (2023). *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Architectures for AI Agents. (2023). *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)\n",
"\n",
"---\n",
"\n",
- "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "The final document will be saved as a `.md` file with all the content fully populated, including the references section in APA format at the end.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -2160,10 +1687,10 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"Tool Calls:\n",
- " scrape_webpages (call_HgRTXdKWxLszfiZCSYaFqjQg)\n",
- " Call ID: call_HgRTXdKWxLszfiZCSYaFqjQg\n",
+ " scrape_webpages (call_v6TogFP9sqTAwkZCNvKFLdaf)\n",
+ " Call ID: call_v6TogFP9sqTAwkZCNvKFLdaf\n",
" Args:\n",
- " urls: ['https://arxiv.org/', 'https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/', 'https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132']\n",
+ " urls: ['https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/', 'https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb', 'https://blog.dragonscale.ai/architectures-for-ai-agents/']\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -2172,1835 +1699,391 @@
"=================================\u001b[1m Tool Message \u001b[0m=================================\n",
"Name: scrape_webpages\n",
"\n",
- "\n",
+ "\n",
+ "Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks - Microsoft Research Skip to main content Microsoft Research Research Research Home Our research Resources Resources Publications Code & data People Microsoft Research blog Research areas: Intelligence Research areas: Intelligence Artificial intelligence Audio & acoustics Computer vision Graphics & multimedia Human-computer interaction Human language technologies Search & information retrieval Research areas: Systems Research areas: Systems Data platforms and analytics Hardware & devices Programming languages & software engineering Quantum computing Security, privacy & cryptography Systems & networking Research areas: Theory Research areas: Theory Algorithms Mathematics Research areas: Other Sciences Research areas: Other Sciences Ecology & environment Economics Medical, health & genomics Social sciences Technology for emerging markets Programs & events Academic programs Events & academic conferences Microsoft Research Forum Connect & learn Behind the Tech podcast Microsoft Research blog Microsoft Research Forum Microsoft Research podcast About People & news People & news About Microsoft Research Careers & internships People Emeritus program News & awards Microsoft Research newsletter Microsoft Research Labs Microsoft Research Labs Africa AI for Science AI Frontiers Asia-Pacific Cambridge Health Futures India Montreal New England New York City Redmond Other labs Other labs Applied Sciences Mixed Reality & AI - Cambridge Mixed Reality & AI - Zurich More Register: Research Forum All Microsoft Global Microsoft Security Azure Dynamics 365 Microsoft 365 Microsoft Teams Windows 365 Tech & innovation Tech & innovation Microsoft Cloud AI Azure Space Mixed reality Microsoft HoloLens Microsoft Viva Quantum computing Sustainability Industries Industries Education Automotive Financial services Government Healthcare Manufacturing Retail All industries Partners Partners Find a partner Become a partner Partner Network Azure Marketplace AppSource Resources Resources Blog Microsoft Advertising Developer Center Documentation Events Licensing Microsoft Learn Microsoft Research View Sitemap Search Search Microsoft Research No results Cancel AI Frontiers AI Frontiers AI Frontiers blog Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks Published November 4, 2024 | Updated November 12, 2024 Share this page Share on Facebook Share on X Share on LinkedIn Share on Reddit Subscribe to our RSS feed By Adam Fourney, Principal Researcher; Gagan Bansal, Senior Researcher; Hussein Mozannar, Senior Researcher; Victor Dibia, Principal Research Software Engineer; Saleema Amershi, Partner Research Manager Contributors: Adam Fourney, Gagan Bansal, Hussein Mozannar, Cheng Tan, Eduardo Salinas, Erkang (Eric) Zhu, Friederike Niedtner, Grace Proebsting, Griffin Bassman, Jack Gerrits, Jacob Alber, Peter Chang, Ricky Loynd, Robert West, Victor Dibia, Ahmed Awadallah, Ece Kamar, Rafah Hosn, Saleema Amershi We are introducing Magentic-One, our new generalist multi-agent system for solving open-ended web and file-based tasks across a variety of domains. Magentic-One represents a significant step towards developing agents that can complete tasks that people encounter in their work and personal lives. We are also releasing an open-source implementation of Magentic-One (opens in new tab) on Microsoft AutoGen, our popular open-source framework for developing multi-agent applications. The future of AI is agentic. AI systems are evolving from having conversations to getting things done—this is where we expect much of AI’s value to shine. It’s the difference between generative AI recommending dinner options to agentic assistants that can autonomously place your order and arrange delivery. It’s the shift from summarizing research papers to actively searching for and organizing relevant studies in a comprehensive literature review. Modern AI agents, capable of perceiving, reasoning, and acting on our behalf, are demonstrating remarkable performance in areas such as software engineering, data analysis, scientific research, and web navigation. Still, to fully realize the long-held vision of agentic systems that can enhance our productivity and transform our lives, we need advances in generalist agentic systems. These systems must reliably complete complex, multi-step tasks across a wide range of scenarios people encounter in their daily lives. Introducing Magentic-One (opens in new tab), a high-performing generalist agentic system designed to solve such tasks. Magentic-One employs a multi-agent architecture where a lead agent, the Orchestrator, directs four other agents to solve tasks. The Orchestrator plans, tracks progress, and re-plans to recover from errors, while directing specialized agents to perform tasks like operating a web browser, navigating local files, or writing and executing Python code. Magentic-One achieves statistically competitive performance to the state-of-the-art on multiple challenging agentic benchmarks, without requiring modifications to its core capabilities or architecture. Built on AutoGen (opens in new tab), our popular open-source multi-agent framework, Magentic-One’s modular, multi-agent design offers numerous advantages over monolithic single-agent systems. By encapsulating distinct skills in separate agents, it simplifies development and reuse, similar to object-oriented programming. Magentic-One’s plug-and-play design further supports easy adaptation and extensibility by enabling agents to be added or removed without needing to rework the entire system—unlike single-agent systems, which often struggle with inflexible workflows. We’re making Magentic-One open-source (opens in new tab) for researchers and developers. While Magentic-One shows strong generalist capabilities, it’s still far from human-level performance and can make mistakes. Moreover, as agentic systems grow more powerful, their risks—like taking undesirable actions or enabling malicious use-cases—can also increase. While we’re still in the early days of modern agentic AI, we’re inviting the community to help tackle these open challenges and ensure our future agentic systems are both helpful and safe. To this end, we’re also releasing AutoGenBench (opens in new tab), an agentic evaluation tool with built-in controls for repetition and isolation to rigorously test agentic benchmarks and tasks while minimizing undesirable side-effects. Code on GitHub Read the technical report How it works Magentic-One features an Orchestrator agent that implements two loops: an outer loop and an inner loop. The outer loop (lighter background with solid arrows) manages the task ledger (containing facts, guesses, and plan) and the inner loop (darker background with dotted arrows) manages the progress ledger (containing current progress, task assignment to agents). Magentic-One work is based on a multi-agent architecture where a lead Orchestrator agent is responsible for high-level planning, directing other agents and tracking task progress. The Orchestrator begins by creating a plan to tackle the task, gathering needed facts and educated guesses in a Task Ledger that is maintained. At each step of its plan, the Orchestrator creates a Progress Ledger where it self-reflects on task progress and checks whether the task is completed. If the task is not yet completed, it assigns one of Magentic-One other agents a subtask to complete. After the assigned agent completes its subtask, the Orchestrator updates the Progress Ledger and continues in this way until the task is complete. If the Orchestrator finds that progress is not being made for enough steps, it can update the Task Ledger and create a new plan. This is illustrated in the figure above; the Orchestrator work is thus divided into an outer loop where it updates the Task Ledger and an inner loop to update the Progress Ledger. Magentic-One consists of the following agents: Orchestrator: The lead agent responsible for task decomposition, planning, directing other agents in executing subtasks, tracking overall progress, and taking corrective actions as needed WebSurfer: An LLM-based agent proficient in commanding and managing the state of a Chromium-based web browser. For each request, the WebSurfer performs actions such as navigation (e.g., visiting URLs, performing searches), interacting with webpages (e.g., clicking, typing), and reading actions (e.g., summarizing, answering questions). It then reports on the new state of the webpage. The WebSurfer relies on the browser’s accessibility tree and set-of-marks prompting to perform its tasks. FileSurfer: An LLM-based agent that commands a markdown-based file preview application to read local files. It can also perform common navigation tasks such as listing directory contents and navigating through them. Coder: An LLM-based agent specialized in writing code, analyzing information collected from the other agents, and creating new artifacts. ComputerTerminal: Provides access to a console shell for executing programs and installing new libraries. Together, Magentic-One’s agents equip the Orchestrator with the tools and capabilities it needs to solve a wide range of open-ended problems and autonomously adapt to, and act in, dynamic and ever-changing web and file-system environments. While the default multimodal LLM used for all agents is GPT-4o, Magentic-One is model-agnostic, allowing the integration of heterogeneous models to support different capabilities or meet different cost requirements. For example, different LLMs and SLMs or specialized versions can power different agents. For the Orchestrator, we recommend a strong reasoning model, like GPT-4o. In a different configuration, we also experimented with using OpenAI o1-preview for the Orchestrator’s outer loop and for the Coder, while other agents continued to use GPT-4o. Evaluation To rigorously evaluate Magentic-One’s performance, we introduce AutoGenBench, an open-source standalone tool for running agentic benchmarks that allows repetition and isolation, e.g., to control for variance of stochastic LLM calls and side-effects of agents taking actions in the world. AutoGenBench facilitates agentic evaluation and allows adding new benchmarks. Using AutoGenBench, we can evaluate Magentic-One on a variety of benchmarks. Our criterion for selecting benchmarks is that they should involve complex multi-step tasks, with at least some steps requiring planning and tool use, including using web browsers to act on real or simulated webpages. We consider three benchmarks in this work that satisfy this criterion: GAIA, AssistantBench, and WebArena. In the Figure below we show the performance of Magentic-One on the three benchmarks and compare with GPT-4 operating on its own and the per-benchmark highest-performing open-source baseline and non open-source benchmark specific baseline according to the public leaderboards as of October 21, 2024. Magentic-One (GPT-4o, o1) achieves statistically comparable performance to previous SOTA methods on both GAIA and AssistantBench and competitive performance on WebArena. Note that GAIA and AssistantBench have a hidden test set while WebArena does not, and thus WebArena results are self-reported. Together, these results establish Magentic-One as a strong generalist agentic system for completing complex tasks. Evaluation results of Magentic-One on the GAIA, AssistantBench and WebArena. Error bars indicate 95% confidence intervals. Note that WebArena results are self-reported. Risks and mitigations Agentic systems like Magentic-One mark a significant shift in both the opportunities and risks associated with AI. Magentic-One interacts with a digital world designed for humans, taking actions that can change states and potentially lead to irreversible consequences. These inherent and undeniable risks were evident during our testing, where several emerging issues surfaced. For example, during development, a misconfiguration led agents to repeatedly attempt and fail to log into a WebArena website. This resulted in the account being temporarily suspended. The agents then tried to reset the account’s password. Even more concerning were cases in which agents, until explicitly stopped, attempted to recruit human assistance by posting on social media, emailing textbook authors, or even drafting a freedom of information request to a government entity. In each case, the agents were unsuccessful due to a lack of the required tools or accounts, or because human observers intervened. Aligned with the Microsoft AI principles and Responsible AI practices, we worked to identify, measure, and mitigate potential risks before deploying Magentic-One. Specifically, we conducted red-teaming exercises to assess risks related to harmful content, jailbreaks, and prompt injection attacks, finding no increased risk from our design. Additionally, we provide cautionary notices and guidance for using Magentic-One safely, including examples and appropriate default settings. Users are advised to keep humans in the loop for monitoring, and ensure that all code execution examples, evaluations, and benchmarking tools are run in sandboxed Docker containers to minimize risks. Recommendations and looking forward We recommend using Magentic-One with models that have strong alignment, pre- and post-generation filtering, and closely monitored logs during and after execution. In our own use, we follow the principles of least privilege and maximum oversight. Minimizing risks associated with agentic AI will require new ideas and extensive research, as much work is still needed to understand these emerging risks and develop effective mitigations. We are committed to sharing our learnings with the community and evolving Magentic-One in line with the latest safety research. As we look ahead, there are valuable opportunities to improve agentic AI, particularly in safety and Responsible AI research. Agents acting on the public web may be vulnerable to phishing, social engineering, and misinformation threats, much like human users. To counter these risks, an important direction is to equip agents with the ability to assess the reversibility of their actions—distinguishing between those that are easily reversible, those that require effort, and those that are irreversible. Actions like deleting files, sending emails, or filing forms are often difficult or impossible to undo. Systems should therefore be designed to pause and seek human input before proceeding with such high-risk actions. We invite the community to collaborate with us in ensuring that future agentic systems are both helpful and safe. For further information, results and discussion, please see our technical report. (opens in new tab) Opens in a new tab Continue reading January 2, 2025 RD-Agent: An open-source solution for smarter R&D December 4, 2024 Towards industrial foundation models: Integrating large language models with industrial data intelligence December 3, 2024 Theoretical foundation of large language models: Microsoft Research Asia StarTrack Scholars 2025 enhancing the power of LLMs November 28, 2024 NeurIPS 2024 Papers from Microsoft Research Asia See all blog posts Research Areas Artificial intelligence Related labs AI Frontiers Follow us: Follow on X Like on Facebook Follow on LinkedIn Subscribe on Youtube Follow on Instagram Subscribe to our RSS feed Share this page: Share on X Share on Facebook Share on LinkedIn Share on Reddit What's new Surface Pro Surface Laptop Surface Laptop Studio 2 Surface Laptop Go 3 Microsoft Copilot AI in Windows Explore Microsoft products Windows 11 apps Microsoft Store Account profile Download Center Microsoft Store support Returns Order tracking Certified Refurbished Microsoft Store Promise Flexible Payments Education Microsoft in education Devices for education Microsoft Teams for Education Microsoft 365 Education How to buy for your school Educator training and development Deals for students and parents Azure for students Business Microsoft Cloud Microsoft Security Dynamics 365 Microsoft 365 Microsoft Power Platform Microsoft Teams Microsoft 365 Copilot Small Business Developer & IT Azure Microsoft Developer Documentation Microsoft Learn Microsoft Tech Community Azure Marketplace AppSource Visual Studio Company Careers About Microsoft Company news Privacy at Microsoft Investors Diversity and inclusion Accessibility Sustainability Your Privacy Choices Opt-Out Icon Your Privacy Choices Your Privacy Choices Opt-Out Icon Your Privacy Choices Consumer Health Privacy Sitemap Contact Microsoft Privacy Manage cookies Terms of use Trademarks Safety & eco Recycling About our ads © Microsoft 2025\n",
+ "\n",
"\n",
+ "\n",
+ "Building a Multi-Agent System to Accomplish Complex Tasks | by Najib Sharifi, Ph.D. | Towards AIOpen in appSign upSign inWriteSign upSign inBuilding a Multi-Agent System to Accomplish Complex TasksA simple framework for multi-agent systems allowing specialized agents to communicate and collaborate for multi-step tasks.Najib Sharifi, Ph.D.·FollowPublished inTowards AI·8 min read·May 24, 2024--ListenShareWhen ChatGPT first arrived, it was game-changing. Now, it is used by people in all sectors and lines of work. ChatGPT demonstrated the strength of these machine learning models that most of us thought was not possible anytime soon. Whilst these LLMs have become increasingly more powerful in their capabilities, however, a very exciting development with immense potential is the use of multi-agents systems. For example, Devine AI, the first autonomous AI software engineer, is based multi-agent framework.A Climate AnalystIf you ask chatgpt to write you an article about any topic, there are several problems which result in chatgpt not producing a good report such as no access to up-to-date data about the topic, which can lead to hallucinations. What if we break down this complex to individual tasks? Consider a climate change analyst writing a report on the latest environmental trends; he/she would need to do a number of tasks (I appreciate this may be a simplification of the role but this is just for a demonstration purpose):· Research to find out all the key data from reliable sources.· Analyse all the resultant data and extract key interpretations of the data.· Write a report explaining the findings.· The report would then get peer-reviewed to ensure the scientific report is accurate and the findings are supported by the data presented.What if we have specialized agents for each task? i.e. one agent is the researcher, and another agent behaves as the analyst to analyze the data found, another agent is the writer, and a 4th agent is the critic who will ensure that the article findings are supported by the data presented (but hopefully unlikely real scientists, it won’t ask you to reference their completely irrelevant work in your article). These systems leverage the strengths of individual agents, each with specialized roles and capabilities, to collaboratively achieve complex tasks. This article delves into the potential of LLM agents to drive the next wave of developments, demonstrating their capabilities through a practical example of building a multiagent system.Building a Multiagent System: Climate Change AnalystYou can build multiagent systems using frameworks like CrewAI, the work demonstrated in this article is nothing but a humble attempt at building a very simple framework for the multiagent system. How these agents communicate, remember (they have memory, short and long!) and are coordinated are crucial to their performance. The aim of this article is to set up a simple framework, by doing so, we can gain a deeper understanding of these systems rather than importing everything from a readymade library and treating it as a Blackbox. We are going to build a system that can write an article on the latest trends in climate change, as mentioned above. We will develop a team of specialized agents that can research, analyze, write a scientific report, and peer review that report.Image generated with copilate.Lets Dive in! Setting up the environment and importing key libraries. We need to provide the agent doing the research a tool. This tool will allow the agent to google search through an API, I will use the crewai library SerperDevTool to do this.import osimport openaifrom openai import OpenAIfrom crewai_tools import SerperDevToolos.environ[\"SERPER_API_KEY\"] = \"your serper api key goes here\"os.environ[\"OPENAI_API_KEY\"] = \"your open AI key goes here\"client = OpenAI(api_key=os.getenv(\"OPENAI_API_KEY\"))search_tool = SerperDevTool(api_key=os.getenv(\"SERPER_API_KEY\"))Defining the agent class, each agent will have a specialised role, goal and backstory which we will assign later. The agent is able to store the task queried as well as the corresponding output, enabling short term memory. It can also message all other agents, as well as read messages sent from other agents.class Agent: def __init__(self, name, role, backstory, goal, tools=None): self.name = name self.backstory = backstory self.goal = goal self.role = role self.memory = [] self.tools = tools if tools else [] self.message_box = [] # adding memory for the agent to store recent tasks and outputs def add_to_memory(self, entry): self.memory.append(entry) # sending messages to other agents def send_message(self, recipient, message): recipient.message_box.append((self.name, message)) # reading the messages sent from other agents before performing task # this is done by removing messages from message box and added to memory def read_messages(self): while self.message_box: sender, message = self.message_box.pop(0) self.add_to_memory(f\"message from the {sender}: {message}\") # we now define the function that will do the task assigned # reading messages and adding task to the memory first # the agent will take up the specialised role assigned and querry gpt3.5 def do_task(self, task, inputs): self.read_messages() task_info = task.info self.add_to_memory(f\"doing task: {task_info}\") '''for the research agent, the search_tool will be assigned to the agent which it will be able to use to do a google search online''' if 'search_tool' in self.tools: search_query = task_info search_results = search_tool.run(query=search_query) inputs['search_results'] = search_results task_info += f\"\\n\\nsearch results:\\n{search_results}\" llm_response = client.chat.completions.create( model=\"gpt-3.5-turbo\", messages=[ {\"role\": \"system\", \"content\": f\"you are a {self.role}. {self.backstory} Your goal is {self.goal}.\"}, {\"role\": \"user\", \"content\": task_info} ] ) output = llm_response.choices[0].message.content self.add_to_memory(f\"task output: {output}\") return outputThe Architect, whose purpose is to (besides creating the matrix of course) assign the tasks to the corresponding agents and coordinate the flow of information between the agents. In this framework, besides the messaging between agents, the work is in a sequential form i.e. work from each agent is passed to the next one, there is no delegation or iteration of the tasks for the agents. However, in crewai framework, I think it has these properties which make it very powerful in its capabilities.class TheArchitect: def __init__(self, agents, tasks): # dictionary of all agents based on name self.agents = {agent.name: agent for agent in agents} self.tasks = tasks def process(self, inputs): results = {} current_result = None for task in self.tasks: task_agent = self.agents[task.agent.name] '''to help with debugging and also checking flow of info we can check/print which tasks are assigned to which agent''' print(f\"assignin task {task.name} to agent {task_agent.name}: {task.info}\") if current_result: inputs['previous_result'] = current_result if 'search' in inputs: search_query = inputs['search'] search_results = search_tool.run(query=search_query) inputs['search_results'] = search_results agent_output = task_agent.do_task(task, inputs) current_result = agent_output # send the agent's output as a message to all other agents for agent_name, agent in self.agents.items(): if agent_name != task_agent.name: task_agent.send_message(agent, agent_output) results[task.agent.name] = agent_output return resultsNow that we have defined the agents class and the architect class, lets create instances of these classes to define different agents with different roles.We can now define all the agents and give them names, roles, goals, backstories and tools. These agents only data collectors, data scientists, and report writers. In this case, we only have the researcher/data collector with a tool.data_collector = Agent( name=\"researcher\", role=\"Climate Data Collector\", goal=\"Collect comprehensive climate data from multiple sources.\", backstory=\"You gather climate data on temperature, carbon dioxide levels and other variables relevant to climate change, from reliable sources.\", tools=['search_tool'])data_analyst = Agent( name=\"Data Scientist\", role=\"Climate Data Scientist\", goal=\"Analyse the collected climate data to identify significant trends.\", backstory=\"You analyse climate data to find significant trends and understand the impact of various factors on climate change.\", tools=[])report_writer = Agent( name=\"Report Writer\", role=\"Senior Scientific Report Writer\", goal=\"Generate a comprehensive report on climate change findings.\", backstory=\"You write detailed scientific reports based on the analysed climate data, highlighting key findings and implications.\", tools=[])peer_reviewer = Agent( name=\"Peer Reviewer\", role=\"Scientific Peer Reviewer\", goal=\"Review the scientific report for accuracy, clarity, and completeness.\", backstory=\"You review scientific reports to ensure they are accurate, clear, and meet the standards for scientific publication.\", tools=[])final_report_writer = Agent( name=\"Final Report Writer\", role=\"Final Report Writer\", goal=\"Incorporate peer review feedback and finalize the scientific report.\", backstory=\"You finalize the scientific report by incorporating feedback from peer reviewer and ensure it is publication ready.\", tools=[])We need to split any problem into a series of tasks and assign an agent to each one. The information and expected_output is critical to getting the agent to behave and output what you desire from the agent.analyse_data = Task( info=( \"Using the following climate data, analyze for trends and patterns:\\n{previous_result}\\n\" \"1. Identify significant trends in temperature, CO2 levels, and precipitation.\\n\" \"2. Determine potential causes of observed trends.\\n\" \"3. Summarize key findings in a detailed analysis report.\" ), expected_output=\"Detailed analysis report on climate data trends and potential causes.\", agent=data_analyst, name=\"Data Analysis\")write_report = Task( info=( \"Using the following analysis report, write a comprehensive scientific report on climate change findings:\\n{previous_result}\\n\" \"1. Include an introduction, methodology, results, discussion, and conclusion.\\n\" \"2. Use clear and precise language suitable for a scientific audience.\\n\" \"3. Ensure all findings are supported by data and analysis.\" ), expected_output=\"Comprehensive scientific report on climate change findings.\", agent=report_writer, name=\"Report Writing\")review_report = Task( info=( \"Using the following scientific report, review for accuracy, clarity, and completeness:\\n{previous_result}\\n\" \"1. Ensure the report adheres to scientific standards.\\n\" \"2. Check for any errors or inaccuracies in data and analysis.\\n\" \"3. Provide feedback and suggestions for improvement.\" ), expected_output=\"Reviewed and revised scientific report, ready for publication.\", agent=peer_reviewer, name=\"Peer Review\")finalize_report = Task( info=( \"Using the following peer-reviewed report, incorporate feedback and finalize the scientific report:\\n{previous_result}\\n\" \"1. Address all feedback and suggestions provided by the peer reviewer.\\n\" \"2. Ensure the report is polished and ready for publication.\\n\" \"3. Provide the final version of the scientific report.\" ), expected_output=\"Finalized scientific report, ready for publication.\", agent=final_report_writer, name=\"Finalize Report\")Let’s bring it all together now. We can now create a system of agents and tasks and run it.ClimateResearchSystem = TheArchitect( agents=[data_collector, data_analyst, report_writer, peer_reviewer, final_report_writer], tasks=[collect_data, analyse_data, write_report, review_report, finalize_report])result = ClimateResearchSystem.process(inputs={ \"topic\": \"Climate Change\", \"search\": \"latest climate data trends\"})The final report, lets visualise the output of the multiagent system using markdown. The key question is, is this report any better than using chatgpt (no agents) to write a scientific report on climate trends?from IPython.display import MarkdownMarkdown(result['Final Report Writer'])This article could be significantly improved if you provide it with better tools; for example, providing some mathematical tools for data analysis could allow the agent to perform numerical analysis and present findings. These systems will only be as good as the tools you provide it with, which in this case, we have not provided any, besides a tool to search the internet. That brings yes to the end of the article; thank you for taking the time to read it, I hope you found it insightful! If you are interested, definitely explore these multiagent systems for a range of different problems, it’s not just about writing reports but with the right tools defined, these agents can be very powerful. This topic is still under heavy development, it is very exciting to see how it will develop.Unless otherwise noted, all images are by the author----FollowPublished in Towards AI73K Followers·Last published 3 hours agoThe leading AI community and content platform focused on making AI accessible to all. Check out our new course platform: https://academy.towardsai.net/courses/beginner-to-advanced-llm-devFollowFollowWritten by Najib Sharifi, Ph.D.404 Followers·3 FollowingPhD in Molecular Chemistry | Machine Learning Engineer.FollowNo responses yetHelpStatusAboutCareersPressBlogPrivacyTermsText to speechTeams\n",
+ "\n",
"\n",
- " arXiv.org e-Print archive\n",
+ "\n",
+ "Architectures for AI Agents: From Basic to Multi-Agent Systems All Posts Subscribe In AI — Jun 26, 2024 Architectures for AI Agents By: Stephen Pimentel 6 min read From simple to complexAI agents are systems capable of reasoning, planning, and executing tasks autonomously. Unlike more static AI models, agents interact dynamically with their environment, adapting to new information and leveraging tools to accomplish complex objectives. Let's take a look at AI agent architectures and examine key considerations for their effective use. We'll start with basic agents and then move on to multi-agent systems (MAS).Basic agentsAgents utilize a model, such as a Large Language Model (LLM), to manage and execute tasks. These systems leverage the extensive language comprehension and generation capabilities of current models. By integrating planning, reasoning, and tool execution within a single framework, agents are designed to handle a wide array of tasks autonomously. These agents typically operate in a structured manner, continuously refining their approach until they achieve the desired outcome.Basic agents excel in environments where tasks are well-defined and require minimal feedback from external sources. Their streamlined architecture makes them easier to implement and manage. This simplicity translates to higher efficiency and consistency in executing straightforward function calls. For instance, tasks such as personal news aggregation, where the system compiles and summarizes news articles based on predefined criteria, are well-suited to basic agents. The agent can independently gather data, evaluate its relevance, and refine its output, ensuring a high level of precision and control.At their most sophisticated, basic agents can integrate planning, acting, and reasoning using algorithms such as Monte Carlo Tree Search (MCTS). This method uses heuristic-based search to explore various options, and a state evaluator to choose the best action:While such architectures can produce excellent results on simpler benchmarks, they are resource-intensive and may not perform as well on more complex tasks.Despite their strengths, basic agents face significant challenges. One limitation is their propensity to get stuck in execution loops, especially when tasked with complex, multifaceted problems. Without the ability to receive feedback from other agents, a basic agent may repetitively generate the same actions, failing to progress towards the goal. Additionally, these systems may struggle with tasks requiring robust reasoning and refinement capabilities, as they lack the collaborative input that MAS provide. This limitation can lead to suboptimal outcomes, particularly in dynamic environments where adaptability and diverse perspectives are crucial.For example, in scenarios like complex event planning, where multiple aspects such as venue selection, catering, and scheduling need to be managed simultaneously, a basic agent might falter. The absence of collaborative problem-solving can result in inefficiencies and errors, pointing to the need for MAS in such contexts.Multi-Agent Systems (MAS)MAS involve multiple agents, each potentially equipped with different language models and tools, working collaboratively to solve complex tasks. These systems simulate the dynamic interactions found in human teams, where each agent can contribute uniquely based on its specialized capabilities. For example, some agents might focus on data retrieval, while others handle analysis and report generation.One of the primary strengths of MAS is their ability to handle complex tasks that require collaboration and parallel processing. This is particularly effective for problems that involve multiple distinct execution paths, where different agents can work concurrently to expedite the process. For instance, in a complex research task, one agent might gather relevant literature while another synthesizes the information, and yet another drafts a summary, all working simultaneously.Additionally, MAS can leverage diverse expertise. By integrating agents with different specializations, the system can provide more comprehensive solutions than a basic agent. This diversity fosters robust problem-solving capabilities, enabling the system to adapt and respond to varied and unexpected challenges.There are many MAS architectures, but they tend to employ two primary design principles: leader-follower and peer-to-peer. With leader-follower designs, a lead agent coordinates the activities of follower agents. This hierarchical approach ensures a clear division of labor, with each agent reporting back to the leader. While this can streamline decision-making and task allocation, it also risks creating information bottlenecks if the lead agent fails to effectively disseminate critical information.With peer-to-peer designs, all agents operate on an equal footing, sharing information and decisions via message-passing. This egalitarian approach encourages collaboration and feedback. However, it can also lead to inefficiencies if agents engage in irrelevant communication, making it important to implement filtering and prioritization mechanisms.Let's look at a few architectures making use of these principles.MAS architecturesStructured teamsAgents can be structured to work in teams, with a particular focus on organized communication and leadership. The architecture typically includes modules for configuration, perception, memory, and execution, enabling agents to translate environmental observations into actions effectively.The designated leader coordinates the actions of other agents, significantly improving task efficiency and reducing communication overhead. The leadership structure helps mitigate issues related to redundant messaging and disordered decision-making, common pitfalls in multi-agent cooperation. Structured teams can further improve their efficiency by continuously evaluating and optimizing their structure and communication patterns.Dynamic teamsMAS can be structured in dynamic teams for handling complex reasoning and code generation tasks. The architecture assigns roles to agents based on their contributions and performance, ensuring that only the most effective agents are engaged in subsequent rounds of task execution. This peer-to-peer structure, devoid of a central leader, fosters an environment where agents can share information freely and adapt their strategies in real-time.Dynamic teams allow for high flexibility and responsiveness, crucial for tasks that require continual adjustment and optimization.Phased executionAn MAS architecture can segment task execution into distinct phases, such as recruitment, decision-making, agent execution, and evaluation. This phased approach is versatile, accommodating both leader-follower and peer-to-peer structures depending on the task requirements.In the recruitment phase, agents are selected or removed based on their relevance to the task at hand. During decision-making, agents discuss and plan their approach, leveraging diverse perspectives to refine their strategy. This phase is followed by agent execution, where each agent independently performs its designated role. Finally, the evaluation phase involves assessing the outcomes and adjusting the team composition and strategies as needed.This phased approach attempts to get the right agents engaged at the right times, enhancing the overall efficiency and effectiveness of the team.Publish-subscribe communicationTo avoid unproductive chatter in MAS, a design can enforce structured outputs and utilize a publish-subscribe mechanism for information sharing. Instead of engaging in free-form conversation, agents produce structured messages, which are then shared in a controlled manner. This approach significantly reduces unnecessary communication and ensures that all agents have access to relevant information.The publish-subscribe mechanism further streamlines communication by allowing agents to subscribe only to the information pertinent to their tasks. This reduces cognitive load and improves focus, leading to more efficient task execution. Publish-subscribe communication can work particularly well in scenarios requiring extensive coordination and knowledge synthesis.Approaches to reasoningEffective AI agents must possess robust reasoning abilities to interact with complex environments, make informed decisions, and adapt to new information dynamically. Reasoning is fundamental to cognition, enabling agents to simulate human-like decision-making processes, thereby improving their problem-solving capabilities. There are several approaches to reasoning.Task decomposition involves breaking down a complex task into smaller, manageable sub-tasks. By tackling each sub-task individually, agents can simplify problem solving, making it easier to achieve the overall objective. Task decomposition is particularly useful in scenarios where tasks are inherently hierarchical or sequential.Multiple plan selection involves generating multiple potential plans for a given task and then selecting the optimal one based on predefined criteria. Multiple plan selection allows agents to explore various strategies and choose the best path forward, enhancing flexibility and adaptability.Memory-augmented planning leverages memory to retain context and historical information. This enables agents to make informed decisions based on past experiences and adapt their strategies accordingly. By storing and retrieving relevant information, agents can improve their performance in tasks that require sustained attention and contextual understanding.Agents frequently need to interact with external tools to solve complex problems, which often requires multiple iterations of reasoning, recall, and reflection. Tool-calling enhances the agent’s capabilities by providing access to specialized functions that extend beyond the built-in capabilities of a model.The advantage of parallelismMAS excel at managing parallel tasks, allowing different agents to work on separate subproblems simultaneously. This not only speeds up problem solving but ensures that tasks are handled by the agents best suited to their specific requirements. By dividing a larger problem into smaller, independent subproblems, MAS can also improve their robustness. Each agent focuses on a specific aspect of the task, and their collective efforts lead to a more comprehensive solution. This division of labor minimizes the risk of failure and enhances the system’s overall efficiency.For example, in a scenario where a system is tasked with compiling a detailed market analysis report, one agent could be responsible for gathering raw data, another for analyzing trends, and a third for drafting the report. By working in parallel, these agents can produce a more thorough and timely analysis than a basic agent.The range of MASWhile basic agents are well-suited for straightforward tasks with clearly defined tools, they often fall short in more complex and dynamic environments. MAS, on the other hand, have a broader range of capabilities, particularly excelling in collaborative and parallel task execution. The designs of MAS allow them to divide labor intelligently and adapt to feedback from both users and the environment. Effective feedback mechanisms make MAS more versatile and useful in complex problem-solving scenarios. Previous post Autonomous Decision-Making with Agentic AI Subscribe to updates from the Dragonscale Newsletter Don't miss out on the latest posts. Sign up now to get new posts sent directly to your inbox. jamie@example.com Subscribe Dragonscale Newsletter © 2025 Privacy | Terms\n",
+ "\n",
+ "==================================================\n",
"\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
+ "# Report on Multi-Agent Architecture for Complex Task Execution\n",
"\n",
+ "## Outline\n",
"\n",
+ "1. **Introduction**\n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks.\n",
+ " - Overview of the evolution of MAS and their applications in various fields.\n",
+ " - Importance of collaboration among agents in achieving task objectives.\n",
+ " - Brief mention of the structure of the report and what each section will cover.\n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
"\n",
+ "2. **Background**\n",
+ " - Historical context of multi-agent systems and their development.\n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation.\n",
+ " - Overview of different types of agents and their roles in MAS.\n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems.\n",
+ " - Summary of existing literature and research on MAS applications.\n",
"\n",
+ "3. **Methodology**\n",
+ " - Description of the design and implementation of a multi-agent architecture.\n",
+ " - Explanation of task decomposition and agent specialization.\n",
+ " - Overview of communication protocols and mechanisms used in MAS.\n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS.\n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
"\n",
+ "4. **Applications**\n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities.\n",
+ " - Detailed examples of successful MAS implementations in industry and research.\n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution.\n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT.\n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
"\n",
+ "5. **Challenges**\n",
+ " - Identification of common challenges faced in the development and deployment of MAS.\n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution.\n",
+ " - Examination of ethical considerations and safety concerns in MAS.\n",
+ " - Overview of technical limitations and scalability issues.\n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
+ "6. **Conclusions**\n",
+ " - Summary of key findings from the report.\n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks.\n",
+ " - Recommendations for future research directions in MAS.\n",
+ " - Final thoughts on the potential impact of MAS on society and technology.\n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
"\n",
+ "## Detailed Content\n",
"\n",
+ "### 1. Introduction\n",
"\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
"\n",
+ "### 2. Background\n",
"\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
"\n",
+ "### 3. Methodology\n",
"\n",
- "Skip to main content\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
"\n",
+ "### 4. Applications\n",
"\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
"\n",
+ "### 5. Challenges\n",
"\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
"\n",
- "In just 3 minutes help us improve arXiv:\n",
- "Annual Global Survey\n",
+ "### 6. Conclusions\n",
"\n",
- "We gratefully acknowledge support from the Simons Foundation, member institutions, and all contributors.\n",
- "Donate\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
+ "## References\n",
"\n",
+ "- Fourney, A., Bansal, G., Mozannar, H., Dibia, V., & Amershi, S. (2024). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Sharifi, N. (2024). Building a Multi-Agent System to Accomplish Complex Tasks. *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Pimentel, S. (2024). Architectures for AI Agents: From Basic to Multi-Agent Systems. *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)\n",
"\n",
+ "---\n",
"\n",
+ "The final document will be saved as a `.md` file with all the content fully populated, including the references section in APA format at the end.\n",
+ "==================================================\n",
"\n",
- "Login\n",
- " \n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mWebScraper\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: WebScraper\n",
"\n",
+ "# Report on Multi-Agent Architecture for Complex Task Execution\n",
"\n",
+ "## Outline\n",
"\n",
+ "1. **Introduction**\n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks.\n",
+ " - Overview of the evolution of MAS and their applications in various fields.\n",
+ " - Importance of collaboration among agents in achieving task objectives.\n",
+ " - Brief mention of the structure of the report and what each section will cover.\n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
"\n",
- "Help | Advanced Search\n",
+ "2. **Background**\n",
+ " - Historical context of multi-agent systems and their development.\n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation.\n",
+ " - Overview of different types of agents and their roles in MAS.\n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems.\n",
+ " - Summary of existing literature and research on MAS applications.\n",
"\n",
+ "3. **Methodology**\n",
+ " - Description of the design and implementation of a multi-agent architecture.\n",
+ " - Explanation of task decomposition and agent specialization.\n",
+ " - Overview of communication protocols and mechanisms used in MAS.\n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS.\n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
"\n",
+ "4. **Applications**\n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities.\n",
+ " - Detailed examples of successful MAS implementations in industry and research.\n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution.\n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT.\n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
"\n",
+ "5. **Challenges**\n",
+ " - Identification of common challenges faced in the development and deployment of MAS.\n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution.\n",
+ " - Examination of ethical considerations and safety concerns in MAS.\n",
+ " - Overview of technical limitations and scalability issues.\n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
- "All fields\n",
- "Title\n",
- "Author\n",
- "Abstract\n",
- "Comments\n",
- "Journal reference\n",
- "ACM classification\n",
- "MSC classification\n",
- "Report number\n",
- "arXiv identifier\n",
- "DOI\n",
- "ORCID\n",
- "arXiv author ID\n",
- "Help pages\n",
- "Full text\n",
+ "6. **Conclusions**\n",
+ " - Summary of key findings from the report.\n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks.\n",
+ " - Recommendations for future research directions in MAS.\n",
+ " - Final thoughts on the potential impact of MAS on society and technology.\n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
"\n",
+ "## Detailed Content\n",
"\n",
+ "### 1. Introduction\n",
"\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
"\n",
- "Search\n",
+ "### 2. Background\n",
"\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
"\n",
+ "### 3. Methodology\n",
"\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
"\n",
+ "### 4. Applications\n",
"\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
"\n",
+ "### 5. Challenges\n",
"\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
"\n",
+ "### 6. Conclusions\n",
"\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
+ "## References\n",
"\n",
+ "- Fourney, A., Bansal, G., Mozannar, H., Dibia, V., & Amershi, S. (2024). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Sharifi, N. (2024). Building a Multi-Agent System to Accomplish Complex Tasks. *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Pimentel, S. (2024). Architectures for AI Agents: From Basic to Multi-Agent Systems. *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)\n",
"\n",
+ "---\n",
"\n",
+ "The final document will be saved as a `.md` file with all the content fully populated, including the references section in APA format at the end.\n",
+ "==================================================\n",
"\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "FINISH\n",
+ "==================================================\n",
"\n",
- "open search\n",
+ "==================================================\n",
+ "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
+ "================================\u001b[1m Human Message \u001b[0m=================================\n",
+ "Name: WebScraper\n",
"\n",
+ "# Report on Multi-Agent Architecture for Complex Task Execution\n",
"\n",
+ "## Outline\n",
"\n",
+ "1. **Introduction**\n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks.\n",
+ " - Overview of the evolution of MAS and their applications in various fields.\n",
+ " - Importance of collaboration among agents in achieving task objectives.\n",
+ " - Brief mention of the structure of the report and what each section will cover.\n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
"\n",
+ "2. **Background**\n",
+ " - Historical context of multi-agent systems and their development.\n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation.\n",
+ " - Overview of different types of agents and their roles in MAS.\n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems.\n",
+ " - Summary of existing literature and research on MAS applications.\n",
"\n",
+ "3. **Methodology**\n",
+ " - Description of the design and implementation of a multi-agent architecture.\n",
+ " - Explanation of task decomposition and agent specialization.\n",
+ " - Overview of communication protocols and mechanisms used in MAS.\n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS.\n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
"\n",
- "GO\n",
+ "4. **Applications**\n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities.\n",
+ " - Detailed examples of successful MAS implementations in industry and research.\n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution.\n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT.\n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
"\n",
+ "5. **Challenges**\n",
+ " - Identification of common challenges faced in the development and deployment of MAS.\n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution.\n",
+ " - Examination of ethical considerations and safety concerns in MAS.\n",
+ " - Overview of technical limitations and scalability issues.\n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
+ "6. **Conclusions**\n",
+ " - Summary of key findings from the report.\n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks.\n",
+ " - Recommendations for future research directions in MAS.\n",
+ " - Final thoughts on the potential impact of MAS on society and technology.\n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
"\n",
- "open navigation menu\n",
+ "## Detailed Content\n",
"\n",
+ "### 1. Introduction\n",
"\n",
- "quick links\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
"\n",
- "Login\n",
- "Help Pages\n",
- "About\n",
+ "### 2. Background\n",
"\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
"\n",
+ "### 3. Methodology\n",
"\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
"\n",
+ "### 4. Applications\n",
"\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
"\n",
+ "### 5. Challenges\n",
"\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
"\n",
+ "### 6. Conclusions\n",
"\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
+ "## References\n",
"\n",
- "arXiv is a free distribution service and an open-access archive for nearly 2.4 million\n",
- " scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics.\n",
- " Materials on this site are not peer-reviewed by arXiv.\n",
- " \n",
+ "- Fourney, A., Bansal, G., Mozannar, H., Dibia, V., & Amershi, S. (2024). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Sharifi, N. (2024). Building a Multi-Agent System to Accomplish Complex Tasks. *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Pimentel, S. (2024). Architectures for AI Agents: From Basic to Multi-Agent Systems. *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)\n",
"\n",
- "Subject search and browse:\n",
- " \n",
- " Physics\n",
- " \n",
- " Mathematics\n",
- " \n",
- " Quantitative Biology\n",
- " \n",
- " Computer Science\n",
- " \n",
- " Quantitative Finance\n",
- " \n",
- " Statistics\n",
- " \n",
- " Electrical Engineering and Systems Science\n",
- " \n",
- " Economics\n",
- " \n",
+ "---\n",
"\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " Physics\n",
- "\n",
- "Astrophysics\n",
- " (astro-ph\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- "Astrophysics of Galaxies; Cosmology and Nongalactic Astrophysics; Earth and Planetary Astrophysics; High Energy Astrophysical Phenomena; Instrumentation and Methods for Astrophysics; Solar and Stellar Astrophysics \n",
- "\n",
- "Condensed Matter\n",
- " (cond-mat\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- "Disordered Systems and Neural Networks; Materials Science; Mesoscale and Nanoscale Physics; Other Condensed Matter; Quantum Gases; Soft Condensed Matter; Statistical Mechanics; Strongly Correlated Electrons; Superconductivity \n",
- "\n",
- "General Relativity and Quantum Cosmology\n",
- " (gr-qc\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "High Energy Physics - Experiment\n",
- " (hep-ex\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "High Energy Physics - Lattice\n",
- " (hep-lat\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "High Energy Physics - Phenomenology\n",
- " (hep-ph\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "High Energy Physics - Theory\n",
- " (hep-th\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "Mathematical Physics\n",
- " (math-ph\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "Nonlinear Sciences\n",
- " (nlin\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes:\n",
- "\n",
- "Adaptation and Self-Organizing Systems; Cellular Automata and Lattice Gases; Chaotic Dynamics; Exactly Solvable and Integrable Systems; Pattern Formation and Solitons \n",
- "\n",
- "Nuclear Experiment\n",
- " (nucl-ex\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "Nuclear Theory\n",
- " (nucl-th\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "Physics\n",
- " (physics\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes:\n",
- "\n",
- "Accelerator Physics; Applied Physics; Atmospheric and Oceanic Physics; Atomic and Molecular Clusters; Atomic Physics; Biological Physics; Chemical Physics; Classical Physics; Computational Physics; Data Analysis, Statistics and Probability; Fluid Dynamics; General Physics; Geophysics; History and Philosophy of Physics; Instrumentation and Detectors; Medical Physics; Optics; Physics and Society; Physics Education; Plasma Physics; Popular Physics; Space Physics \n",
- "\n",
- "Quantum Physics\n",
- " (quant-ph\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- "\n",
- " \n",
- "\n",
- "Mathematics\n",
- "\n",
- "Mathematics\n",
- " (math\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Algebraic Geometry; Algebraic Topology; Analysis of PDEs; Category Theory; Classical Analysis and ODEs; Combinatorics; Commutative Algebra; Complex Variables; Differential Geometry; Dynamical Systems; Functional Analysis; General Mathematics; General Topology; Geometric Topology; Group Theory; History and Overview; Information Theory; K-Theory and Homology; Logic; Mathematical Physics; Metric Geometry; Number Theory; Numerical Analysis; Operator Algebras; Optimization and Control; Probability; Quantum Algebra; Representation Theory; Rings and Algebras; Spectral Theory; Statistics Theory; Symplectic Geometry \n",
- "\n",
- "Computer Science\n",
- "\n",
- "Computing Research Repository\n",
- " (CoRR\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Artificial Intelligence; Computation and Language; Computational Complexity; Computational Engineering, Finance, and Science; Computational Geometry; Computer Science and Game Theory; Computer Vision and Pattern Recognition; Computers and Society; Cryptography and Security; Data Structures and Algorithms; Databases; Digital Libraries; Discrete Mathematics; Distributed, Parallel, and Cluster Computing; Emerging Technologies; Formal Languages and Automata Theory; General Literature; Graphics; Hardware Architecture; Human-Computer Interaction; Information Retrieval; Information Theory; Logic in Computer Science; Machine Learning; Mathematical Software; Multiagent Systems; Multimedia; Networking and Internet Architecture; Neural and Evolutionary Computing; Numerical Analysis; Operating Systems; Other Computer Science; Performance; Programming Languages; Robotics; Social and Information Networks; Software Engineering; Sound; Symbolic Computation; Systems and Control \n",
- "\n",
- "Quantitative Biology\n",
- "\n",
- "Quantitative Biology\n",
- " (q-bio\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Biomolecules; Cell Behavior; Genomics; Molecular Networks; Neurons and Cognition; Other Quantitative Biology; Populations and Evolution; Quantitative Methods; Subcellular Processes; Tissues and Organs \n",
- "\n",
- "Quantitative Finance\n",
- "\n",
- "Quantitative Finance\n",
- " (q-fin\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Computational Finance; Economics; General Finance; Mathematical Finance; Portfolio Management; Pricing of Securities; Risk Management; Statistical Finance; Trading and Market Microstructure \n",
- "\n",
- "Statistics\n",
- "\n",
- "Statistics\n",
- " (stat\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Applications; Computation; Machine Learning; Methodology; Other Statistics; Statistics Theory \n",
- "\n",
- "Electrical Engineering and Systems Science\n",
- "\n",
- "Electrical Engineering and Systems Science\n",
- " (eess\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Audio and Speech Processing; Image and Video Processing; Signal Processing; Systems and Control \n",
- "\n",
- "Economics\n",
- "\n",
- "Economics\n",
- " (econ\n",
- "new,\n",
- " recent,\n",
- " search)\n",
- " includes: (see detailed description):\n",
- "\n",
- "Econometrics; General Economics; Theoretical Economics \n",
- "\n",
- "\n",
- "About arXiv\n",
- "\n",
- "General information\n",
- "How to Submit to arXiv\n",
- "Membership & Giving\n",
- "Who We Are\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "About\n",
- "Help\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "contact arXivClick here to contact arXiv\n",
- " Contact\n",
- "\n",
- "\n",
- "subscribe to arXiv mailingsClick here to subscribe\n",
- " Subscribe\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Copyright\n",
- "Privacy Policy\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Web Accessibility Assistance\n",
- "\n",
- "\n",
- "arXiv Operational Status \n",
- " Get status notifications via\n",
- " email\n",
- " or slack\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks - Microsoft Research\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Skip to main content\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Microsoft\n",
- "\n",
- "\n",
- "\n",
- "Research\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Research\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " Research\n",
- " \n",
- "\n",
- "\n",
- "\n",
- "\n",
- " Home \n",
- "\n",
- "\n",
- "\n",
- "Our research\n",
- "\n",
- "\n",
- "Resources\n",
- "Resources\n",
- "\n",
- "\n",
- "Publications\n",
- "\n",
- "\n",
- "Code & data\n",
- "\n",
- "\n",
- "People\n",
- "\n",
- "\n",
- "Microsoft Research blog\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Research areas: Intelligence\n",
- "Research areas: Intelligence\n",
- "\n",
- "\n",
- "Artificial intelligence\n",
- "\n",
- "\n",
- "Audio & acoustics\n",
- "\n",
- "\n",
- "Computer vision\n",
- "\n",
- "\n",
- "Graphics & multimedia\n",
- "\n",
- "\n",
- "Human-computer interaction\n",
- "\n",
- "\n",
- "Human language technologies\n",
- "\n",
- "\n",
- "Search & information retrieval\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Research areas: Systems\n",
- "Research areas: Systems\n",
- "\n",
- "\n",
- "Data platforms and analytics\n",
- "\n",
- "\n",
- "Hardware & devices\n",
- "\n",
- "\n",
- "Programming languages & software engineering\n",
- "\n",
- "\n",
- "Quantum computing\n",
- "\n",
- "\n",
- "Security, privacy & cryptography\n",
- "\n",
- "\n",
- "Systems & networking\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Research areas: Theory\n",
- "Research areas: Theory\n",
- "\n",
- "\n",
- "Algorithms\n",
- "\n",
- "\n",
- "Mathematics\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Research areas: Other Sciences\n",
- "Research areas: Other Sciences\n",
- "\n",
- "\n",
- "Ecology & environment\n",
- "\n",
- "\n",
- "Economics\n",
- "\n",
- "\n",
- "Medical, health & genomics\n",
- "\n",
- "\n",
- "Social sciences\n",
- "\n",
- "\n",
- "Technology for emerging markets\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "Programs & events\n",
- "\n",
- "\n",
- "Academic programs\n",
- "\n",
- "\n",
- "Events & academic conferences\n",
- "\n",
- "\n",
- "Microsoft Research Forum\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "Connect & learn\n",
- "\n",
- "\n",
- "Behind the Tech podcast\n",
- "\n",
- "\n",
- "Microsoft Research blog\n",
- "\n",
- "\n",
- "Microsoft Research Forum\n",
- "\n",
- "\n",
- "Microsoft Research podcast\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "About\n",
- "\n",
- "\n",
- "People & news\n",
- "People & news\n",
- "\n",
- "\n",
- "About Microsoft Research\n",
- "\n",
- "\n",
- "Careers & internships\n",
- "\n",
- "\n",
- "People\n",
- "\n",
- "\n",
- "Emeritus program\n",
- "\n",
- "\n",
- "News & awards\n",
- "\n",
- "\n",
- "Microsoft Research newsletter\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Microsoft Research Labs\n",
- "Microsoft Research Labs\n",
- "\n",
- "\n",
- "Africa\n",
- "\n",
- "\n",
- "AI for Science\n",
- "\n",
- "\n",
- "AI Frontiers\n",
- "\n",
- "\n",
- "Asia-Pacific\n",
- "\n",
- "\n",
- "Cambridge\n",
- "\n",
- "\n",
- "Health Futures\n",
- "\n",
- "\n",
- "India\n",
- "\n",
- "\n",
- "Montreal\n",
- "\n",
- "\n",
- "New England\n",
- "\n",
- "\n",
- "New York City\n",
- "\n",
- "\n",
- "Redmond\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Other labs\n",
- "Other labs\n",
- "\n",
- "\n",
- "Applied Sciences\n",
- "\n",
- "\n",
- "Mixed Reality & AI - Cambridge\n",
- "\n",
- "\n",
- "Mixed Reality & AI - Zurich\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "More\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Register: Research Forum\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "\n",
- "\n",
- " All Microsoft\n",
- "\n",
- "\n",
- "Global\n",
- "\n",
- "\n",
- "Microsoft Security\n",
- "\n",
- "\n",
- "Azure\n",
- "\n",
- "\n",
- "Dynamics 365\n",
- "\n",
- "\n",
- "Microsoft 365\n",
- "\n",
- "\n",
- "Microsoft Teams\n",
- "\n",
- "\n",
- "Windows 365\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Tech & innovation\n",
- "Tech & innovation\n",
- "\n",
- "\n",
- "Microsoft Cloud\n",
- "\n",
- "\n",
- "AI\n",
- "\n",
- "\n",
- "Azure Space\n",
- "\n",
- "\n",
- "Mixed reality\n",
- "\n",
- "\n",
- "Microsoft HoloLens\n",
- "\n",
- "\n",
- "Microsoft Viva\n",
- "\n",
- "\n",
- "Quantum computing\n",
- "\n",
- "\n",
- "Sustainability\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Industries\n",
- "Industries\n",
- "\n",
- "\n",
- "Education\n",
- "\n",
- "\n",
- "Automotive\n",
- "\n",
- "\n",
- "Financial services\n",
- "\n",
- "\n",
- "Government\n",
- "\n",
- "\n",
- "Healthcare\n",
- "\n",
- "\n",
- "Manufacturing\n",
- "\n",
- "\n",
- "Retail\n",
- "\n",
- "\n",
- "All industries\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Partners\n",
- "Partners\n",
- "\n",
- "\n",
- "Find a partner\n",
- "\n",
- "\n",
- "Become a partner\n",
- "\n",
- "\n",
- "Partner Network\n",
- "\n",
- "\n",
- "Azure Marketplace\n",
- "\n",
- "\n",
- "AppSource\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Resources\n",
- "Resources\n",
- "\n",
- "\n",
- "Blog\n",
- "\n",
- "\n",
- "Microsoft Advertising\n",
- "\n",
- "\n",
- "Developer Center\n",
- "\n",
- "\n",
- "Documentation\n",
- "\n",
- "\n",
- "Events\n",
- "\n",
- "\n",
- "Licensing\n",
- "\n",
- "\n",
- "Microsoft Learn\n",
- "\n",
- "\n",
- "Microsoft Research\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "View Sitemap\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Search\n",
- "Search Microsoft Research\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " No results\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Cancel\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "\n",
- "\n",
- "Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Adam Fourney\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Gagan Bansal\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Hussein Mozannar\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Cheng Tan\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Eduardo Salinas\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Erkang (Eric) Zhu\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Friederike Niedtner\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "Grace Proebsting\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Griffin Bassman\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Jack Gerrits\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Jacob Alber\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Peter Chang\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Ricky Loynd\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Robert West\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Victor Dibia\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Ahmed Awadallah\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Ece Kamar\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Rafah Hosn\n",
- "\n",
- "\t\t\t\t\t\t,\t\t\n",
- "\n",
- "\n",
- "Saleema Amershi\n",
- "\n",
- "\n",
- "\n",
- "MSR-TR-2024-47 | November 2024\n",
- "Published by Microsoft\n",
- "\n",
- "\n",
- "\n",
- "Download BibTex\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Modern AI agents, driven by advances in large foundation models, promise to enhance our productivity and transform our lives by augmenting our knowledge and capabilities. To achieve this vision, AI agents must effectively plan, perform multi-step reasoning and actions, respond to novel observations, and recover from errors, to successfully complete complex tasks across a wide range of scenarios. In this work, we introduce Magentic-One, a high-performing open-source agentic system for solving such tasks. Magentic-One uses a multi-agent architecture where a lead agent, the Orchestrator, plans, tracks progress, and re-plans to recover from errors. Throughout task execution, the Orchestrator also directs other specialized agents to perform tasks as needed, such as operating a web browser, navigating local files, or writing and executing Python code. Our experiments show that Magentic-One achieves statistically competitive performance to the state-of-the-art on three diverse and challenging agentic benchmarks: GAIA, AssistantBench, and WebArena. Notably, Magentic-One achieves these results without modification to core agent capabilities or to how they collaborate, demonstrating progress towards the vision of generalist agentic systems. Moreover, Magentic-One’s modular design allows agents to be added or removed from the team without additional prompt tuning or training, easing development and making it extensible to future scenarios. We provide an open-source implementation of Magentic-One and AutoGenBench, a standalone agentic evaluation tool. AutoGenBench provides built-in controls for repetition and isolation to run agentic benchmarks where actions may produce side-effects, in a rigorous and contained way. Magentic-One, AutoGenBench and detailed empirical performance evaluations of Magentic-One, including ablations and error analysis are available at https://aka.ms/magentic-one (opens in new tab).\n",
- "Opens in a new tab \n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\tPreprint\t\t\t\t\t\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\tGroups\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\t\t\t\t\tHAX Team\t\t\t\t\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\tResearch Areas\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\t\t\t\t\tArtificial intelligence\t\t\t\t\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\tResearch Labs\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\t\t\t\t\tAI Frontiers\t\t\t\t\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\tFollow us:\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "Follow on X\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Like on Facebook\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Follow on LinkedIn\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Subscribe on Youtube\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Follow on Instagram\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Subscribe to our RSS feed\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\t\t\t\t\t\t\tShare this page:\t\t\t\t\t\t\n",
- "\n",
- "\n",
- "\n",
- "Share on X\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Share on Facebook\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Share on LinkedIn\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Share on Reddit\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "What's new\n",
- "\n",
- "\n",
- "Surface Pro\n",
- "\n",
- "\n",
- "Surface Laptop\n",
- "\n",
- "\n",
- "Surface Laptop Studio 2\n",
- "\n",
- "\n",
- "Surface Laptop Go 3\n",
- "\n",
- "\n",
- "Microsoft Copilot\n",
- "\n",
- "\n",
- "AI in Windows\n",
- "\n",
- "\n",
- "Explore Microsoft products\n",
- "\n",
- "\n",
- "Windows 11 apps\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Microsoft Store\n",
- "\n",
- "\n",
- "Account profile\n",
- "\n",
- "\n",
- "Download Center\n",
- "\n",
- "\n",
- "Microsoft Store support\n",
- "\n",
- "\n",
- "Returns\n",
- "\n",
- "\n",
- "Order tracking\n",
- "\n",
- "\n",
- "Certified Refurbished\n",
- "\n",
- "\n",
- "Microsoft Store Promise\n",
- "\n",
- "\n",
- "Flexible Payments\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Education\n",
- "\n",
- "\n",
- "Microsoft in education\n",
- "\n",
- "\n",
- "Devices for education\n",
- "\n",
- "\n",
- "Microsoft Teams for Education\n",
- "\n",
- "\n",
- "Microsoft 365 Education\n",
- "\n",
- "\n",
- "How to buy for your school\n",
- "\n",
- "\n",
- "Educator training and development\n",
- "\n",
- "\n",
- "Deals for students and parents\n",
- "\n",
- "\n",
- "Azure for students\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Business\n",
- "\n",
- "\n",
- "Microsoft Cloud\n",
- "\n",
- "\n",
- "Microsoft Security\n",
- "\n",
- "\n",
- "Dynamics 365\n",
- "\n",
- "\n",
- "Microsoft 365\n",
- "\n",
- "\n",
- "Microsoft Power Platform\n",
- "\n",
- "\n",
- "Microsoft Teams\n",
- "\n",
- "\n",
- "Microsoft 365 Copilot\n",
- "\n",
- "\n",
- "Small Business\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Developer & IT\n",
- "\n",
- "\n",
- "Azure\n",
- "\n",
- "\n",
- "Microsoft Developer\n",
- "\n",
- "\n",
- "Documentation\n",
- "\n",
- "\n",
- "Microsoft Learn\n",
- "\n",
- "\n",
- "Microsoft Tech Community\n",
- "\n",
- "\n",
- "Azure Marketplace\n",
- "\n",
- "\n",
- "AppSource\n",
- "\n",
- "\n",
- "Visual Studio\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Company\n",
- "\n",
- "\n",
- "Careers\n",
- "\n",
- "\n",
- "About Microsoft\n",
- "\n",
- "\n",
- "Company news\n",
- "\n",
- "\n",
- "Privacy at Microsoft\n",
- "\n",
- "\n",
- "Investors\n",
- "\n",
- "\n",
- "Diversity and inclusion\n",
- "\n",
- "\n",
- "Accessibility\n",
- "\n",
- "\n",
- "Sustainability\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Your Privacy Choices Opt-Out Icon\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Your Privacy Choices\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Your Privacy Choices Opt-Out Icon\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Your Privacy Choices\n",
- "\n",
- "\n",
- "\n",
- "Consumer Health Privacy\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "Sitemap\n",
- "\n",
- "\n",
- "Contact Microsoft\n",
- "\n",
- "\n",
- "Privacy \n",
- "\n",
- "\n",
- "Manage cookies\n",
- "\n",
- "\n",
- "Terms of use\n",
- "\n",
- "\n",
- "Trademarks\n",
- "\n",
- "\n",
- "Safety & eco\n",
- "\n",
- "\n",
- "Recycling\n",
- "\n",
- "\n",
- "About our ads\n",
- "\n",
- "© Microsoft 2025\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- " \n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "\n",
- "RAGENTIC: RAG-Enhanced Multi-Agent Architecture | Microsoft Community HubSkip to contentTech CommunityCommunity HubsProductsTopicsBlogsEventsMicrosoft LearnLoungeRegisterSign InMicrosoft Community HubCommunitiesTopicsArtificial Intelligence and Machine LearningAI - Azure AI services BlogBlog PostAI - Azure AI services Blog 6 MIN READRAGENTIC: RAG-Enhanced Multi-Agent Architectureanilarora1983MicrosoftNov 06, 2024Artificial Intelligence (AI) has come a long way from its early days of simple algorithms and rule-based systems. Today, AI systems are not only capable of learning from data but also making decisions and taking actions autonomously. This evolution has given birth to a new paradigm in AI known as \"Agentic AI Systems.\" These systems are designed to act as agents, capable of perceiving their environment, making decisions, and executing actions to achieve specific goals. In this blog, we will explore the concept of Agentic AI Systems, their applications and we will run through a real-life use case that can be solved with RAGENTIC \n",
- " \n",
- "Let's first try to get an understanding of Agentic System and before we delve deeper into the real-life use case. Agentic AI is an AI system that is specifically designed to understand complex workflows and achieve goals autonomously, with little to no human interference. Some key characteristics of agentic AI which include autonomy, Adaptive learning and Reasoning, Language understanding, Workflow Optimization and Multi-agent and System Conversation. The building blocks of each agentic AI includes four main perspectives, Planning, Memory, Tool, and Action. Some of these stages can utilize multiple prompt engineering techniques to solve for a particular stage of a problem like COT, ReACT, Autoplan etc. These agents can also have access to multiple tools, plugins, web search etc. , internally stored documents through RAGs framework to supplement the intrinsic knowledge of the models. This tool usage extends their functionality and enables them to perform a wide range of actions \n",
- " \n",
- "There are several existing frameworks for building Agent AI System, once such is AutoGen. It is an open-source programming framework, developed by Microsoft, for building AI agents and facilitating cooperation among multiple agents to solve tasks. AutoGen aims to streamline the development and research of agentic AI, much like PyTorch does for Deep Learning. It offers features such as agents capable of interacting with each other, facilitates the use of various large language models (LLMs) and tool use support, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns. \n",
- " \n",
- "\n",
- "To substantiate the power of multiagent systems, let's look at a real-life use case when RAGs frameworks which once enhanced with AutoGen framework can result in revolutionize this technology even further. \n",
- "Use case Demonstration: In today's digital age, mobile applications have become a cornerstone for many organizations seeking to offer a seamless shopping experience. However, while attracting customers to download and use an app is a significant milestone, retaining those customers poses a myriad of challenges. The challenges range from \n",
- "\n",
- "Lack of Personalization: Customers expect personalized experiences based on their preferences and past behaviour. Generic recommendations or irrelevant promotions can make users feel undervalued. \n",
- "\n",
- "\n",
- "Data Privacy Concerns: Striking a balance between personalization and privacy is delicate. Overstepping boundaries can lead to mistrust and potential churn. \n",
- "\n",
- "\n",
- "Inconsistent Communication: Regular, relevant communication is key. However, over-communication can overwhelm users, while under-communication can make them feel neglected. \n",
- "\n",
- "\n",
- "Numerous Alternatives: With countless apps available, customers can easily switch if they're dissatisfied. Brands must continuously innovate to stay ahead of competitors. \n",
- "\n",
- "\n",
- "Self-Service Options: Modern customers prefer solving issues on their own. Providing robust self-service options within the app can enhance user satisfaction \n",
- "\n",
- "\n",
- "Ignoring User Feedback: Actively seeking and acting on user feedback can help identify pain points and areas for improvement. Ignoring feedback can lead to unresolved issues and customer churn. \n",
- "\n",
- "The Solution: Agentic AI Systems \n",
- "Agentic AI systems offer a robust way to address customer retention challenges in the digital marketplace. By improving user experience, enhancing personalization, streamlining communication, and ensuring data security, these systems can boost customer satisfaction and loyalty. \n",
- "RAGENTIC, in particular, revolutionizes information processing by seamlessly integrating RAG (Retrieval-Augmented Generation) with autonomous agent networks. At its core lies a Master Agent that orchestrates specialized agents, each enhanced with RAG capabilities for contextual decision-making. The architecture enables efficient data retrieval, processing, and autonomous task execution while maintaining security through its Assistant API layer. This integration produces a robust system that can handle complex workflows while staying grounded in accurate, retrieved information.\n",
- "Key Benefits:\n",
- "\n",
- "Enhanced accuracy through RAG-powered contextual understanding\n",
- "Autonomous operation with minimal human intervention\n",
- "Seamless scalability through modular agent architecture\n",
- "Improved error handling and recovery mechanisms\n",
- "Real-time data processing and decision making\n",
- "\n",
- "Agentic AI System Design :\n",
- "To effectively address these challenges, we can design specific agents for tasks such as Market Basket Analysis, Clickstream Analysis, Inventory Analysis, Customer Analysis, and Product Recommendations. These agents analyze data from various sources, with a Master Agent coordinating the interaction among them and providing a personalized recommendation to the user. \n",
- "Each specialized agent in the system is designed to handle a specific task, utilizing the Assistant API to perform database operations, retrieve data, and process results. Here’s an overview of how these agent's function: \n",
- "1. Market Basket Analysis Agent \n",
- "\n",
- "Task: Analyzes transactional data to identify patterns in items frequently purchased together. \n",
- "\n",
- "\n",
- "Working Mechanism: This agent uses the Assistant API to create custom SQL queries, extracting data from transaction logs and analyzing purchase correlations. \n",
- "\n",
- "\n",
- "Example: The agent queries the database for customer purchase history, applies pattern recognition algorithms, and outputs frequently bundled product sets. \n",
- "\n",
- "2. Clickstream Analysis Agent \n",
- "\n",
- "Task: Examines user navigation paths to track behavior and product engagement. \n",
- "\n",
- "\n",
- "Working Mechanism: This agent leverages the Assistant API to access and parse user session data, constructing insights like heatmaps and user journey paths. \n",
- "\n",
- "\n",
- "Example: Using SQL, the agent pulls detailed clickstream data, which helps identify high-traffic areas and popular product pages. \n",
- "\n",
- "3. Inventory Analysis Agent \n",
- "\n",
- "Task: Monitors stock levels, predicts potential shortages, and optimizes restocking plans. \n",
- "\n",
- "\n",
- "Working Mechanism: The Assistant API enables the agent to access inventory databases and sales records, allowing it to forecast future inventory needs. \n",
- "\n",
- "\n",
- "Example: The agent queries current stock levels and combines this data with historical sales trends to make restocking recommendations. \n",
- "\n",
- "4. Customer Analysis Agent \n",
- "\n",
- "Task: Profiles customer demographics and purchasing preferences to understand user behavior. \n",
- "\n",
- "\n",
- "Working Mechanism: This agent uses the Assistant API to pull data from CRM systems and customer feedback, analyzing user attributes and segmenting customers based on their behavior. \n",
- "\n",
- "\n",
- "Example: By querying customer profiles and purchase records, the agent identifies high-value customer segments for targeted marketing. \n",
- "\n",
- "5. Product Recommendation Agent \n",
- "\n",
- "Task: Generates personalized product recommendations by synthesizing outputs from other agents. \n",
- "\n",
- "\n",
- "Working Mechanism: The agent consolidates insights from the Market Basket Analysis, Clickstream Analysis, and Customer Analysis agents through the Assistant API, generating tailored suggestions. \n",
- "\n",
- "\n",
- "Example: The agent collects data on popular products and customer preferences to recommend items that align with a user’s past behavior and interests. \n",
- "\n",
- " We also have a Master Agent which is orchestrating the whole agentic system \n",
- "\n",
- "Role: The Master Agent manages the coordination between specialized agents, consolidates their outputs, and delivers a unified response to the user. \n",
- "\n",
- "\n",
- "Workflow: \n",
- "\n",
- "Receives user input or system-triggered requests. \n",
- "Assigns tasks to the relevant specialized agents. \n",
- "Collects and integrates the results from each agent. \n",
- "Processes the combined data to create a personalized recommendation. \n",
- "Returns the response to the user. \n",
- "\n",
- "\n",
- "\n",
- "Below is an illustrative Python code snippet demonstrating how these agents could be coordinated using AutoGen: \n",
- " \n",
- "\n",
- "We are also utilizing Assistant API which serves as a bridge between agents and the data they need to access. By leveraging this API, agents can: \n",
- "\n",
- "Write Custom SQL Queries: Agents dynamically construct and execute SQL queries to extract and process specific data based on their tasks. \n",
- "\n",
- "\n",
- "Ensure Data Security: The API includes authentication protocols to safeguard data access, ensuring that only authorized agents can interact with databases. \n",
- "\n",
- "\n",
- "Streamline Data Processing: The API simplifies the connection between agents and various database types, facilitating quick and accurate data retrieval. \n",
- "\n",
- "\n",
- "Enhance Collaboration: Results generated by one agent can be shared with other agents through API calls, promoting collaboration and cohesive decision-making. \n",
- "\n",
- " \n",
- "Now let's look at an example of how the User Journey on the application is personalized with the above agentic system and how the specialized agents work behind the scenes: \n",
- "Scenario: Jane Doe, a frequent shopper, logs into her favorite e-commerce app. She browses through a few categories but is unsure about what to purchase.\n",
- "\n",
- "Agent Interaction: \n",
- "\n",
- "The Clickstream Analysis Agent tracks her navigation path and identifies her interest in specific categories. \n",
- "The Market Basket Analysis Agent runs MBA machine learning engine to checks for products she previously bought together and suggests related items. \n",
- "The Customer Analysis Agent leverages her purchase history to identify trends and preferred product types. \n",
- "The Inventory Analysis Agent ensures that the recommended items are in stock and notifies her of limited availability. \n",
- "The Product Recommendation Agent, with input from the other agents, provides tailored suggestions, highlighting items that fit her taste and are in stock. \n",
- "\n",
- "\n",
- "\n",
- " In the below screenshot the graph shows what agents are invoked by the Master Agent to provide hyper personalized recommendation in ZMOT and the recommendation generated for the user.\n",
- "\n",
- " As last thoughts, the combination of RAG frameworks and Agentic AI can revolutionize user engagement by delivering personalization, improving communication, customer retention and ensuring quick problem resolution. This integration results in better customer satisfaction, loyalty, and ultimately business growth. \n",
- " Updated Nov 06, 2024Version 1.0azure ai servicesazure ai studioazure openai serviceCommentCommentanilarora1983MicrosoftJoined July 24, 2024Send MessageView ProfileAI - Azure AI services Blog Follow this blog board to get notified when there's new activityShare What's new Surface Pro 9 Surface Laptop 5 Surface Studio 2+ Surface Laptop Go 2 Surface Laptop Studio Surface Duo 2 Microsoft 365 Windows 11 apps Microsoft Store Account profile Download Center Microsoft Store support Returns Order tracking Virtual workshops and training Microsoft Store Promise Flexible Payments Education Microsoft in education Devices for education Microsoft Teams for Education Microsoft 365 Education Education consultation appointment Educator training and development Deals for students and parents Azure for students Business Microsoft Cloud Microsoft Security Dynamics 365 Microsoft 365 Microsoft Power Platform Microsoft Teams Microsoft Industry Small Business Developer & IT Azure Developer Center Documentation Microsoft Learn Microsoft Tech Community Azure Marketplace AppSource Visual Studio Company Careers About Microsoft Company news Privacy at Microsoft Investors Diversity and inclusion Accessibility Sustainability California Consumer Privacy Act (CCPA) Opt-Out Icon Your Privacy Choices Sitemap Contact Microsoft Privacy Manage cookies Terms of use Trademarks Safety & eco About our ads © Microsoft 2024 \n",
- "\n",
- "==================================================\n",
+ "The final document will be saved as a `.md` file with all the content fully populated, including the references section in APA format at the end.\n",
+ "==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mWebScraper\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "\n",
- "# Report on Multi-Agent Architecture for Complex Task Performance\n",
- "\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
- "\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
- "\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
- "\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
- "\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
- "\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
- "\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
- "\n",
- "## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
- "\n",
- "---\n",
- "\n",
- "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
- "==================================================\n",
- "\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mWebScraper\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "================================\u001b[1m Human Message \u001b[0m=================================\n",
- "Name: WebScraper\n",
- "\n",
- "# Report on Multi-Agent Architecture for Complex Task Performance\n",
- "\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
- "\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
- "\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
- "\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
- "\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
- "\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
- "\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
- "\n",
- "## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
- "\n",
- "---\n",
- "\n",
- "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
+ "\u001b[1;32mnext\u001b[0m:\n",
+ "PaperWritingTeam\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mResearchTeam\u001b[0m] 🔄\n",
+ "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"\u001b[1;32mnext\u001b[0m:\n",
- "FINISH\n",
+ "DocWriter\n",
"==================================================\n",
"\n",
"==================================================\n",
- "🔄 Node: \u001b[1;36mResearchTeam\u001b[0m 🔄\n",
+ "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "================================\u001b[1m Human Message \u001b[0m=================================\n",
- "Name: WebScraper\n",
- "\n",
- "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
+ "Tool Calls:\n",
+ " write_document (call_ESdSjUCkKRe2U3aSZBc4DZCq)\n",
+ " Call ID: call_ESdSjUCkKRe2U3aSZBc4DZCq\n",
+ " Args:\n",
+ " content: # Report on Multi-Agent Architecture for Complex Task Execution\n",
"\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "## Outline\n",
"\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "1. **Introduction** \n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks. \n",
+ " - Overview of the evolution of MAS and their applications in various fields. \n",
+ " - Importance of collaboration among agents in achieving task objectives. \n",
+ " - Brief mention of the structure of the report and what each section will cover. \n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
"\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "2. **Background** \n",
+ " - Historical context of multi-agent systems and their development. \n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation. \n",
+ " - Overview of different types of agents and their roles in MAS. \n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems. \n",
+ " - Summary of existing literature and research on MAS applications.\n",
"\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "3. **Methodology** \n",
+ " - Description of the design and implementation of a multi-agent architecture. \n",
+ " - Explanation of task decomposition and agent specialization. \n",
+ " - Overview of communication protocols and mechanisms used in MAS. \n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS. \n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
"\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "4. **Applications** \n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities. \n",
+ " - Detailed examples of successful MAS implementations in industry and research. \n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution. \n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT. \n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
"\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "5. **Challenges** \n",
+ " - Identification of common challenges faced in the development and deployment of MAS. \n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution. \n",
+ " - Examination of ethical considerations and safety concerns in MAS. \n",
+ " - Overview of technical limitations and scalability issues. \n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "6. **Conclusions** \n",
+ " - Summary of key findings from the report. \n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks. \n",
+ " - Recommendations for future research directions in MAS. \n",
+ " - Final thoughts on the potential impact of MAS on society and technology. \n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
"\n",
- "## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "## Detailed Content\n",
"\n",
- "---\n",
+ "### 1. Introduction \n",
"\n",
- "The above content is structured in a markdown format suitable for an Arxiv paper. The sections are fully populated with detailed explanations, and references are provided in APA format. Diagrams and charts can be created separately to illustrate complex ideas, such as agent relationships and task processes, as needed. The final document can be saved as a .md file.\n",
- "==================================================\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\u001b[1;32mnext\u001b[0m:\n",
- "PaperWritingTeam\n",
- "==================================================\n",
+ "### 2. Background \n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36mSupervisor\u001b[0m in [\u001b[1;33mPaperWritingTeam\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "\u001b[1;32mnext\u001b[0m:\n",
- "DocWriter\n",
- "==================================================\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
"\n",
- "==================================================\n",
- "🔄 Node: \u001b[1;36magent\u001b[0m in [\u001b[1;33mDocWriter\u001b[0m] 🔄\n",
- "- - - - - - - - - - - - - - - - - - - - - - - - - \n",
- "==================================\u001b[1m Ai Message \u001b[0m==================================\n",
- "Tool Calls:\n",
- " write_document (call_YHdDjHG0twLnx5i9eXuPOPbK)\n",
- " Call ID: call_YHdDjHG0twLnx5i9eXuPOPbK\n",
- " Args:\n",
- " content: # Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "### 3. Methodology \n",
"\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
"\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "### 4. Applications \n",
"\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
"\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "### 5. Challenges \n",
"\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
"\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "### 6. Conclusions \n",
"\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
"## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)\n",
+ "\n",
+ "- Fourney, A., Bansal, G., Mozannar, H., Dibia, V., & Amershi, S. (2024). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Sharifi, N. (2024). Building a Multi-Agent System to Accomplish Complex Tasks. *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Pimentel, S. (2024). Architectures for AI Agents: From Basic to Multi-Agent Systems. *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)\n",
" file_name: multi_agent_architecture_report.md\n",
"==================================================\n",
"\n",
@@ -4018,7 +2101,7 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "The report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as a markdown file titled **multi_agent_architecture_report.md**. If you need any further modifications or additional content, feel free to ask!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_report.md`. You can access it for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4027,7 +2110,7 @@
"================================\u001b[1m Human Message \u001b[0m=================================\n",
"Name: DocWriter\n",
"\n",
- "The report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as a markdown file titled **multi_agent_architecture_report.md**. If you need any further modifications or additional content, feel free to ask!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_report.md`. You can access it for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4042,10 +2125,10 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"Tool Calls:\n",
- " create_outline (call_hZRhn0zbVWUiqWMslllM325i)\n",
- " Call ID: call_hZRhn0zbVWUiqWMslllM325i\n",
+ " create_outline (call_FlgpKOmjdTSg3IEi43mJhB2Z)\n",
+ " Call ID: call_FlgpKOmjdTSg3IEi43mJhB2Z\n",
" Args:\n",
- " points: ['Abstract', '1. Introduction', '2. Background', ' 2.1 Definition of Multi-Agent Systems', ' 2.2 Key Theories Underpinning MAS', ' 2.3 Evolution of MAS', '3. Methodology', ' 3.1 Task Decomposition', ' 3.2 Agent Communication', ' 3.3 Learning Algorithms', ' 3.4 Evaluation Metrics', '4. Applications', ' 4.1 Robotics', ' 4.2 Healthcare', ' 4.3 Financial Markets', ' 4.4 Smart Cities', '5. Challenges', ' 5.1 Agent Coordination', ' 5.2 Ethical Considerations', ' 5.3 Security Concerns', ' 5.4 Integration with Existing Technologies', '6. Conclusions', 'References']\n",
+ " points: ['Introduction', 'Background', 'Methodology', 'Applications', 'Challenges', 'Conclusions', 'References']\n",
" file_name: multi_agent_architecture_outline.md\n",
"==================================================\n",
"\n",
@@ -4063,7 +2146,7 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "The outline for the report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as **multi_agent_architecture_outline.md**. If you need any further modifications or additional content, feel free to ask!\n",
+ "The outline for the report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_outline.md`. You can access it for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4072,7 +2155,7 @@
"================================\u001b[1m Human Message \u001b[0m=================================\n",
"Name: NoteTaker\n",
"\n",
- "The outline for the report on \"Multi-Agent Architecture for Complex Task Performance\" has been successfully created and saved as **multi_agent_architecture_outline.md**. If you need any further modifications or additional content, feel free to ask!\n",
+ "The outline for the report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_outline.md`. You can access it for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4087,7 +2170,7 @@
"- - - - - - - - - - - - - - - - - - - - - - - - - \n",
"==================================\u001b[1m Ai Message \u001b[0m==================================\n",
"\n",
- "It seems that both the report and the outline for \"Multi-Agent Architecture for Complex Task Performance\" have been successfully created and saved. If you need any further modifications, additional content, or assistance with anything else, please let me know!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_report.md`. Additionally, the outline for the report has been saved as `multi_agent_architecture_outline.md`. You can access both documents for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4096,7 +2179,7 @@
"================================\u001b[1m Human Message \u001b[0m=================================\n",
"Name: ChartGenerator\n",
"\n",
- "It seems that both the report and the outline for \"Multi-Agent Architecture for Complex Task Performance\" have been successfully created and saved. If you need any further modifications, additional content, or assistance with anything else, please let me know!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_report.md`. Additionally, the outline for the report has been saved as `multi_agent_architecture_outline.md`. You can access both documents for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4112,7 +2195,7 @@
"================================\u001b[1m Human Message \u001b[0m=================================\n",
"Name: ChartGenerator\n",
"\n",
- "It seems that both the report and the outline for \"Multi-Agent Architecture for Complex Task Performance\" have been successfully created and saved. If you need any further modifications, additional content, or assistance with anything else, please let me know!\n",
+ "The report on \"Multi-Agent Architecture for Complex Task Execution\" has been successfully created and saved as `multi_agent_architecture_report.md`. Additionally, the outline for the report has been saved as `multi_agent_architecture_outline.md`. You can access both documents for further review or modifications.\n",
"==================================================\n",
"\n",
"==================================================\n",
@@ -4154,39 +2237,96 @@
},
{
"cell_type": "code",
- "execution_count": 63,
- "metadata": {},
+ "execution_count": 28,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 1000
+ },
+ "id": "8fQztrh7dU2c",
+ "outputId": "cefd14b0-94e8-45a4-d438-bf1f62b67b05"
+ },
"outputs": [
{
"data": {
"text/markdown": [
- "# Report on Multi-Agent Architecture for Complex Task Performance\n",
+ "# Report on Multi-Agent Architecture for Complex Task Execution\n",
+ "\n",
+ "## Outline\n",
+ "\n",
+ "1. **Introduction** \n",
+ " - Definition of multi-agent systems (MAS) and their significance in solving complex tasks. \n",
+ " - Overview of the evolution of MAS and their applications in various fields. \n",
+ " - Importance of collaboration among agents in achieving task objectives. \n",
+ " - Brief mention of the structure of the report and what each section will cover. \n",
+ " - Statement of the report's objectives and the relevance of the topic in current research.\n",
+ "\n",
+ "2. **Background** \n",
+ " - Historical context of multi-agent systems and their development. \n",
+ " - Key concepts in MAS, including agent autonomy, communication, and cooperation. \n",
+ " - Overview of different types of agents and their roles in MAS. \n",
+ " - Discussion of the theoretical frameworks that underpin MAS, such as game theory and distributed systems. \n",
+ " - Summary of existing literature and research on MAS applications.\n",
+ "\n",
+ "3. **Methodology** \n",
+ " - Description of the design and implementation of a multi-agent architecture. \n",
+ " - Explanation of task decomposition and agent specialization. \n",
+ " - Overview of communication protocols and mechanisms used in MAS. \n",
+ " - Discussion of evaluation metrics for assessing the performance of MAS. \n",
+ " - Case studies illustrating the application of the methodology in real-world scenarios.\n",
+ "\n",
+ "4. **Applications** \n",
+ " - Exploration of various domains where MAS can be applied, such as robotics, healthcare, and smart cities. \n",
+ " - Detailed examples of successful MAS implementations in industry and research. \n",
+ " - Discussion of how MAS can enhance efficiency and effectiveness in complex task execution. \n",
+ " - Analysis of the role of MAS in emerging technologies, such as AI and IoT. \n",
+ " - Future trends and potential areas for further research in MAS applications.\n",
"\n",
- "## Abstract\n",
- "This report explores the utilization of multi-agent architecture in performing complex tasks. It provides a comprehensive overview of the theoretical foundations, methodologies, applications, and challenges associated with multi-agent systems (MAS). The report also includes case studies and examples that illustrate the practical applications of MAS in real-world scenarios. By examining the capabilities and limitations of multi-agent systems, this report aims to contribute to the understanding of how these systems can be effectively employed to solve intricate problems across various domains.\n",
+ "5. **Challenges** \n",
+ " - Identification of common challenges faced in the development and deployment of MAS. \n",
+ " - Discussion of issues related to agent coordination, communication, and conflict resolution. \n",
+ " - Examination of ethical considerations and safety concerns in MAS. \n",
+ " - Overview of technical limitations and scalability issues. \n",
+ " - Strategies for overcoming these challenges and improving MAS performance.\n",
"\n",
- "## 1. Introduction\n",
- "Multi-agent architecture refers to a system composed of multiple interacting intelligent agents, each capable of autonomous decision-making. The increasing complexity of tasks in various fields, such as robotics, healthcare, and finance, necessitates the development of sophisticated systems that can operate collaboratively. This report aims to elucidate the principles of multi-agent systems and their application in addressing complex tasks. The significance of this study lies in its potential to enhance efficiency, adaptability, and problem-solving capabilities in dynamic environments. By leveraging the strengths of individual agents, multi-agent systems can achieve outcomes that surpass the capabilities of single-agent systems.\n",
+ "6. **Conclusions** \n",
+ " - Summary of key findings from the report. \n",
+ " - Reflection on the significance of multi-agent architecture in solving complex tasks. \n",
+ " - Recommendations for future research directions in MAS. \n",
+ " - Final thoughts on the potential impact of MAS on society and technology. \n",
+ " - Call to action for researchers and practitioners to explore MAS further.\n",
"\n",
- "## 2. Background\n",
- "The concept of multi-agent systems has its roots in artificial intelligence and distributed computing. Agents in a multi-agent system can be defined as autonomous entities that perceive their environment and act upon it to achieve specific goals. The interactions among agents can be cooperative, competitive, or a mix of both, depending on the task at hand. Key theories underpinning MAS include game theory, distributed problem-solving, and social choice theory. These theories provide insights into how agents can negotiate, share information, and coordinate their actions to optimize performance. The evolution of MAS has been driven by advancements in machine learning, communication protocols, and computational power, enabling the development of more sophisticated and capable agents.\n",
+ "## Detailed Content\n",
"\n",
- "## 3. Methodology\n",
- "The methodology for implementing a multi-agent architecture involves several key steps. First, the task must be decomposed into manageable subtasks that can be assigned to individual agents. This decomposition is often guided by the capabilities and expertise of the agents involved. Next, agents must be designed to communicate effectively, sharing information and coordinating their actions to achieve the overall goal. The use of protocols such as Contract Net Protocol (CNP) and Auction-based mechanisms can facilitate this communication. Additionally, agents must be equipped with learning algorithms that allow them to adapt to changing environments and improve their performance over time. Finally, the system must be evaluated to assess its effectiveness in solving the complex task, often using metrics such as completion time, resource utilization, and accuracy.\n",
+ "### 1. Introduction \n",
"\n",
- "## 4. Applications\n",
- "Multi-agent systems have a wide range of applications across various domains. In robotics, MAS can be employed for tasks such as coordinated exploration, where multiple robots work together to map an unknown environment. In healthcare, agents can assist in patient management by coordinating care among different providers and ensuring that patients receive timely interventions. Financial markets also benefit from MAS, where agents can analyze market data and execute trades based on collective intelligence. Furthermore, in smart cities, multi-agent systems can optimize traffic flow and energy consumption by coordinating the actions of various infrastructure components. Each of these applications demonstrates the versatility and effectiveness of multi-agent architecture in addressing complex challenges.\n",
+ "Multi-agent systems (MAS) are defined as systems composed of multiple interacting intelligent agents, capable of autonomous decision-making and task execution. The significance of MAS lies in their ability to collaboratively solve complex tasks that are beyond the capabilities of individual agents. Over the years, MAS have evolved from simple rule-based systems to sophisticated architectures that leverage advanced algorithms and machine learning techniques. The collaboration among agents is crucial, as it allows for the distribution of tasks, parallel processing, and the pooling of resources and knowledge. This report aims to provide a comprehensive overview of multi-agent architecture, focusing on its methodology, applications, challenges, and future directions.\n",
"\n",
- "## 5. Challenges\n",
- "Despite the advantages of multi-agent systems, several challenges must be addressed to ensure their successful implementation. One significant challenge is the issue of agent coordination, particularly in dynamic environments where tasks and conditions may change rapidly. Ensuring that agents can adapt to these changes while maintaining effective communication is crucial. Additionally, the complexity of designing agents that can operate autonomously while adhering to ethical guidelines poses another challenge. Security concerns, such as the potential for agents to be manipulated or compromised, must also be considered. Finally, the integration of multi-agent systems with existing technologies and infrastructures can present technical hurdles that need to be overcome.\n",
+ "### 2. Background \n",
"\n",
- "## 6. Conclusions\n",
- "In conclusion, multi-agent architecture offers a powerful framework for addressing complex tasks across various domains. By leveraging the capabilities of multiple autonomous agents, systems can achieve greater efficiency, adaptability, and problem-solving capabilities. However, the successful implementation of MAS requires careful consideration of coordination, ethical guidelines, security, and integration challenges. Future research should focus on developing robust frameworks and methodologies that enhance the effectiveness of multi-agent systems in real-world applications. As technology continues to evolve, the potential for multi-agent systems to revolutionize complex task performance remains significant.\n",
+ "The historical context of multi-agent systems dates back to the early days of artificial intelligence, where researchers began exploring the potential of autonomous agents. Key concepts in MAS include agent autonomy, which refers to the ability of agents to operate independently, and communication, which is essential for coordination among agents. Different types of agents, such as reactive, deliberative, and hybrid agents, play distinct roles in MAS, contributing to their overall functionality. Theoretical frameworks, including game theory and distributed systems, provide the foundation for understanding agent interactions and decision-making processes. A review of existing literature reveals a growing interest in MAS applications across various domains, highlighting their versatility and effectiveness.\n",
+ "\n",
+ "### 3. Methodology \n",
+ "\n",
+ "The design and implementation of a multi-agent architecture involve several key steps, including task decomposition, where complex tasks are broken down into manageable subtasks assigned to specialized agents. Communication protocols, such as publish-subscribe mechanisms, facilitate information exchange among agents, ensuring that they remain informed about relevant developments. Evaluation metrics, such as task completion time and resource utilization, are essential for assessing the performance of MAS. Case studies, such as the deployment of MAS in disaster response scenarios, illustrate the practical application of these methodologies, showcasing how agents can work together to achieve common goals.\n",
+ "\n",
+ "### 4. Applications \n",
+ "\n",
+ "Multi-agent systems have found applications in diverse fields, including robotics, where they enable coordinated movements of robotic swarms, and healthcare, where they assist in patient monitoring and treatment planning. Successful implementations, such as autonomous vehicles and smart grid management, demonstrate the potential of MAS to enhance efficiency and effectiveness in complex task execution. The integration of MAS with emerging technologies, such as the Internet of Things (IoT) and artificial intelligence (AI), opens new avenues for innovation and problem-solving. Future trends indicate a growing reliance on MAS in various sectors, driven by the need for intelligent and adaptive systems.\n",
+ "\n",
+ "### 5. Challenges \n",
+ "\n",
+ "Despite their advantages, the development and deployment of multi-agent systems face several challenges. Coordination among agents can be difficult, especially in dynamic environments where tasks and conditions change rapidly. Communication issues, such as information overload and misinterpretation, can hinder agent collaboration. Ethical considerations, including privacy and security concerns, must be addressed to ensure responsible use of MAS. Technical limitations, such as scalability and computational complexity, pose additional hurdles. Strategies for overcoming these challenges include the development of robust algorithms, improved communication protocols, and ethical guidelines for agent behavior.\n",
+ "\n",
+ "### 6. Conclusions \n",
+ "\n",
+ "In conclusion, multi-agent architecture represents a powerful approach to solving complex tasks through collaboration and autonomy. The findings of this report highlight the significance of MAS in various applications and the potential for future research to address existing challenges. As technology continues to evolve, the impact of MAS on society and industry will likely grow, necessitating further exploration and innovation in this field. Researchers and practitioners are encouraged to delve deeper into the capabilities of MAS, fostering advancements that can lead to more efficient and effective solutions to real-world problems.\n",
"\n",
"## References\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. Microsoft Research. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/publication/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
- "- Fourney, A., et al. (2023). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. arXiv. Retrieved from [arXiv](https://arxiv.org/abs/2411.04468)\n",
- "- Microsoft Tech Community. (2023). Ragentic: RAG Enhanced Multi-Agent Architecture. Retrieved from [Microsoft Tech Community](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/ragentic-rag-enhanced-multi-agent-architecture/4287132)"
+ "\n",
+ "- Fourney, A., Bansal, G., Mozannar, H., Dibia, V., & Amershi, S. (2024). Magentic-One: A Generalist Multi-Agent System for Solving Complex Tasks. *Microsoft Research*. Retrieved from [Microsoft Research](https://www.microsoft.com/en-us/research/articles/magentic-one-a-generalist-multi-agent-system-for-solving-complex-tasks/)\n",
+ "- Sharifi, N. (2024). Building a Multi-Agent System to Accomplish Complex Tasks. *Towards AI*. Retrieved from [Towards AI](https://pub.towardsai.net/building-a-multi-agent-system-to-accomplish-complex-tasks-812aeedda4eb)\n",
+ "- Pimentel, S. (2024). Architectures for AI Agents: From Basic to Multi-Agent Systems. *DragonScale AI Blog*. Retrieved from [DragonScale AI](https://blog.dragonscale.ai/architectures-for-ai-agents/)"
],
"text/plain": [
""
@@ -4206,6 +2346,15 @@
"with open(md_file, \"r\", encoding=\"utf-8\") as f:\n",
" display(Markdown(f.read()))"
]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "e3EKA-dUfPjR"
+ },
+ "outputs": [],
+ "source": []
}
],
"metadata": {