
What is an Expert System?

An expert system is a computer program designed to solve complex problems and to provide decision-making ability like a human expert. It does this by extracting knowledge from its knowledge base and applying reasoning and inference rules to the user's queries.

Expert systems are a part of AI; the first ones were developed in the 1970s and were among the first successful applications of artificial intelligence. An expert system solves the most complex issues as an expert would, by drawing on the knowledge stored in its knowledge base. It supports decision-making for complex problems using both facts and heuristics, like a human expert. It is called an expert system because it contains the expert knowledge of a specific domain and can solve complex problems of that particular domain. These systems are designed for a specific domain, such as medicine or science.

The performance of an expert system depends on the expert knowledge stored in its knowledge base: the more knowledge the KB contains, the better the system performs. A commonly cited example of an ES is the spelling-error suggestions offered while typing in the Google search box.

Below is the block diagram that represents the working of an expert system:

(Block diagram: the user interacts through the user interface, which passes queries to the inference engine; the inference engine consults the knowledge base and returns the conclusion.)

Note: It is important to remember that an expert system is not used to replace human experts; instead, it is used to assist a human in making complex decisions. These systems do not have human capabilities of thinking and work on the basis of the knowledge base of a particular domain.

Below are some popular examples of the Expert System:


o DENDRAL: An artificial intelligence project built as a chemical analysis expert system. It was used in organic chemistry to identify unknown organic molecules from their mass spectra and a knowledge base of chemistry.
o MYCIN: One of the earliest backward-chaining expert systems, designed to identify the bacteria causing infections such as bacteraemia and meningitis. It was also used to recommend antibiotics and to diagnose blood-clotting diseases.
o PXDES: An expert system used to determine the type and level of lung cancer. To determine the disease, it takes an image of the upper body, which appears as a shadow; this shadow identifies the type and degree of harm.
o CaDeT: The CaDet expert system is a diagnostic support system that can detect cancer at early stages.

Characteristics of Expert System

o High performance: The expert system solves any type of complex problem in a specific domain with high efficiency and accuracy.
o Understandable: It responds in a way that the user can easily understand. It can take input in human language and provides the output in the same way.
o Reliable: It is highly reliable, generating efficient and accurate output.
o Highly responsive: An ES provides the result for any complex query within a very short period of time.

Components of Expert System


An expert system mainly consists of three components:

o User Interface
o Inference Engine
o Knowledge Base
1. User Interface

With the help of the user interface, the expert system interacts with the user, takes queries as input in a readable format, and passes them to the inference engine. After getting the response from the inference engine, it displays the output to the user. In other words, it is the interface that helps a non-expert user communicate with the expert system to find a solution.

2. Inference Engine (Rules Engine)

o The inference engine is known as the brain of the expert system, as it is the main processing unit of the system. It applies inference rules to the knowledge base to derive a conclusion or deduce new information, helping to derive an error-free solution to the queries asked by the user.
o With the help of the inference engine, the system extracts knowledge from the knowledge base.
o There are two types of inference engine:
o Deterministic inference engine: The conclusions drawn from this type of inference engine are assumed to be true. It is based on facts and rules.
o Probabilistic inference engine: This type of inference engine produces conclusions that carry uncertainty and are based on probability.
The inference engine uses the following modes to derive solutions:

o Forward chaining: It starts from the known facts and rules and applies the inference rules to add their conclusions to the known facts (a small forward-chaining sketch follows this list).
o Backward chaining: It is a backward reasoning method that starts from the goal and works backward to prove the known facts.
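
To make the forward-chaining mode concrete, here is a minimal Python sketch of a rule engine. The facts and rules are invented for illustration; a real expert system shell would offer much more (explanations, uncertainty handling, and so on).

```python
# A minimal forward-chaining sketch (illustrative only; facts and rules invented).
# Rules are (premises, conclusion) pairs; keep firing rules whose premises are
# all known until no new facts can be derived.

rules = [
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles", "recent_travel"}, "order_blood_test"),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)      # derived a new fact
                changed = True
    return facts

print(forward_chain({"has_fever", "has_rash", "recent_travel"}, rules))
# includes 'suspect_measles' and 'order_blood_test'
```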

3. Knowledge Base

o The knowledge base is a type of storage that stores knowledge acquired from different experts of a particular domain. It can be considered a large store of knowledge: the larger and more accurate the knowledge base, the more precise the expert system.
o It is similar to a database in that it contains the information and rules of a particular domain or subject.
o One can also view the knowledge base as a collection of objects and their attributes (see the sketch below). For example, a lion is an object, and its attributes are that it is a mammal, it is not a domestic animal, and so on.
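
As a rough illustration of this objects-and-attributes view, a knowledge base can be sketched as a nested dictionary in Python; the animals and attributes below are invented for the example.

```python
# A sketch of the "objects and their attributes" view of a knowledge base.
knowledge_base = {
    "lion": {"is_mammal": True, "is_domestic": False},
    "dog":  {"is_mammal": True, "is_domestic": True},
}

# A query answers "does this object have this attribute?"
def has_attribute(obj, attribute):
    return knowledge_base.get(obj, {}).get(attribute, False)

print(has_attribute("lion", "is_domestic"))  # False
```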

Components of Knowledge Base

o Factual knowledge: Knowledge that is based on facts and accepted by knowledge engineers.
o Heuristic knowledge: Knowledge that is based on practice, the ability to guess, evaluation, and experience.

Knowledge representation: It is used to formalize the knowledge stored in the knowledge base, typically as if-then rules.

Knowledge acquisition: It is the process of extracting, organizing, and structuring the domain knowledge, specifying the rules used to acquire knowledge from various experts, and storing that knowledge in the knowledge base.

Development of Expert System

Here, we will explain the working of an expert system by taking MYCIN as an example. Below are the steps to build MYCIN:

o Firstly, the ES must be fed expert knowledge. In the case of MYCIN, human experts specialized in the medical field of bacterial infections provide information about the causes, symptoms, and other knowledge in that domain.
o Once the KB of MYCIN is updated, the doctor tests it by giving it a new problem: identify the presence of bacteria from the details of a patient, including symptoms, current condition, and medical history.
o The ES asks the patient to fill in a questionnaire to gather general information, such as gender and age.
o Once the system has collected all the information, it finds a solution to the problem by applying if-then rules through the inference engine, using the facts stored in the KB.
o In the end, it provides a response to the patient through the user interface (a toy sketch of such a consultation follows this list).
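
Below is a toy sketch of this MYCIN-style consultation in Python. The rules, symptoms, and advice are invented and medically meaningless; they only illustrate how a questionnaire's answers feed if-then rules.

```python
# A toy MYCIN-style consultation (illustrative only; rules are invented).
# The questionnaire step is simulated by a dict of patient answers.

RULES = [
    # (condition on the collected facts, advice)
    (lambda f: f["fever"] and f["stiff_neck"], "possible meningitis: run further tests"),
    (lambda f: f["fever"] and not f["stiff_neck"], "possible bacteraemia: run blood culture"),
]

def consult(facts):
    """Apply if-then rules from the KB to the collected patient facts."""
    for condition, advice in RULES:
        if condition(facts):
            return advice
    return "no rule fired: refer to a human expert"

patient = {"fever": True, "stiff_neck": False}   # answers from the questionnaire
print(consult(patient))   # possible bacteraemia: run blood culture
```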

Participants in the development of Expert System

There are three primary participants in the building of an expert system:

1. Expert: The success of an ES depends largely on the knowledge provided by human experts. These experts are people specialized in the specific domain.
2. Knowledge engineer: The knowledge engineer is the person who gathers knowledge from the domain experts and then codifies that knowledge into the system according to the chosen formalism.
3. End user: A particular person, or a group of people, who may not be experts and who use the expert system to get solutions or advice for their complex queries.

Why Expert System?

Before using any technology, we should understand why to use it, and the same applies to the ES. Although we have human experts in every field, why develop a computer-based system? The points below describe the need for the ES:

1. No memory limitations: It can store as much data as required and recall it at the time of application, whereas human experts have limits on what they can memorize at all times.
2. High efficiency: If the knowledge base is updated with correct knowledge, it provides highly efficient output, which may not be possible for a human.
3. Expertise in a domain: There are many human experts in each domain, all with different skills and different experience, so it is not easy to get a single final answer to a query. But if we put the knowledge gained from human experts into an expert system, it provides an efficient output by combining all the facts and knowledge.
4. Not affected by emotions: These systems are not affected by human emotions such as fatigue, anger, depression, or anxiety, so their performance remains constant.
5. High security: These systems provide high security when resolving any query.
6. Considers all the facts: To respond to any query, it checks and considers all the available facts and provides the result accordingly, whereas a human expert may overlook some facts for various reasons.
7. Regular updates improve performance: If there is an issue in the result provided by an expert system, we can improve the system's performance by updating the knowledge base.

Capabilities of the Expert System


Below are some capabilities of an Expert System:

o Advising: It is capable of advising a human on queries in the domain of the particular ES.
o Decision-making: It provides decision-making capabilities in its domain, such as financial decisions or decisions in medical science.
o Demonstrating a device: It is capable of demonstrating a new product, including its features, specifications, and how to use it.
o Problem-solving: It has problem-solving capabilities.
o Explaining a problem: It is also capable of providing a detailed description of an input problem.
o Interpreting the input: It is capable of interpreting the input given by the user.
o Predicting results: It can be used to predict a result.
o Diagnosis: An ES designed for the medical field is capable of diagnosing a disease without requiring multiple external components, as it already contains various built-in medical tools.

Advantages of Expert System

o These systems are highly reproducible.
o They can be used in risky places where human presence is not safe.
o Error possibilities are low if the KB contains correct knowledge.
o The performance of these systems remains steady, as it is not affected by emotions, tension, or fatigue.
o They respond to queries at very high speed.

Limitations of Expert System

o The response of the expert system may be wrong if the knowledge base contains wrong information.
o Unlike a human being, it cannot produce creative output for different scenarios.
o Its maintenance and development costs are very high.
o Knowledge acquisition for its design is very difficult.
o Each domain requires its own specific ES, which is one of the big limitations.
o It cannot learn by itself and hence requires manual updates.

Applications of Expert System

o In the design and manufacturing domain
It can be broadly used for designing and manufacturing physical devices such as camera lenses and automobiles.
o In the knowledge domain
These systems are primarily used for publishing relevant knowledge to users. Two popular expert systems in this domain are advisory systems, such as tax advisors.
o In the finance domain
In the finance industry, it is used to detect any type of possible fraud or suspicious activity and to advise bankers on whether they should provide loans to a business.
o In the diagnosis and troubleshooting of devices
The ES is used in medical diagnosis, which was the first area where these systems were applied.
o Planning and scheduling
Expert systems can also be used for planning and scheduling particular tasks to achieve a goal.

Expert Systems
An expert system is a piece of AI software that simulates the behaviour and judgement of a human, or of an organization, with expertise in a particular domain. It does this by acquiring relevant knowledge from its knowledge base and interpreting it according to the user's problem. The data in the knowledge base is added by humans who are experts in the domain, and the software is used by non-expert users to obtain information. It is widely used in many areas such as medical diagnosis, accounting, coding, and games.
An expert system is AI software that uses knowledge stored in a knowledge base to solve problems that would usually require a human expert, thus preserving a human expert's knowledge in its knowledge base. It can advise users as well as explain to them how it reached a particular conclusion or piece of advice. Knowledge engineering is the term used for the process of building an expert system, and its practitioners are called knowledge engineers. The primary role of a knowledge engineer is to make sure that the computer possesses all the knowledge required to solve a problem. The knowledge engineer must choose one or more forms in which to represent the required knowledge as a symbolic pattern in the memory of the computer.
Example: There are many examples of expert systems. Some of them are given below:
 MYCIN –
One of the earliest expert systems based on backward chaining. It could identify various bacteria that cause severe infections and could also recommend drugs based on the person's weight.
 DENDRAL –
An artificial-intelligence-based expert system used for chemical analysis. It used a substance's spectrographic data to predict its molecular structure.
 R1/XCON –
It could select specific components to configure a computer system according to the user's wishes.
 PXDES –
It could easily determine the type and degree of lung cancer in a patient based on the data.
 CaDet –
A clinical support system that could identify cancer in its early stages in patients.
 DXplain –
A clinical support system that could suggest a variety of diseases based on the doctor's findings.
Components of an Expert System:

(Figure: Architecture of an Expert System)

 Knowledge Base –
The knowledge base represents facts and rules. It consists of knowledge in a particular domain as well as rules to solve problems, procedures, and intrinsic data relevant to the domain.
 Inference Engine –
The function of the inference engine is to fetch the relevant knowledge from the knowledge base, interpret it, and find a solution relevant to the user's problem. The inference engine acquires the rules from its knowledge base and applies them to the known facts to infer new facts. Inference engines can also include explanation and debugging abilities.
 Knowledge Acquisition and Learning Module –
The function of this component is to allow the expert system to acquire more and more knowledge from various sources and store it in the knowledge base.
 User Interface –
This module makes it possible for a non-expert user to interact with the expert system and find a solution to the problem.
 Explanation Module –
This module helps the expert system give the user an explanation of how it reached a particular conclusion.
The Inference Engine generally uses two strategies for acquiring knowledge from
the Knowledge Base, namely –
 Forward Chaining
 Backward Chaining
Forward Chaining –
Forward chaining is the strategy an expert system uses to answer the question "What will happen next?" It is mostly used for tasks such as deriving a conclusion, result, or effect. Example: predicting the movement of the share market.

(Figure: Forward Chaining)

Backward Chaining –
Backward chaining is the strategy an expert system uses to answer the question "Why did this happen?" It is mostly used to find the root cause or reason behind something, given what has already happened. Example: diagnosing stomach pain, blood cancer, dengue, etc.

(Figure: Backward Chaining)
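
To make the contrast with forward chaining concrete, here is a minimal backward-chaining sketch in Python. The goal, facts, and rules are invented for illustration.

```python
# A minimal backward-chaining sketch (illustrative only).
# To prove a goal, find a rule that concludes it and recursively
# prove each of that rule's premises; known facts prove themselves.

facts = {"fever", "rash"}
rules = {
    "measles": [{"fever", "rash"}],   # measles if fever and rash
    "stay_home": [{"measles"}],       # stay_home if measles
}

def backward_chain(goal, facts, rules, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:                  # avoid infinite loops on cyclic rules
        return False
    for premises in rules.get(goal, []):
        if all(backward_chain(p, facts, rules, seen | {goal}) for p in premises):
            return True
    return False

print(backward_chain("stay_home", facts, rules))   # True
```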

Characteristics of an Expert System:

 Human experts are perishable, but an expert system is permanent.
 It helps to distribute the expertise of a human.
 One expert system may contain knowledge from more than one human expert, thus making its solutions more efficient.
 It decreases the cost of consulting an expert in various domains such as medical diagnosis.
 They use a knowledge base and an inference engine.
 Expert systems can solve complex problems by deducing new facts from existing facts and knowledge, represented mostly as if-then rules rather than conventional procedural code.
 Expert systems were among the first truly successful forms of artificial intelligence (AI) software.
Limitations:
 Do not have human-like decision-making power.
 Cannot possess human capabilities.
 Cannot produce correct results from too little knowledge.
 Require extensive training.
Advantages:
 Low accessibility cost.
 Fast response.
 Not affected by emotions, unlike humans.
 Low error rate.
 Capable of explaining how they reached a solution.
Disadvantages:
 The expert system has no emotions.
 Common sense is the main issue for the expert system.
 It is developed for a specific domain.
 It needs to be updated manually; it does not learn by itself.
 Not always capable of explaining the logic behind a decision.
Applications:
Applications of expert systems can be found in almost all areas of business and government. They include areas such as:
 Different types of medical diagnosis, such as internal medicine, blood diseases, and so on.
 Diagnosis of complex electronic and electromechanical systems.
 Diagnosis of software development projects.
 Planning experiments in biology, chemistry, and molecular genetics.
 Forecasting crop damage.
 Diagnosis of diesel-electric locomotive systems.
 Identification of chemical compound structure.
 Scheduling of customer orders, computer resources, and various manufacturing tasks.
 Assessment of geologic structure from dipmeter logs.
 Assessment of space structures through satellites and robots.
 The design of VLSI systems.
 Teaching students specialized tasks.
 Legal assessment, including civil case evaluation, product liability, etc.
Expert systems have evolved so much that they have sparked debates about the fate of humanity in the face of such intelligence, with authors such as Nick Bostrom (Professor of Philosophy at Oxford University) pondering whether computing power has transcended our ability to control it.

What is NLP?
NLP stands for Natural Language Processing, a field at the intersection of computer science, human language, and artificial intelligence. It is the technology used by machines to understand, analyse, manipulate, and interpret human language. It helps developers organize knowledge for performing tasks such as translation, automatic summarization, Named Entity Recognition (NER), speech recognition, relationship extraction, and topic segmentation.

History of NLP
(1940-1960) - Focused on Machine Translation (MT)

Work on natural language processing started in the 1940s.

1948 - The first recognisable NLP application was introduced at Birkbeck College, London.

1950s - In the 1950s, there was a conflicting view between linguistics and computer science. Chomsky published his first book, Syntactic Structures, and claimed that language is generative in nature.

In 1957, Chomsky also introduced the idea of generative grammar: rule-based descriptions of syntactic structures.

(1960-1980) - Flavored with Artificial Intelligence (AI)

From 1960 to 1980, the key developments were:

Augmented Transition Networks (ATN)

An Augmented Transition Network is a finite-state machine that is capable of recognizing regular languages.

Case Grammar

Case grammar was developed by the linguist Charles J. Fillmore in 1968. Case grammar uses languages such as English to express the relationship between nouns and verbs through prepositions.

In case grammar, case roles can be defined to link certain kinds of verbs and objects.

For example: "Neha broke the mirror with the hammer." In this example, case grammar identifies Neha as the agent, the mirror as the theme, and the hammer as the instrument.

From 1960 to 1980, the key systems were:

SHRDLU

SHRDLU is a program written by Terry Winograd in 1968-70. It allowed users to communicate with the computer and move objects in a blocks world. It could handle instructions such as "pick up the green ball" and also answer questions like "What is inside the black box?" The main importance of SHRDLU is that it showed that syntax, semantics, and reasoning about the world can be combined to produce a system that understands natural language.

LUNAR
LUNAR is the classic example of a natural language database interface system; it used ATNs and Woods' procedural semantics. It was capable of translating elaborate natural language expressions into database queries and handled 78% of requests without error.

1980 - Current

Until the 1980s, natural language processing systems were based on complex sets of hand-written rules. After 1980, NLP introduced machine learning algorithms for language processing.

In the early 1990s, NLP started growing faster and achieved good processing accuracy, especially for English grammar. In the 1990s, electronic text corpora were also introduced, providing a good resource for training and evaluating natural language programs. Other factors include the availability of computers with fast CPUs and more memory. A major factor behind the advancement of natural language processing was the Internet.

Modern NLP consists of various applications, such as speech recognition, machine translation, and machine text reading. Combining all these applications allows artificial intelligence to gain knowledge of the world. Consider the example of AMAZON ALEXA: you can ask Alexa a question, and it will reply to you.

Advantages of NLP
o NLP helps users ask questions about any subject and get a direct response within seconds.
o NLP offers exact answers to questions; it does not offer unnecessary or unwanted information.
o NLP helps computers communicate with humans in their own languages.
o It is very time efficient.
o Many companies use NLP to improve the efficiency and accuracy of documentation processes and to identify information in large databases.

Disadvantages of NLP
A list of disadvantages of NLP is given below:

o NLP may not capture context.
o NLP is unpredictable.
o NLP may require more keystrokes.
o NLP systems struggle to adapt to new domains and have limited functionality, which is why an NLP system is typically built for a single, specific task.

Components of NLP
There are the following two components of NLP:

1. Natural Language Understanding (NLU)

Natural Language Understanding (NLU) helps the machine understand and analyse human language by extracting metadata from content, such as concepts, entities, keywords, emotion, relations, and semantic roles.

NLU is mainly used in business applications to understand the customer's problem in both spoken and written language.

NLU involves the following tasks:

o Mapping the given input into a useful representation.
o Analyzing different aspects of the language.

2. Natural Language Generation (NLG)

Natural Language Generation (NLG) acts as a translator that converts computerized data into a natural language representation. It mainly involves text planning, sentence planning, and text realization.

Note: NLU is more difficult than NLG.

Difference between NLU and NLG

NLU | NLG
NLU is the process of reading and interpreting language. | NLG is the process of writing or generating language.
It produces non-linguistic outputs from natural language inputs. | It produces natural language outputs from non-linguistic inputs.

Applications of NLP
There are the following applications of NLP -

1. Question Answering

Question answering focuses on building systems that automatically answer questions asked by humans in a natural language.

2. Spam Detection

Spam detection is used to detect unwanted e-mails getting to a user's inbox.


3. Sentiment Analysis

Sentiment analysis is also known as opinion mining. It is used on the web to analyse the attitude, behaviour, and emotional state of the sender. This application is implemented through a combination of NLP (Natural Language Processing) and statistics, assigning values to the text (positive, negative, or neutral) and identifying the mood of the context (happy, sad, angry, etc.).

4. Machine Translation

Machine translation is used to translate text or speech from one natural language to
another natural language.
Example: Google Translator

5. Spelling correction

Microsoft Corporation provides word-processor software such as MS Word and PowerPoint with built-in spelling correction.

6. Speech Recognition

Speech recognition is used for converting spoken words into text. It is used in applications such as mobile devices, home automation, video retrieval, dictating to Microsoft Word, voice biometrics, voice user interfaces, and so on.

7. Chatbot

Implementing a chatbot is one of the important applications of NLP. It is used by many companies to provide customer chat services.
8. Information extraction

Information extraction is one of the most important applications of NLP. It is used for
extracting structured information from unstructured or semi-structured machine-
readable documents.

9. Natural Language Understanding (NLU)

It converts a large set of text into more formal representations, such as first-order logic structures, that are easier for computer programs to manipulate than raw natural language.

How to build an NLP pipeline


There are the following steps to build an NLP pipeline -

Step 1: Sentence Segmentation

Sentence segmentation is the first step in building the NLP pipeline. It breaks a paragraph into separate sentences.

Example: Consider the following paragraph -

Independence Day is one of the important festivals for every Indian citizen. It is
celebrated on the 15th of August each year ever since India got independence
from the British rule. The day celebrates independence in the true sense.

Sentence Segment produces the following result:

1. "Independence Day is one of the important festivals for every Indian citizen."
2. "It is celebrated on the 15th of August each year ever since India got
independence from the British rule."
3. "This day celebrates independence in the true sense."

Step 2: Word Tokenization

A word tokenizer is used to break a sentence into separate words, or tokens.

Example:

JavaTpoint offers Corporate Training, Summer Training, Online Training, and Winter
Training.
Word Tokenizer generates the following result:

"JavaTpoint", "offers", "Corporate", "Training", "Summer", "Training", "Online",


"Training", "and", "Winter", "Training", "."

Step 3: Stemming

Stemming is used to normalize words to their base or root form. For example, celebrates, celebrated, and celebrating all originate from the single root word "celebrate." The big problem with stemming is that it sometimes produces a root word that has no meaning.

For example, intelligence, intelligent, and intelligently are all reduced to a single root such as "intelligen," yet in English the word "intelligen" has no meaning.
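
A sketch with NLTK's Porter stemmer, which produces stems such as "celebr" and "intellig", meaningful or not:

```python
# Stemming with NLTK's Porter stemmer (a sketch).
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
for word in ["celebrates", "celebrated", "celebrating",
             "intelligence", "intelligent", "intelligently"]:
    print(word, "->", stemmer.stem(word))
```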

Step 4: Lemmatization

Lemmatization is quite similar to stemming. It is used to group the different inflected forms of a word under its lemma. The main difference between stemming and lemmatization is that lemmatization produces a root word that has a meaning.

For example, in lemmatization the words intelligence, intelligent, and intelligently map to the root word intelligent, which has a meaning.
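
A sketch with NLTK's WordNet lemmatizer (assuming the 'wordnet' data is installed); unlike a stemmer, it returns dictionary words:

```python
# Lemmatization with NLTK's WordNet lemmatizer (a sketch).
import nltk
nltk.download("wordnet", quiet=True)
from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()
print(lemmatizer.lemmatize("celebrating", pos="v"))  # celebrate
print(lemmatizer.lemmatize("mice"))                  # mouse
```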

Step 5: Identifying Stop Words

In English, many words appear very frequently, like "is", "and", "the", and "a". NLP pipelines flag these words as stop words. Stop words are often filtered out before doing any statistical analysis.

Example: He is a good boy.

Note: When you are building a rock-band search engine, you should not ignore the word "The."

Step 6: Dependency Parsing

Dependency parsing is used to find how all the words in a sentence relate to each other.
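
A sketch using spaCy's dependency parser (assuming the small English model has been installed via python -m spacy download en_core_web_sm):

```python
# Dependency parsing with spaCy (a sketch).
import spacy

nlp = spacy.load("en_core_web_sm")
for token in nlp("He is a good boy."):
    # each token points at its syntactic head via a labelled dependency
    print(f"{token.text:<6} --{token.dep_}--> {token.head.text}")
```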

Step 7: POS Tagging

POS stands for parts of speech, which include noun, verb, adverb, and adjective. A POS tag indicates how a word functions, in meaning as well as grammatically, within a sentence. A word can have one or more parts of speech depending on the context in which it is used.

Example: "Google" something on the Internet.

In the above example, Google is used as a verb, although it is a proper noun.
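
A sketch of POS tagging with NLTK (assuming the tagger data is installed). Note that a statistical tagger may still label "Google" as a proper noun here, which itself illustrates how context-dependent POS tagging is:

```python
# Part-of-speech tagging with NLTK (a sketch).
import nltk
nltk.download("averaged_perceptron_tagger", quiet=True)
nltk.download("punkt", quiet=True)

tokens = nltk.word_tokenize("Google something on the Internet.")
print(nltk.pos_tag(tokens))
```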

Step 8: Named Entity Recognition (NER)

Named Entity Recognition (NER) is the process of detecting named entities such as person names, movie names, organization names, or locations.

Example: Steve Jobs introduced the iPhone at the Macworld Conference in San Francisco, California.
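
A sketch of NER with spaCy (assuming en_core_web_sm is installed); the exact entities found depend on the model:

```python
# Named Entity Recognition with spaCy (a sketch).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Steve Jobs introduced iPhone at the Macworld Conference "
          "in San Francisco, California.")
for ent in doc.ents:
    print(ent.text, "->", ent.label_)
# e.g. Steve Jobs -> PERSON, San Francisco -> GPE
```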

Step 9: Chunking

Chunking is used to collect individual pieces of information and group them into bigger units, such as phrases within sentences.
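
A sketch of noun-phrase chunking with NLTK's RegexpParser; the grammar below groups an optional determiner, any adjectives, and a noun into one NP chunk:

```python
# Noun-phrase chunking with NLTK's RegexpParser (a sketch).
import nltk
nltk.download("averaged_perceptron_tagger", quiet=True)
nltk.download("punkt", quiet=True)

tagged = nltk.pos_tag(nltk.word_tokenize("The little yellow dog barked at the cat"))
grammar = "NP: {<DT>?<JJ>*<NN>}"   # determiner + adjectives + noun
print(nltk.RegexpParser(grammar).parse(tagged))
```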

Phases of NLP
There are the following five phases of NLP:
1. Lexical and Morphological Analysis

The first phase of NLP is lexical analysis. This phase scans the source text as a stream of characters and converts it into meaningful lexemes. It divides the whole text into paragraphs, sentences, and words.

2. Syntactic Analysis (Parsing)

Syntactic analysis is used to check grammar and word arrangement, and it shows the relationships among the words.

Example: Agra goes to the Poonam

In the real world, "Agra goes to the Poonam" does not make any sense, so this sentence is rejected by the syntactic analyzer.

3. Semantic Analysis
Semantic analysis is concerned with meaning representation. It mainly focuses on the literal meaning of words, phrases, and sentences.

4. Discourse Integration

Discourse integration means that the meaning of a sentence depends on the sentences that precede it and can also affect the meaning of the sentences that follow it.

5. Pragmatic Analysis

Pragmatic analysis is the fifth and final phase of NLP. It helps you discover the intended effect of an utterance by applying a set of rules that characterize cooperative dialogues.

For example: "Open the door" is interpreted as a request instead of an order.

Why is NLP difficult?

NLP is difficult because ambiguity and uncertainty exist in language.

Ambiguity

There are the following three types of ambiguity:

o Lexical Ambiguity

Lexical ambiguity exists when a single word in a sentence has two or more possible meanings.

Example:

Manya is looking for a match.

In the above example, the word match could mean that Manya is looking for a partner, or that Manya is looking for a match (a cricket match or some other game).

o Syntactic Ambiguity

Syntactic ambiguity exists when a sentence as a whole has two or more possible meanings.

Example:

I saw the girl with the binoculars.

In the above example, did I have the binoculars? Or did the girl have the binoculars?

o Referential Ambiguity

Referential ambiguity exists when it is unclear what a pronoun refers to.

Example: Kiran went to Sunita. She said, "I am hungry."

In the above sentence, you do not know who is hungry, Kiran or Sunita.

NLP APIs
Natural Language Processing APIs allow developers to integrate human-to-machine
communications and complete several useful tasks such as speech recognition,
chatbots, spelling correction, sentiment analysis, etc.

A list of NLP APIs is given below:

o IBM Watson API

The IBM Watson API combines different sophisticated machine learning techniques to enable developers to classify text into various custom categories. It supports multiple languages, such as English, French, Spanish, German, and Chinese. With the help of the IBM Watson API, you can extract insights from texts, add automation to workflows, enhance search, and understand sentiment. The main advantage of this API is that it is very easy to use.
Pricing: It offers a free 30-day trial IBM Cloud account. You can also opt for its paid plans.
o Chatbot API
The Chatbot API allows you to create intelligent chatbots for any service. It supports Unicode characters, text classification, multiple languages, etc. It is very easy to use and helps you create a chatbot for your web applications.
Pricing: The Chatbot API is free for 150 requests per month. You can also opt for its paid version, which ranges from $100 to $5,000 per month.
o Speech to Text API
The Speech to Text API is used to convert speech to text.
Pricing: The Speech to Text API is free for converting 60 minutes per month. Its paid version ranges from $500 to $1,500 per month.
o Sentiment Analysis API
The Sentiment Analysis API, also called 'opinion mining', is used to identify the tone of a user (positive, negative, or neutral).
Pricing: The Sentiment Analysis API is free for fewer than 500 requests per month. Its paid version ranges from $19 to $99 per month.
o Translation API by SYSTRAN
The Translation API by SYSTRAN is used to translate the text from the source
language to the target language. You can use its NLP APIs for language
detection, text segmentation, named entity recognition, tokenization, and
many other tasks.
Pricing: This API is available for free. But for commercial users, you need to
use its paid version.
o Text Analysis API by AYLIEN
The Text Analysis API by AYLIEN is used to derive meaning and insights from textual content. It is easy to use.
Pricing: This API is free for 1,000 hits per day. You can also use its paid version, which ranges from $199 to $1,399 per month.
o Cloud NLP API
The Cloud NLP API is used to improve the capabilities of an application using natural language processing technology. It allows you to carry out various natural language processing functions, like sentiment analysis and language detection. It is easy to use.
Pricing: The Cloud NLP API is available for free.
o Google Cloud Natural Language API
The Google Cloud Natural Language API allows you to extract beneficial insights from unstructured text. It lets you perform entity recognition, sentiment analysis, content classification (across more than 700 predefined categories), and syntax analysis. It also allows you to perform text analysis in multiple languages, such as English, French, Chinese, and German.
Pricing: After performing entity analysis on 5,000 to 10,000,000 units, you pay $1.00 per 1,000 units per month.

NLP Libraries
Scikit-learn: Provides a wide range of algorithms for building machine learning models in Python.

Natural Language Toolkit (NLTK): NLTK is a complete toolkit for all NLP techniques.

Pattern: A web mining module for NLP and machine learning.

TextBlob: Provides an easy interface for basic NLP tasks like sentiment analysis, noun phrase extraction, or POS tagging.

Quepy: Quepy is used to transform natural language questions into queries in a database query language.

SpaCy: SpaCy is an open-source NLP library used for data extraction, data analysis, sentiment analysis, and text summarization.

Gensim: Gensim works with large datasets and processes data streams.
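
As a quick taste of one of these libraries, here is a TextBlob snippet for sentiment analysis (assuming textblob and its corpora are installed):

```python
# Sentiment analysis with TextBlob (a sketch).
from textblob import TextBlob

blob = TextBlob("NLP helps computers communicate with humans.")
print(blob.sentiment)   # Sentiment(polarity=..., subjectivity=...)
```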

Difference between Natural Language and Computer Language

Natural Language | Computer Language
Natural language has a very large vocabulary. | Computer language has a very limited vocabulary.
Natural language is easily understood by humans. | Computer language is easily understood by machines.
Natural language is ambiguous in nature. | Computer language is unambiguous.

Prerequisite
Before learning NLP, you must have the basic knowledge of Python.

Audience
Our NLP tutorial is designed to help beginners.
