Practical No: 5

Name: Thombare Rushikesh Dadaso
Class: TE AI&DS
Roll No: 29

Title: Write a Python program to show a Back Propagation Network for the XOR
function with binary input and output.

Code:
import numpy as np

# Sigmoid activation
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Derivative of the sigmoid, written in terms of the already-activated
# value: if a = sigmoid(z), then sigmoid'(z) = a * (1 - a)
def sigmoid_derivative(x):
    return x * (1 - x)

# XOR truth table: four binary input patterns and their binary targets
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]])
y = np.array([[0],
              [1],
              [1],
              [0]])

np.random.seed(1)

# 2-2-1 architecture: 2 inputs, 2 hidden neurons, 1 output neuron
input_size = 2
hidden_size = 2
output_size = 1

# Weights initialised uniformly in [-1, 1)
W1 = 2 * np.random.random((input_size, hidden_size)) - 1
W2 = 2 * np.random.random((hidden_size, output_size)) - 1

learning_rate = 0.9
epochs = 10000

for epoch in range(epochs):
    # Forward pass
    hidden_input = np.dot(X, W1)
    hidden_output = sigmoid(hidden_input)
    final_input = np.dot(hidden_output, W2)
    final_output = sigmoid(final_input)

    # Backward pass: propagate the error and compute the deltas
    error = y - final_output
    d_final = error * sigmoid_derivative(final_output)
    error_hidden = d_final.dot(W2.T)
    d_hidden = error_hidden * sigmoid_derivative(hidden_output)

    # Gradient-descent weight updates
    W2 += learning_rate * hidden_output.T.dot(d_final)
    W1 += learning_rate * X.T.dot(d_hidden)

    # Report the mean squared error every 1000 epochs
    if (epoch + 1) % 1000 == 0:
        loss = np.mean(np.square(error))
        print(f"Epoch {epoch + 1}: Loss = {loss:.4f}")

print("\nFinal outputs after training:")
print(final_output)
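A small usage sketch (not part of the original listing) for reading the trained
outputs as binary predictions, assuming the script above has just run so that
np, sigmoid, W1 and W2 are still in scope:

def predict(x):
    # Forward pass through the trained 2-2-1 network (no biases, as above)
    hidden = sigmoid(np.dot(x, W1))
    return int(np.round(sigmoid(np.dot(hidden, W2))[0, 0]))

for pattern in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(pattern, "->", predict(np.array([pattern])))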
Output:

The program prints the loss every 1000 epochs and, after training, the four
final outputs, which should lie close to the XOR targets 0, 1, 1 and 0.
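Note: the network above has no bias terms, so convergence on XOR depends on the
random initialisation. As a point of comparison, here is a minimal sketch of the
same training loop with bias vectors b1 and b2 added; this variant is an
extension, not part of the original practical:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

np.random.seed(1)
W1 = 2 * np.random.random((2, 2)) - 1   # input -> hidden weights
b1 = np.zeros((1, 2))                   # hidden-layer bias (added)
W2 = 2 * np.random.random((2, 1)) - 1   # hidden -> output weights
b2 = np.zeros((1, 1))                   # output-layer bias (added)
lr = 0.9

for epoch in range(10000):
    hidden = sigmoid(np.dot(X, W1) + b1)             # forward pass
    output = sigmoid(np.dot(hidden, W2) + b2)
    error = y - output
    d_out = error * output * (1 - output)            # output delta
    d_hid = d_out.dot(W2.T) * hidden * (1 - hidden)  # hidden delta
    W2 += lr * hidden.T.dot(d_out)                   # weight updates
    W1 += lr * X.T.dot(d_hid)
    b2 += lr * d_out.sum(axis=0, keepdims=True)      # bias updates
    b1 += lr * d_hid.sum(axis=0, keepdims=True)

print(np.round(output).astype(int))  # expected to approach [[0], [1], [1], [0]]

Each bias is updated with the column sum of its layer's deltas, because the
bias feeds every training example in the batch.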
