Building Your First Neural Network from Scratch: A Step-by-Step Python Tutorial

When I first heard about neural networks, they sounded like something only big tech companies could build. But the truth is, you don’t need a PhD to create your own basic neural network — just some Python knowledge and curiosity.

In this tutorial, I’ll guide you through building a simple neural network from scratch. No libraries like TensorFlow or PyTorch — just pure Python and NumPy. This helped me understand how neural networks work at their core, and I hope it does the same for you.

🧠 What Is a Neural Network?

A neural network is a machine learning model loosely inspired by the way neurons in the brain connect. It learns patterns from data and is used to classify inputs and make predictions.

Think of it as layers of neurons (or “nodes”) where each node takes input, does a calculation, and passes the result forward.
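To make that concrete, here is a single node sketched in plain Python. The input and weight values are made up purely for illustration:

```python
import math

def sigmoid(x):
    # Squashes any number into the range (0, 1)
    return 1 / (1 + math.exp(-x))

# One node: weight each input, sum them up, then activate
node_inputs = [0.5, 0.8]   # made-up example inputs
weights = [0.4, -0.2]      # made-up example weights
total = sum(i * w for i, w in zip(node_inputs, weights))
output = sigmoid(total)
print(output)  # a single value between 0 and 1
```

That weighted-sum-then-activate pattern is all a neuron does; a network is just many of these wired together in layers.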

🔧 What You’ll Need

  • Basic knowledge of Python
  • Python installed on your machine
  • NumPy library

Let’s start building!

Step 1: Import Libraries

import numpy as np

Step 2: Define the Activation Function

def sigmoid(x):
    # Squashes any real number into the range (0, 1)
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Note: x here is assumed to already be a sigmoid *output*,
    # which is why this is x * (1 - x) rather than
    # sigmoid(x) * (1 - sigmoid(x))
    return x * (1 - x)
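A quick sanity check of these two functions helps build intuition (the definitions are repeated here so the snippet runs on its own):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # x is assumed to be a sigmoid output
    return x * (1 - x)

print(sigmoid(0))    # 0.5 -- the midpoint
print(sigmoid(10))   # very close to 1
print(sigmoid(-10))  # very close to 0

# The slope is steepest at the midpoint, where the output is 0.5
print(sigmoid_derivative(sigmoid(0)))  # 0.25
```

Because the slope shrinks toward 0 as the output saturates near 0 or 1, confident predictions get only small weight updates, which is exactly what we want during training.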

Step 3: Prepare the Training Data

# Input data
inputs = np.array([
    [0, 0],
    [0, 1],
    [1, 0],
    [1, 1]
])

# Output data: the XOR truth table
outputs = np.array([[0], [1], [1], [0]])

Step 4: Initialize Weights

np.random.seed(1)  # fixed seed so every run starts from the same weights
weights = 2 * np.random.random((2, 1)) - 1  # random values in (-1, 1)
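If you are curious what this produces, you can print it. The formula just maps `np.random.random`'s (0, 1) output into (-1, 1):

```python
import numpy as np

np.random.seed(1)                           # same seed as the tutorial
weights = 2 * np.random.random((2, 1)) - 1  # uniform values in (-1, 1)
print(weights.shape)  # (2, 1): one weight per input, one output neuron
print(weights)
```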

Step 5: Train the Network

for epoch in range(10000):
    # Forward pass: weighted sum, then activation
    input_layer = inputs
    weighted_sum = np.dot(input_layer, weights)
    predictions = sigmoid(weighted_sum)

    # How far off were we?
    error = outputs - predictions

    # Scale the error by the slope of the sigmoid at each prediction
    adjustments = error * sigmoid_derivative(predictions)

    # Nudge the weights in the direction that reduces the error
    weights += np.dot(input_layer.T, adjustments)

Step 6: Test the Output

print("Final predictions after training:")
print(predictions)
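If you would rather run everything as one file, here are steps 1 through 6 collected into a single script, with the mean error printed as training goes. There is no new logic here, just the code above in one place:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # x is assumed to be a sigmoid output
    return x * (1 - x)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
outputs = np.array([[0], [1], [1], [0]])  # XOR truth table

np.random.seed(1)
weights = 2 * np.random.random((2, 1)) - 1

for epoch in range(10000):
    predictions = sigmoid(np.dot(inputs, weights))
    error = outputs - predictions
    adjustments = error * sigmoid_derivative(predictions)
    weights += np.dot(inputs.T, adjustments)
    if epoch % 2000 == 0:
        print(f"epoch {epoch}: mean abs error = {np.mean(np.abs(error)):.3f}")

print("Final predictions after training:")
print(predictions)
```

Run it and you will likely see the predictions settle near 0.5 for every input rather than the error shrinking toward zero; that is the single-layer limitation discussed in the final section, not a bug in your code.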

🚀 Final Thoughts

You just built a simple neural network from scratch! This is a deliberately basic example (a single layer of weights can't actually learn XOR, since XOR isn't linearly separable), but it teaches core concepts like:

  • Forward propagation
  • Sigmoid activation
  • Error calculation
  • Weight updates via backpropagation

Once you’re comfortable with this, you can move on to adding hidden layers or using frameworks like TensorFlow or Keras for more complex models.
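As a taste of what that next step looks like, here is one possible hidden-layer version. Treat it as a sketch: the 4-unit hidden layer, the bias terms, the seed, and the 20,000 epochs are choices I made for illustration, not the only ones that work. With a hidden layer, the same training rule (run once per layer, from the output backwards) can solve XOR:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # x is assumed to be a sigmoid output
    return x * (1 - x)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
outputs = np.array([[0], [1], [1], [0]])  # XOR truth table

np.random.seed(1)
hidden_weights = 2 * np.random.random((2, 4)) - 1  # 2 inputs -> 4 hidden units
hidden_bias = np.zeros((1, 4))
output_weights = 2 * np.random.random((4, 1)) - 1  # 4 hidden units -> 1 output
output_bias = np.zeros((1, 1))

for epoch in range(20000):
    # Forward pass through both layers
    hidden = sigmoid(np.dot(inputs, hidden_weights) + hidden_bias)
    predictions = sigmoid(np.dot(hidden, output_weights) + output_bias)

    # Backpropagation: output layer first, then the hidden layer
    output_delta = (outputs - predictions) * sigmoid_derivative(predictions)
    hidden_delta = np.dot(output_delta, output_weights.T) * sigmoid_derivative(hidden)

    output_weights += np.dot(hidden.T, output_delta)
    output_bias += output_delta.sum(axis=0, keepdims=True)
    hidden_weights += np.dot(inputs.T, hidden_delta)
    hidden_bias += hidden_delta.sum(axis=0, keepdims=True)

print(np.round(predictions))
```

With these settings the rounded predictions should come out as 0, 1, 1, 0, matching the XOR targets that the single-layer version couldn't reach.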

Keep building, keep experimenting, and most of all — enjoy the process of learning AI from the ground up!
