
Welcome to the exciting world of neural networks! If you’ve ever wondered how machines can learn to recognize images, understand speech, or even play games better than humans, you’re about to discover the magic behind it all. Today, we’ll build your very first neural network from scratch – no PhD in mathematics required!

What Exactly Is a Neural Network?

Think of a neural network like your brain, but much simpler. Your brain has billions of neurons that pass electrical signals to each other. When you see a cat, certain neurons fire in patterns that help you recognize “Hey, that’s a cat!”

A neural network works similarly:

Imagine you’re teaching a friend to recognize cats in photos. You’d show them hundreds of cat pictures, pointing out features like “fuzzy ears,” “whiskers,” and “four legs.” After seeing enough examples, they’d start recognizing cats on their own. Neural networks learn the same way!

The Building Blocks: Neurons and Layers

What’s a Neuron?

In our artificial world, a neuron is incredibly simple:

  1. It receives inputs (numbers)
  2. It multiplies each input by a weight (importance factor)
  3. It adds everything up
  4. It applies an activation function (decides if it should “fire”)

Think of it like a voting system. Each input is a voter, and the weight is how much you trust that voter’s opinion. The activation function is your final decision: “Yes” or “No.”
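
To make those four steps concrete, here's a minimal sketch of a single artificial neuron in NumPy. The input values, weights, and bias below are made up purely for illustration:

import numpy as np

# Hypothetical inputs and weights, chosen only for illustration
inputs = np.array([0.5, 0.8, 0.2])    # step 1: receive inputs
weights = np.array([0.4, 0.7, 0.1])   # step 2: one weight (importance factor) per input
bias = -0.3

weighted_sum = np.dot(inputs, weights) + bias  # step 3: add everything up

def sigmoid(x):
    # step 4: the activation function squashes the sum into a value between 0 and 1
    return 1 / (1 + np.exp(-x))

print(sigmoid(weighted_sum))  # how strongly this neuron "fires"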

Layers: Organizing Our Neurons

Neural networks organize neurons in layers:

  1. Input layer: takes in the raw numbers (for our project, house size and location score)
  2. Hidden layer(s): combine and transform those inputs to find useful patterns
  3. Output layer: produces the final answer (here, a predicted price)

Our First Project: Predicting House Prices

Let’s build something practical! We’ll create a neural network that predicts house prices based on size and location score. This is a great first project: there are only two input features, the data is easy to visualize, and it’s simple to check the predictions against the actual prices.

Step 1: Setting Up Your Environment

First, let’s install the tools we need. Open your terminal or command prompt:

pip install numpy matplotlib

That’s it! We’re keeping it simple with just NumPy for math and Matplotlib for visualizations.
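
If you want to double-check that the installation worked, an optional one-line sanity check is:

python -c "import numpy, matplotlib; print(numpy.__version__, matplotlib.__version__)"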

Step 2: Understanding Our Data

Let’s imagine we have data about houses:

import numpy as np
import matplotlib.pyplot as plt

# Our training data
# Format: [size_in_sqft, location_score]
X = np.array([
    [1200, 6], [1500, 7], [1800, 8], [2000, 9], [2200, 7],
    [1000, 5], [2500, 9], [1700, 6], [2100, 8], [1300, 5]
])

# Actual house prices (in thousands)
y = np.array([200, 250, 320, 380, 310, 180, 420, 280, 360, 220])

print("House data:")
print("Size | Location | Price")
for i in range(len(X)):
    print(f"{X[i][0]} | {X[i][1]}       | ${y[i]}k")

Step 3: Building Our Neural Network Class

Now for the fun part! Let’s build our neural network step by step:

class SimpleNeuralNetwork:
    def __init__(self):
        # Initialize random weights
        # We need weights for each input plus a bias
        np.random.seed(42)  # For reproducible results
        
        # Hidden layer: 2 inputs -> 4 neurons
        self.weights_input_hidden = np.random.normal(0, 1, (2, 4))
        self.bias_hidden = np.random.normal(0, 1, (1, 4))
        
        # Output layer: 4 neurons -> 1 output
        self.weights_hidden_output = np.random.normal(0, 1, (4, 1))
        self.bias_output = np.random.normal(0, 1, (1, 1))
        
    def sigmoid(self, x):
        """Activation function - squashes values between 0 and 1"""
        return 1 / (1 + np.exp(-np.clip(x, -500, 500)))  # Clip to prevent overflow
    
    def sigmoid_derivative(self, x):
        """Derivative of sigmoid - needed for learning"""
        return x * (1 - x)
    
    def forward_pass(self, X):
        """Forward pass - make a prediction"""
        # Hidden layer
        self.hidden_input = np.dot(X, self.weights_input_hidden) + self.bias_hidden
        self.hidden_output = self.sigmoid(self.hidden_input)
        
        # Output layer
        self.output_input = np.dot(self.hidden_output, self.weights_hidden_output) + self.bias_output
        self.predicted_output = self.sigmoid(self.output_input)
        
        return self.predicted_output
    
    def backward_pass(self, X, y, predicted_output, learning_rate):
        """Backward pass - learn from mistakes"""
        m = X.shape[0]  # Number of examples
        
        # Calculate error
        output_error = y - predicted_output
        
        # Output layer gradients
        output_delta = output_error * self.sigmoid_derivative(predicted_output)
        
        # Hidden layer error
        hidden_error = output_delta.dot(self.weights_hidden_output.T)
        hidden_delta = hidden_error * self.sigmoid_derivative(self.hidden_output)
        
        # Update weights and biases, scaled by the learning rate
        self.weights_hidden_output += learning_rate * self.hidden_output.T.dot(output_delta) / m
        self.bias_output += learning_rate * np.sum(output_delta, axis=0, keepdims=True) / m
        self.weights_input_hidden += learning_rate * X.T.dot(hidden_delta) / m
        self.bias_hidden += learning_rate * np.sum(hidden_delta, axis=0, keepdims=True) / m
    
    def train(self, X, y, epochs=1000, learning_rate=0.1):
        """Train the neural network"""
        losses = []
        
        for epoch in range(epochs):
            # Forward pass
            predicted_output = self.forward_pass(X)
            
            # Calculate loss (mean squared error)
            loss = np.mean((y - predicted_output) ** 2)
            losses.append(loss)
            
            # Backward pass
            self.backward_pass(X, y, predicted_output, learning_rate)
            
            # Print progress
            if epoch % 100 == 0:
                print(f"Epoch {epoch}, Loss: {loss:.4f}")
        
        return losses
    
    def predict(self, X):
        """Make predictions on new data"""
        return self.forward_pass(X)

Step 4: Preparing and Training Our Network

# Normalize our data (very important!)
# Save the training statistics so new data can be scaled the same way later
X_mean = np.mean(X, axis=0)
X_std = np.std(X, axis=0)
X_normalized = (X.astype(float) - X_mean) / X_std

# Normalize outputs (scale to 0-1 range for sigmoid)
y_min, y_max = np.min(y), np.max(y)
y_normalized = (y - y_min) / (y_max - y_min)
y_normalized = y_normalized.reshape(-1, 1)  # Make it a column vector

print("Normalized data ready for training!")
print("Input shape:", X_normalized.shape)
print("Output shape:", y_normalized.shape)

# Create and train our neural network
nn = SimpleNeuralNetwork()
print("\nTraining neural network...")
losses = nn.train(X_normalized, y_normalized, epochs=2000)

# Plot training progress
plt.figure(figsize=(10, 6))
plt.plot(losses)
plt.title('Training Loss Over Time')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.grid(True)
plt.show()

Step 5: Testing Our Neural Network

# Test our network on the training data
predictions_normalized = nn.predict(X_normalized)

# Convert predictions back to original scale
predictions = predictions_normalized * (y_max - y_min) + y_min
predictions = predictions.flatten()

print("\nResults:")
print("Actual vs Predicted Prices:")
print("Size | Location | Actual | Predicted | Difference")
print("-" * 50)

for i in range(len(X)):
    diff = abs(y[i] - predictions[i])
    print(f"{X[i][0]} | {X[i][1]}       | ${y[i]}k   | ${predictions[i]:.0f}k      | ${diff:.0f}k")

# Calculate the average prediction error
mae = np.mean(np.abs(y - predictions))
print(f"\nMean Absolute Error: ${mae:.1f}k")

# Test on new data
print("\nTesting on new houses:")
new_houses = np.array([[1600, 7], [2300, 9], [1100, 4]])
new_houses_normalized = (new_houses.astype(float) - X_mean) / X_std  # Scale with the training statistics

new_predictions_normalized = nn.predict(new_houses_normalized)
new_predictions = new_predictions_normalized * (y_max - y_min) + y_min
new_predictions = new_predictions.flatten()

for i, house in enumerate(new_houses):
    print(f"House {house[0]} sqft, location {house[1]}/10: ${new_predictions[i]:.0f}k")

Step 6: Visualizing Our Results

# Create a beautiful visualization
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(15, 6))

# Plot 1: Actual vs Predicted
ax1.scatter(y, predictions, alpha=0.7, s=100)
ax1.plot([min(y), max(y)], [min(y), max(y)], 'r--', lw=2)
ax1.set_xlabel('Actual Price ($k)')
ax1.set_ylabel('Predicted Price ($k)')
ax1.set_title('Actual vs Predicted House Prices')
ax1.grid(True, alpha=0.3)

# Plot 2: Feature importance visualization
sizes = X[:, 0]
locations = X[:, 1]
colors = y

scatter = ax2.scatter(sizes, locations, c=colors, cmap='viridis', s=100, alpha=0.7)
ax2.set_xlabel('House Size (sqft)')
ax2.set_ylabel('Location Score')
ax2.set_title('Houses by Size and Location (Color = Price)')
plt.colorbar(scatter, ax=ax2, label='Price ($k)')

plt.tight_layout()
plt.show()

Understanding What Just Happened

Congratulations! You just built and trained your first neural network. Let’s break down what happened:

The Learning Process

  1. Random Start: Your network began with random weights (like a baby’s brain)
  2. Make Predictions: It looked at house size and location, made price guesses
  3. Calculate Error: It compared its guesses to actual prices
  4. Adjust Weights: It tweaked its internal connections to reduce errors
  5. Repeat: This process happened 2000 times, each time getting slightly better

Why Normalization Matters

We normalized our data (made all values similar scales) because:

  1. Our features live on very different scales: house size is in the thousands while location score is between 1 and 10, so size would dominate the weighted sums
  2. Sigmoid squashes large inputs toward 0 or 1, where its gradient is nearly flat, so unnormalized inputs make learning painfully slow
  3. Gradient descent converges faster and more reliably when the inputs (and the target) sit in a similar, small range
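
You can see the problem directly with the sigmoid from Step 3: feed it a raw square-footage value and it saturates, so its gradient is essentially zero and the weights barely move. A quick sketch, reusing the nn object from above (the normalized value is approximate, just for illustration):

raw_size = 1800.0     # house size in sqft, unnormalized
scaled_size = 0.15    # roughly the same house after normalization

print(nn.sigmoid(raw_size))     # ~1.0 -- completely saturated, gradient is near zero
print(nn.sigmoid(scaled_size))  # ~0.54 -- in the responsive part of the curve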

The Magic of Backpropagation

The “backward pass” is where learning happens:

  1. The network compares its prediction to the true price and measures the error
  2. Using the chain rule, it works backwards through the layers to figure out how much each weight contributed to that error
  3. Each weight is then nudged in the direction that reduces the error, scaled by the learning rate
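
To see the core idea without the matrices, here is a toy sketch of gradient descent on a single weight (the numbers are invented just to show the loop):

# One input, one weight, no activation: prediction = w * x
x, target = 2.0, 10.0
w = 1.0               # start with a bad guess
learning_rate = 0.1

for step in range(5):
    prediction = w * x
    error = prediction - target      # how wrong we are
    gradient = error * x             # slope of the squared error with respect to w (up to a constant)
    w -= learning_rate * gradient    # nudge the weight downhill
    print(step, round(w, 3))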

Taking It Further

Now that you’ve built your first neural network, here are exciting next steps:

1. Experiment with Architecture

# Try different numbers of hidden neurons
self.weights_input_hidden = np.random.normal(0, 1, (2, 8))  # 8 instead of 4
self.bias_hidden = np.random.normal(0, 1, (1, 8))

# Try multiple hidden layers
# Layer 1: 2 -> 6 neurons
# Layer 2: 6 -> 4 neurons  
# Layer 3: 4 -> 1 output
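
If you want to try the deeper version sketched in those comments, here is one possible layout of the extra weights and forward pass for a 2 -> 6 -> 4 -> 1 network. This is a sketch of what would replace __init__ and forward_pass, not drop-in code:

# In __init__: one weight matrix and bias per layer
self.weights_1 = np.random.normal(0, 1, (2, 6))
self.bias_1 = np.random.normal(0, 1, (1, 6))
self.weights_2 = np.random.normal(0, 1, (6, 4))
self.bias_2 = np.random.normal(0, 1, (1, 4))
self.weights_3 = np.random.normal(0, 1, (4, 1))
self.bias_3 = np.random.normal(0, 1, (1, 1))

# In forward_pass: chain the layers together
def forward_pass(self, X):
    self.h1 = self.sigmoid(np.dot(X, self.weights_1) + self.bias_1)
    self.h2 = self.sigmoid(np.dot(self.h1, self.weights_2) + self.bias_2)
    return self.sigmoid(np.dot(self.h2, self.weights_3) + self.bias_3)

Keep in mind that the backward pass would need a matching extra step to propagate the error through the new layer.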

2. Add More Features

# Include more house features
X = np.array([
    # [size, location, bedrooms, bathrooms, age]
    [1200, 6, 3, 2, 5],
    [1500, 7, 3, 2, 3],
    # ... more data
])

3. Try Different Activation Functions

def relu(self, x):
    """ReLU activation - popular in modern networks"""
    return np.maximum(0, x)

def tanh(self, x):
    """Tanh activation - alternative to sigmoid"""
    return np.tanh(x)
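
If you swap these in, remember that the backward pass needs a matching derivative for whichever activation you use. Here is a sketch of what those could look like, written (like sigmoid_derivative above) in terms of the layer's output:

def relu_derivative(self, x):
    """Gradient of ReLU: 1 where the output is positive, 0 elsewhere"""
    return (x > 0).astype(float)

def tanh_derivative(self, x):
    """Gradient of tanh, expressed in terms of the tanh output"""
    return 1 - x ** 2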

4. Implement Better Optimization

# Add momentum to weight updates
def train_with_momentum(self, X, y, epochs=1000, learning_rate=0.1, momentum=0.9):
    # Initialize momentum terms
    self.v_weights_hidden_output = np.zeros_like(self.weights_hidden_output)
    self.v_weights_input_hidden = np.zeros_like(self.weights_input_hidden)
    
    # ... training loop with momentum updates
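
Inside that training loop, the momentum update for each weight matrix could look roughly like this. It's a sketch built on the same gradient calculations as backward_pass, not a full implementation:

# After computing output_delta and hidden_delta for the current epoch:
grad_hidden_output = self.hidden_output.T.dot(output_delta) / m
self.v_weights_hidden_output = (momentum * self.v_weights_hidden_output
                                + learning_rate * grad_hidden_output)
self.weights_hidden_output += self.v_weights_hidden_output

grad_input_hidden = X.T.dot(hidden_delta) / m
self.v_weights_input_hidden = (momentum * self.v_weights_input_hidden
                               + learning_rate * grad_input_hidden)
self.weights_input_hidden += self.v_weights_input_hidden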

Common Beginner Mistakes (And How to Avoid Them)

1. Forgetting to Normalize Data

Problem: Network doesn’t learn, or learns very slowly.
Solution: Always normalize your inputs to similar scales.

2. Learning Rate Too High or Low

Problem: Network doesn’t converge, or learns too slowly.
Solution: Try values between 0.001 and 0.1, and adjust based on the loss curve.
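
One practical way to choose a value is to train a fresh network at a few candidate learning rates and compare the final loss. A quick sketch using the data and class from this tutorial:

for lr in [0.001, 0.01, 0.1]:
    candidate = SimpleNeuralNetwork()   # new network (same seeded starting weights) for a fair comparison
    losses = candidate.train(X_normalized, y_normalized, epochs=500, learning_rate=lr)
    print(f"learning_rate={lr}: final loss {losses[-1]:.4f}")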

3. Not Enough Training Data

Problem: Network memorizes instead of learning patterns.
Solution: Gather more diverse data, or use techniques like data augmentation.

4. Overfitting

Problem: Great on training data, terrible on new data.
Solution: Use a validation set, add regularization, or get more data.
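
A simple way to spot overfitting with our house data is to hold out a couple of examples as a validation set and compare losses. A minimal sketch:

# Hold out the last two houses as a tiny validation set
X_train, X_val = X_normalized[:-2], X_normalized[-2:]
y_train, y_val = y_normalized[:-2], y_normalized[-2:]

check = SimpleNeuralNetwork()
check.train(X_train, y_train, epochs=2000)

train_loss = np.mean((y_train - check.predict(X_train)) ** 2)
val_loss = np.mean((y_val - check.predict(X_val)) ** 2)
print(f"Train loss: {train_loss:.4f}, Validation loss: {val_loss:.4f}")
# A validation loss much higher than the training loss is a sign of overfitting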

What’s Next on Your AI Journey?

You’ve just taken your first step into artificial intelligence! Here’s your roadmap:

Immediate Next Steps

  1. Experiment: Change parameters, add features, try different data
  2. Learn TensorFlow/PyTorch: Industry-standard deep learning frameworks
  3. Study Math: Linear algebra and calculus will deepen your understanding
  4. Build Projects: Image classification, text analysis, game AI

Medium-Term Goals

  1. Convolutional Neural Networks: For image recognition
  2. Recurrent Neural Networks: For sequences and time series
  3. Deep Learning Specialization: Coursera courses by Andrew Ng
  4. Kaggle Competitions: Practice with real-world datasets

Long-Term Vision

  1. Transformer Networks: The architecture behind GPT and BERT
  2. Generative AI: Create art, write stories, compose music
  3. Reinforcement Learning: Train AI to play games and control robots
  4. Research: Contribute to the cutting edge of AI

Conclusion

You’ve just built a neural network from scratch! You started with random numbers and math, and ended with a system that can predict house prices. That’s genuinely impressive.

Remember, every AI expert started exactly where you are now. The key is consistent practice and never-ending curiosity. Your neural network might be simple, but it contains the same fundamental principles that power the most advanced AI systems in the world.

The brain you just built electronically can:

  1. Take in raw data and turn it into a prediction
  2. Measure how wrong that prediction was
  3. Adjust its own weights so it does better next time

These are the building blocks of artificial intelligence, and you now understand them from the ground up.

Keep experimenting, keep learning, and most importantly, keep building. The future of AI is in your hands!


Ready for more? Check out my next post: “Building a Neural Network to Recognize Handwritten Digits” where we’ll tackle computer vision with the famous MNIST dataset!

Tags: neural-networks, machine-learning, python, beginner-friendly, ai, deep-learning, tutorial

Share this post: If you found this helpful, share it with other aspiring AI engineers!