
Enhanced Temperature Prediction Using LSTM Techniques


Humanity has always sought to understand natural phenomena, and the weather, with variables like temperature and precipitation, is among the hardest to predict accurately. Throughout history, people have relied on past observations to anticipate future outcomes, whether in agriculture or in financial markets. In signal processing, for example, past values help us analyze the characteristics of voice signals.

Time Series Analysis

As mentioned earlier, time series analysis involves forecasting future values from historical data, and it yields significant insights across fields such as economics and healthcare. Let's review some fundamental concepts. The data we examine are collected at defined intervals, such as recording stock prices daily. A trend is the long-term behavior of the variable under investigation; a rising trend, for instance, indicates that a stock's value is steadily increasing. Seasonality refers to recurring patterns within specific time frames, such as higher temperatures during the summer months, which repeat every year. Finally, noise can disrupt the data and introduce inconsistencies; a dataset might, for instance, record unexpected rainfall in August amid a streak of sunny days, with such outliers appearing as noise.
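To make these components concrete, here is a minimal sketch that builds a synthetic daily temperature series as the sum of a baseline, a trend, a yearly seasonal cycle, and random noise; all the numbers are made up purely for illustration:

import numpy as np
import pandas as pd

# Two years of synthetic daily "temperatures": baseline + trend + seasonality + noise
dates = pd.date_range("2022-01-01", periods=730, freq="D")
t = np.arange(730)
trend = 0.005 * t                               # slow long-term warming
seasonality = 10 * np.sin(2 * np.pi * t / 365)  # cycle that repeats every year
noise = np.random.normal(0, 2, 730)             # random fluctuations and outliers
series = pd.Series(15 + trend + seasonality + noise, index=dates)
print(series.describe())

Plotting such a series (or decomposing a real one) makes the trend and the seasonal component easy to spot by eye, while the noise shows up as scatter around them.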

Long Short-Term Memory Networks

Now that we've covered the basics of time series forecasting, let's look at the algorithm we'll use for our predictions. While many methods exist, this article focuses on Long Short-Term Memory (LSTM) neural networks. These are a specialized type of recurrent neural network designed to capture long-term dependencies in sequential data, such as a person's heart rate tracked over a week, which makes them particularly valuable for identifying dynamic patterns and seasonal trends over extended periods. The core component of the architecture is the cell state, which carries information about long-term dependencies, while the hidden state holds short-term memory. Each LSTM cell contains three gates: the forget gate decides which information to discard from the cell state, while the input and output gates manage the flow of information into and out of the cell. As a result, LSTMs excel at retaining information across long sequences, which is essential for time series data where past observations shape future predictions, and they also address the vanishing gradient problem that traditional RNNs face.
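As a rough illustration of these gates, a single LSTM step can be sketched in NumPy; this is a conceptual sketch, not the Keras implementation used later, and the stacked weight matrices W, U and bias b are hypothetical parameters:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # W, U, b stack the parameters for the forget, input, output, and candidate paths
    z = W @ x_t + U @ h_prev + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)  # forget, input, and output gates
    c_t = f * c_prev + i * np.tanh(g)             # cell state: long-term memory
    h_t = o * np.tanh(c_t)                        # hidden state: short-term memory
    return h_t, c_t

The additive update of the cell state (f * c_prev + i * tanh(g)) is what lets information and gradients flow across long sequences, which is how LSTMs sidestep the vanishing gradient problem.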

Dataset and Implementation

To put this into practice, we will use a dataset sourced from Kaggle. It contains seven columns: the first three hold the year, month, and day, and the remaining ones hold specific humidity, relative humidity, temperature, and precipitation. Our objective is to forecast the temperature trend (the same pipeline could be applied to rainfall). I first imported the necessary libraries and displayed statistical details about the data. Data preprocessing follows: I used MinMaxScaler to normalize the series for LSTM compatibility and then transformed it into fixed-length sequences of length time_step, a typical practice when preparing data for LSTM models; each sequence serves as input to predict the subsequent value. The model architecture is bidirectional, allowing information to propagate in both directions, which can uncover more intricate patterns. I added a dense output layer and dropout to mitigate overfitting, selected the Adam optimizer with mean squared error as the loss function, and implemented early stopping to monitor the validation loss and halt training when improvement ceases, further guarding against overfitting.
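Before the full script, here is a tiny, self-contained illustration of the windowing step described above, using a toy array and time_step = 3 (the real script does this with a helper called create_dataset):

import numpy as np

# Toy scaled series: each window of 3 values predicts the next one
data = np.array([[0.1], [0.2], [0.3], [0.4], [0.5]])
time_step = 3
X = np.array([data[i:i + time_step, 0] for i in range(len(data) - time_step)])
y = np.array([data[i + time_step, 0] for i in range(len(data) - time_step)])
print(X)  # rows: [0.1 0.2 0.3] and [0.2 0.3 0.4]
print(y)  # [0.4 0.5]

Each row of X is one input sequence, and the matching entry of y is the value the model is trained to predict.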

Code

# Import necessary libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_squared_error
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout, Bidirectional
from tensorflow.keras.callbacks import EarlyStopping

# Load the dataset (adjust the path to your local copy)
df = pd.read_csv(r"C:\Users\nikos\OneDrive\???????????\Rainfall_data.csv")

# Create a datetime column from year, month, day
df['date'] = pd.to_datetime(df[['Year', 'Month', 'Day']])

# Set the date as the index for the DataFrame
df.set_index('date', inplace=True)

# Data Exploration
print(df.describe())

# Plot the data
def plot_time_series(df, columns, title):
    plt.figure(figsize=(12, 8))
    for column in columns:
        plt.plot(df[column], label=column)
    plt.title(title)
    plt.legend()
    plt.show()

plot_time_series(df, ['Temperature', 'Precipitation'], 'Temperature and Precipitation Over Time')
plot_time_series(df, ['Specific Humidity', 'Relative Humidity'], 'Humidity Levels Over Time')

# Data Preprocessing for LSTM
def create_dataset(data, time_step=1):
    X, y = [], []
    for i in range(len(data) - time_step):
        X.append(data[i:(i + time_step), 0])
        y.append(data[i + time_step, 0])
    return np.array(X), np.array(y)

# Normalize the data
scaler = MinMaxScaler(feature_range=(0, 1))
scaled_data = scaler.fit_transform(df[['Temperature']])

# Prepare the data for LSTM
time_step = 30  # Adjusted time step
X, y = create_dataset(scaled_data, time_step)
X = X.reshape((X.shape[0], X.shape[1], 1))  # Reshape for LSTM

# Split the data into train and test sets
train_size = int(len(X) * 0.8)
X_train, X_test = X[:train_size], X[train_size:]
y_train, y_test = y[:train_size], y[train_size:]

# Build the LSTM model
model = Sequential()
model.add(Bidirectional(LSTM(200, return_sequences=True, input_shape=(time_step, 1))))
model.add(Dropout(0.2))
model.add(LSTM(100))
model.add(Dropout(0.2))
model.add(Dense(1))

model.compile(optimizer='adam', loss='mean_squared_error')

# Early Stopping
early_stopping = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)

# Train the model with increased epochs
history = model.fit(X_train, y_train, epochs=150, batch_size=32, validation_split=0.2, callbacks=[early_stopping], verbose=1)

# Make predictions
y_pred = model.predict(X_test)

# Rescale predictions and actual values back to original scale
y_test_rescaled = scaler.inverse_transform(y_test.reshape(-1, 1))
y_pred_rescaled = scaler.inverse_transform(y_pred)

# Plot actual vs forecasted
plt.figure(figsize=(10, 6))
plt.plot(df.index[-len(y_test):], y_test_rescaled, label='Actual')
plt.plot(df.index[-len(y_test):], y_pred_rescaled, label='Forecast')
plt.title('Temperature Forecast vs Actual (LSTM)')
plt.legend()
plt.show()

# Calculate Mean Squared Error
mse = mean_squared_error(y_test_rescaled, y_pred_rescaled)
print(f'Mean Squared Error: {mse}')
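As an optional extra step (not in the script above), the root mean squared error can be easier to interpret, since it is in the same units as the temperature itself:

# RMSE is in the same units as the temperature
rmse = np.sqrt(mse)
print(f'Root Mean Squared Error: {rmse:.2f}')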

# Additional: Correlation matrix heatmap
plt.figure(figsize=(10, 6))
sns.heatmap(df.corr(), annot=True, cmap='coolwarm', fmt=".2f")
plt.title('Correlation Heatmap')
plt.show()

# Additional forecast: Forecast future values
def forecast_future(model, data, time_step, future_steps):
    forecast = []
    input_data = data[-time_step:].reshape((1, time_step, 1))
    for _ in range(future_steps):
        pred = model.predict(input_data)
        forecast.append(pred[0, 0])
        input_data = np.roll(input_data, shift=-1, axis=1)  # Shift the input data to the left
        input_data[0, -1, 0] = pred[0, 0]  # Add the prediction to the end of the sequence
    return np.array(forecast)

future_steps = 30
future_forecast_scaled = forecast_future(model, scaled_data, time_step, future_steps)
future_forecast = scaler.inverse_transform(future_forecast_scaled.reshape(-1, 1))

# Plot the forecasted future temperature
plt.figure(figsize=(10, 6))
plt.plot(pd.date_range(start=df.index[-1], periods=future_steps + 1, freq='D')[1:], future_forecast, label='Future Forecast')
plt.title('Future Temperature Forecast (LSTM)')
plt.legend()
plt.tight_layout()
plt.show()

Results

In this concluding section, I present the outcomes of this small study. Let's begin by visualizing the data:

As anticipated, the raw precipitation values are generally larger than the temperature values, which suggests that in a data analytics project separate plots for each variable would improve clarity. Applying the same approach, I examined the specific and relative humidity:

After training the model, I compared the predictions with actual values in the following figure:

The orange line represents the neural network's predictions, while the other line shows the actual data. Given the inherent difficulty of weather forecasting, the model occasionally misses some peaks, yet it generally performs well. To further substantiate the findings, I included the dataset's correlation matrix:

Finally, I visualized the future weather forecast produced by the model:

I hope this detailed exploration of LSTM networks and their application in weather forecasting has been insightful. Please feel free to reach out with any questions, and I appreciate your support!

