The story of global warming is not new. Scientists have studied the connection between greenhouse gases and Earth’s temperature for over a century. The earliest recognition dates to 1896, when the Swedish scientist Svante Arrhenius suggested that burning fossil fuels could increase carbon dioxide in the atmosphere and raise Earth’s surface temperature. His theory was ahead of its time, but it laid the foundation for modern climate science.
In the mid-20th century, as industrialization expanded, scientists began collecting more precise measurements. Charles David Keeling, in 1958, started recording atmospheric CO₂ levels at Mauna Loa Observatory in Hawaii. This iconic “Keeling Curve” revealed a steady rise in CO₂ levels year after year, correlating strongly with fossil fuel use.
The 1970s and 1980s saw increasing awareness. Reports from NASA and the U.S. National Academy of Sciences warned of the warming potential of rising CO₂ concentrations. In 1988, the Intergovernmental Panel on Climate Change (IPCC) was formed, bringing together thousands of scientists worldwide to assess and publish findings on climate change. That same year, NASA scientist James Hansen testified before the U.S. Congress, declaring with confidence that global warming was already happening.
Public awareness grew in the 1990s with the Kyoto Protocol, the first international treaty to set binding targets for reducing greenhouse gas emissions. However, political resistance and weak enforcement limited its impact. By the 2000s, with the documentary An Inconvenient Truth featuring Al Gore and visible climate events such as stronger hurricanes and Arctic ice loss, the issue moved further into the mainstream.