The Met Office faces new and growing challenges because of the huge volume of data we manage and produce. Alex Longden, who leads the team responsible for data provision to the data re-use community and the private weather sector, examines the situation in this blog.
The 335 million weather observations we store every day require huge computational capability. Our new supercomputer provides the processing power needed to manipulate this data in a timely and effective way. The complex numerical models developed by our scientists and meteorologists in turn create enormous data outputs, used for climate and weather prediction and by data users throughout the world to make weather outlooks more accurate than ever before.
The state of weather data infrastructure
The Met Office recognises that increases in observational and forecast data volumes have implications across both the public and private weather sectors. To understand this better, we recently partnered with the Open Data Institute to carry out a review on ‘The state of weather data infrastructure’.
The review encourages discussion on how the global weather data infrastructure can remain sustainable and continue to deliver value to society, and it examines the need for continued investment in technical infrastructure and supercomputing resources. It also looks at the role of global, regional and national meteorological services in collecting observations and generating forecasts.
In addition, the review highlights the technologies creating new data and, in turn, generating new big-data challenges. Supercomputing is enabling new and improved weather models that harness a variety of weather observations from ground-, air-, sea- and space-based monitoring and sensors. These trends exist within a wider landscape of innovation and changing consumer expectations, where instant, real-time access to data is increasingly essential.
We are striving to make our data more openly accessible and useful in order to realise the socio-economic benefits brought by the new supercomputer. Yet while the supercomputer delivers ever-increasing accuracy, the exponential growth in data volumes makes this goal more challenging.