The Met Office is facing new and increasing challenges because of the huge volume of data that we manage and produce. Alex Longden, who leads the team responsible for data provision for the data re-use community and private weather sectors, examines the situation in this blog.
The 335 million weather observations we store every day require huge computational capability. Our new supercomputer provides the processing power needed to manipulate these data in a timely and effective way. The complex numerical models developed by our scientists and meteorologists in turn produce enormous data outputs, used for climate and weather prediction and by data users around the world to make weather outlooks more accurate than ever before.
The state of weather data infrastructure
The Met Office recognises that increases in observational and forecast data volumes have implications across both the public and private weather sectors. To understand this better, we recently partnered with the Open Data Institute to carry out a review on ‘The state of weather data infrastructure’.
The review is encouraging discussion on how the global weather data infrastructure can be made sustainable and continue to deliver value to society, as well as looking at the need for continued investment in technical infrastructure and supercomputing resources. It also examines the role of global, regional and national meteorological services in collecting observations and generating forecasts.
In addition, the review highlights the technology creating new data and, in turn, generating new big-data challenges. Supercomputing is enabling new and improved weather models that harness a variety of weather observations from ground-, air-, sea- and space-based monitoring and sensors. These trends sit within a wider landscape of innovation and changing consumer expectations, where instant, real-time access to data is increasingly essential.
We are striving to make our data more openly accessible and useful in order to realise the socio-economic benefits brought by the new supercomputer. Yet the same machine that delivers ever-increasing accuracy also delivers exponential growth in data volumes, which makes this more challenging.
I look forward to the Met Office sharing their NWP data, which I assume is what you’re referring to when you say “We are striving towards making our data more openly accessible”, because until now the Met Office has shared very little of it.
The NWP data most widely used on the internet, in my experience, is from the American GFS model, which is very good. Why don’t the Met Office have a policy of freeing up their NWP data like the Americans? You could limit the area of the data you release to just Europe, or the immediate UK, and thin the grid to perhaps 0.5° × 0.5°, which would ensure that file sizes never became prohibitive to download from some kind of simple FTP server.
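To give a sense of what the suggested windowing and thinning would involve, here is a minimal sketch in Python with NumPy. The grid layout, the European bounding box and the `subset_and_thin` helper are all hypothetical illustrations, not any real Met Office or GFS interface; the idea is simply that cutting a global 0.25° field down to Europe and keeping every second grid point shrinks the array dramatically.

```python
import numpy as np

# Hypothetical global forecast field on a regular 0.25° grid:
# latitudes +90..-90, longitudes -180..+179.75 (illustrative only).
lats = np.arange(90.0, -90.25, -0.25)      # 721 latitude points
lons = np.arange(-180.0, 180.0, 0.25)      # 1440 longitude points
field = np.random.default_rng(0).random((lats.size, lons.size))

def subset_and_thin(field, lats, lons,
                    lat_min=34.0, lat_max=72.0,
                    lon_min=-27.0, lon_max=45.0, step=2):
    """Keep only a European window, then every `step`-th grid
    point in each direction (0.25° -> 0.5° when step == 2)."""
    lat_idx = np.where((lats >= lat_min) & (lats <= lat_max))[0][::step]
    lon_idx = np.where((lons >= lon_min) & (lons <= lon_max))[0][::step]
    return field[np.ix_(lat_idx, lon_idx)], lats[lat_idx], lons[lon_idx]

small, sub_lats, sub_lons = subset_and_thin(field, lats, lons)
print(field.shape, '->', small.shape)  # (721, 1440) -> (77, 145)
```

With these (made-up) bounds the array goes from 721 × 1440 points to 77 × 145, roughly a hundredfold reduction, which is the kind of saving that would keep per-file downloads manageable over plain FTP or HTTP.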
Or then again, the Met Office could write their own NWP web viewer or mobile app so that people could view the data in a similar way to what Ventusky have done with GFS data.
Either way you had better get a move on, because at the moment you are not doing either.
Over a fortnight and still pondering ‘do we let it through or what’ – maybe he even has a point? Nah, keep it in moderation and keep him waiting, after all we hold the monopoly not only on all climate data but who can comment on our blog…
Hi Alex, was good to meet you yesterday. Thanks for sparing some of your time and going through some of the above with me.
I enjoyed reading this, thank you for including the link to “Big Data Challenges”.