Chapter 4: Opportunities and enablers of change
4.1 Changes in technology and data sharing
Knowing where people and things are, and how they relate to each other, is essential for informed decision-making. Real-time information helps communities prepare for and respond to disasters. Location-based services are helping governments to develop strategic priorities, make decisions, and measure and monitor outcomes.
As identified by the Global Facility for Disaster Reduction and Recovery (GFDRR), for communities and governments to build resilience to hazards, they must have access to information about disaster risk that is understandable and actionable. Advances in science, technology and innovation can further the understanding of disaster risk and help achieve this goal, especially when a wide variety of stakeholders across the public, private, academic and NGO sectors form partnerships and work together.
Technology has improved rapidly since the publication of GAR15. This, coupled with an increased awareness of and willingness to share data, information and data-processing capabilities, has enabled a greater understanding of global change and an improved ability to forecast how natural systems will respond to human activity and political decisions.
Ongoing efforts to engage the science and technology community in developing, implementing and providing data and services to the risk management community are being strengthened. This ensures that the DRR community benefits from the best possible scientific and technological advances and advice. One of the greatest areas of technological enhancement has been the availability of, and access to, computational processing power. This can be seen in the greater availability of supercomputers and virtual servers, which have increased the overall availability of cloud-based computing capabilities for hazard modelling. In turn, the data available has also improved. For example, the Copernicus Sentinel satellites of the European Space Agency (ESA) mark a significant improvement in globally available, open, high-resolution satellite imagery.
4.1.1 Hazard knowledge
Data collected on Earth systems (climate, oceans, land and weather), as well as on societal systems (population location, density and vulnerability), is a fundamental input to many of the calculations that permit a better understanding of the nature and drivers of risk.
The science and technology community has an essential role in continually advancing the understanding of hazards, exposure and vulnerability, and in applying that understanding to reduce risks to people, infrastructure and society. Satellites have a unique vantage point for monitoring many kinds of large-scale processes: forest fires, overflowing rivers and earthquake-prone zones, as well as patterns of human settlement, herd migration trends and the degradation of coral reefs. Remotely sensed data can be provided in near real time, including maps, optical images and radar images that accurately measure the affected areas.
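As a small illustration of how such data becomes actionable, the sketch below thresholds a radar backscatter image to estimate flood extent. The file name and the -18 dB threshold are assumptions invented for the example; an operational workflow would first calibrate, filter and validate the imagery.

```python
import rasterio

# A minimal sketch of turning remotely sensed radar data into an actionable
# flood-extent estimate. File name and threshold are illustrative assumptions.
with rasterio.open("sentinel1_backscatter_db.tif") as src:  # hypothetical file
    backscatter = src.read(1)                    # first band, in decibels
    pixel_area_m2 = abs(src.transform.a * src.transform.e)

# Calm open water reflects radar energy away from the sensor, so unusually
# low backscatter is a common proxy for flooded pixels.
flooded = backscatter < -18.0
print(f"Estimated flooded area: {flooded.sum() * pixel_area_m2 / 1e6:.1f} km^2")
```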
4.1.2 Open data
Open data can have many different interpretations and meanings. Here, open data is described as "data that can be freely used, re-used and redistributed by anyone - subject only, at most, to the requirements to attribute and share alike."
Open data policies have been shown to act as an economic force multiplier for nations: value is created many times over, and investments yield greater returns through increased tax revenues on the products and services created with the data. Open data also meets society's need for ethical principles governing access to and use of public data. Within the research and innovation sectors, open data can facilitate interdisciplinary, inter-institutional and international research. It also enables data mining for automated knowledge discovery among the growing amount of big data available to researchers and policymakers. Finally, open public data supports improved decision-making and enhances transparency in government and society.
An open science approach, complementing open data principles, is often followed by research and academic institutions. It works on the basis that data should be as open as possible and as closed as necessary. The findable, accessible, interoperable and reusable (FAIR) data principles are also a core facet of open and exchangeable knowledge; a sketch of what a FAIR-style metadata record can look like in practice follows Figure 4.1.
Figure 4.1. FAIR data is findable, accessible, interoperable and reusable
(Source: Wilkinson et al. 2016: https://www.nature.com/articles/sdata201618)
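To make these principles concrete, the following is a minimal, illustrative metadata record for a hypothetical flood hazard data set, expressed in Python for readability. The field names are assumptions loosely modelled on common catalogue conventions (such as DataCite and DCAT); the FAIR principles themselves do not prescribe a schema.

```python
# Illustrative FAIR-style metadata record; all values are hypothetical.
record = {
    # Findable: a globally unique, persistent identifier plus rich metadata.
    "identifier": "doi:10.5281/zenodo.0000000",   # hypothetical DOI
    "title": "Gridded flood hazard map, 100-year return period",
    "keywords": ["flood", "hazard", "return period"],
    # Accessible: retrievable by identifier over a standard, open protocol.
    "access_url": "https://example.org/datasets/flood-hazard-100y.tif",
    "protocol": "HTTPS",
    # Interoperable: open formats, shared vocabularies, explicit units.
    "format": "GeoTIFF",
    "crs": "EPSG:4326",
    "units": "metres (inundation depth)",
    # Reusable: a clear licence and provenance trail.
    "license": "CC-BY-4.0",
    "provenance": "Derived from Sentinel-1 radar imagery, 2019",
}
```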
4.1.3 Open source software
Open source software can be described as software whose source code is made available at no cost, for use by anyone and for any purpose. Its opposite is proprietary software, for which a user normally must pay and must abide by restrictions on its use and distribution.
Open source software was rare 10 years ago, but it is now commonplace. Perhaps the greatest benefit of open source tools is their flexibility and their capacity to evolve as more people use and adapt the software for their specific needs. Shared software also promotes a common understanding of hazards, rooted in the same methodology.
Community-driven open source software is increasingly being used in government organizations, and a growing number of private sector companies focus on providing technical support for open source software. This movement by governments to use open source software has gone a long way towards overcoming barriers to adoption. As with any technology, the total cost of ownership needs careful assessment. While there may be an initial economic benefit from using open source software, it can be expensive to customize and maintain, as this depends on the community developing the software and on the knowledge of the user.
Future-proofing is also a consideration. With open source software, the software itself is less likely to be affected if the company behind its design closes: other developers can simply pick up where the original ones left off, so its sustainability is better ensured. If the underlying information is broadly available and comprehensible, continued interest in and research on the topic are more likely. Such projects emphasize testing and continuous integration: every change to the engine is reviewed by someone else and can include a scientific review and publication, and when a change goes into the system, all tests are re-run. Keeping the whole process visible and transparent means that fixing a bug often also results in improvements to the tests.
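The paragraph above describes review and automated re-testing in general terms; the sketch below shows the kind of regression test such a process relies on. The function and its tests are illustrative inventions, not code from any particular open source hazard engine.

```python
import math

def exceedance_probability(rate_per_year: float, years: float) -> float:
    """P(at least one event within `years` years), assuming a Poisson process."""
    return 1.0 - math.exp(-rate_per_year * years)

def test_bounds():
    # An invariant test: probabilities stay in [0, 1] even for long horizons.
    assert 0.0 <= exceedance_probability(0.01, 50) <= 1.0

def test_known_value():
    # A regression test pinning a known value: 1 - exp(-0.5) is about 0.3935.
    # When a bug is fixed, a test like this is added so it cannot recur.
    assert abs(exceedance_probability(0.01, 50) - 0.39347) < 1e-4
```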
Open software and tools are becoming the software of choice within research institutions. In the early stages, open source often implied a free but primitive version of commercial software. In the last few years, however, open source software has progressed rapidly and often represents the best-in-class version of a scientific modelling tool. With the science rooted in open source tools, more users have access to them, enabling greater contributions and allowing users' knowledge and research to feed back into improved development of the tools themselves.
4.1.4 Interoperability
Interoperability may be defined as "the ability of a computer system or software to work with other systems or products without special effort on the part of the user." The interoperability of data has technical, semantic and legal dimensions. From a technical standpoint, the data needs to have compatible formats and well-defined quantities, so that diverse data sets can be integrated to form new data and products.
From the semantic point of view, one of the main challenges to interoperability lies in the metadata used to describe any given data set. When trying to combine data, the barrier can be as simple as the native language of the data creator differing from that of the data user. Another semantic challenge lies in the naming conventions and descriptive terms used in different disciplines (or even subdisciplines). These issues of nomenclature are very important, especially for identifying and measuring risks and hazards.
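As a small illustration of both barriers, the sketch below merges two hypothetical rainfall records that use different column names (a semantic barrier) and different units (a technical one) by first mapping them onto a shared schema. All names and values are invented for the example.

```python
import pandas as pd

# Two hypothetical agencies publish the same quantity under different
# column names and in different units.
agency_a = pd.DataFrame({
    "station": ["S1", "S2"],
    "precip_mm": [12.0, 30.5],          # millimetres
})
agency_b = pd.DataFrame({
    "site_id": ["S3", "S4"],
    "rainfall_in": [0.4, 1.2],          # inches
})

# Map both sources onto one shared schema: common names, common units.
a = agency_a.rename(columns={"station": "station_id",
                             "precip_mm": "precipitation_mm"})
b = agency_b.rename(columns={"site_id": "station_id"})
b["precipitation_mm"] = b.pop("rainfall_in") * 25.4   # inches -> millimetres

merged = pd.concat([a, b], ignore_index=True)
print(merged)
```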
Legal interoperability is achieved when multiple data sets from different sources can be merged, and users are able to access and use each data set without having to seek explicit authorization from each creator of the data.
It is not only the interoperability of data and systems that is important for disaster risk management. DRR is inherently interdisciplinary, and this is reflected in the discussions around cascading risks and hazards. Researchers and professionals often work in silos within their own disciplines. Improving the availability of knowledge and data can encourage practitioners to think about the wider implications of risk-informed decisions.
4.1.5 Data science
The ability to create data still outpaces the ability to solve complex problems using it. There is no doubt that a huge amount of value has yet to be extracted from the information contained in the data generated. The growth in the amount of data collected brings with it a growing requirement to find the right information at the right time, along with challenges in storing, maintaining and using the data collected.
The concept of using computer science and computational processing in science and technology is not new. For nearly two decades, practices and processes in data science have been evolving. What is becoming more mainstream is the shift to a context in which there is no longer a reliance on costly supercomputers to host and process data. The growth of cloud computing, which uses a distributed network of machines so that processes can run in parallel, is lowering the cost of entry for many users. As a result, there is now greater uptake and use of cloud computing for risk management. Coupling this with developments in machine learning and artificial intelligence allows greater interaction across disparate data sets and enables more granular modelling of the drivers of risk.
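The following sketch illustrates that parallel pattern with a local process pool standing in for cloud worker nodes. The tile-based hazard computation is a placeholder invented for the example.

```python
from concurrent.futures import ProcessPoolExecutor

def run_tile(tile_id: int) -> tuple[int, float]:
    # Placeholder for a hazard-model run over one spatial tile; in a cloud
    # deployment, each call could execute on a separate worker node.
    loss_estimate = tile_id * 0.1   # illustrative result, not a real model
    return tile_id, loss_estimate

if __name__ == "__main__":
    tiles = range(100)
    # Independent tiles make the work "embarrassingly parallel": the same
    # code scales from a laptop to a cluster by swapping the executor.
    with ProcessPoolExecutor(max_workers=8) as pool:
        results = dict(pool.map(run_tile, tiles))
    print(f"Processed {len(results)} tiles")
```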
The cloud computing model is becoming the prevailing mode of work for most medium- and large-scale global data sets, including Earth observation (EO) applications. This is due to the ability of cloud services to archive large satellite-generated data sets and provide the computing facilities to process them.
As cloud computing services are being more widely used, the technology is maturing rapidly. Taking the example of EO analysis as a use case, there are many different platforms and applications available for the risk community to use. These include the Open Data Cube, Copernicus Data and Information Access Services, Earth on Amazon Web Services, Google Earth Engine, the JRC Earth Observation Data and Processing Platform, NASA Earth Exchange, and the European Centre for Medium-Range Weather Forecasts Climate Data Store.
Each of these cloud computing services has different benefits, ranging from the way the data is ingested (some include pre-loaded data, which reduces the effort required of the user) to the scripting language used for processing. One of the main disadvantages of cloud services is their lack of interoperability, so users face a trade-off between flexibility and ease of use. For example, Amazon Web Services is flexible, but it requires users to be capable of developing applications from basic content libraries; this flexibility comes at the cost of a steep learning curve. By contrast, Google Earth Engine provides immediate access to functions and data, reducing the barrier to entry.
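To give a sense of that lower barrier, the sketch below builds a cloud-free Sentinel-2 composite with the Google Earth Engine Python API, assuming an authenticated account. The dataset identifier and cloud-cover property are taken from the public Earth Engine catalogue; the location and dates are illustrative.

```python
import ee

ee.Initialize()  # assumes ee.Authenticate() has been run for this account

aoi = ee.Geometry.Point([36.82, -1.29])  # illustrative location (Nairobi)

# Median composite of low-cloud Sentinel-2 surface-reflectance scenes.
# All filtering and compositing run server-side; nothing is downloaded.
composite = (
    ee.ImageCollection("COPERNICUS/S2_SR")
    .filterBounds(aoi)
    .filterDate("2019-01-01", "2019-03-31")
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 10))
    .median()
)

# Normalized difference water index from the green (B3) and near-infrared
# (B8) bands, a common first step in mapping surface water and flood extent.
ndwi = composite.normalizedDifference(["B3", "B8"])
```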
4.2 Conclusions
It is clear from recent developments that open data and analysis, shared and interoperable software, computing power and other technologies are the technical enablers of improved data science, risk assessment and risk modelling. Their success also depends on the willingness of people to work with other disciplines, across cultural, language and political boundaries, and to create the right regulatory environment for new and urgent work to proceed.