Disaster management research in Queensland should be:
- responsive, by aligning to state and federal strategic directions and reflecting sector/practitioner-identified issues and opportunities
- collaborative, by promoting links between researchers, policy makers and practitioners to:
  - frame the problems to be tackled and the questions that need to be answered
  - undertake the research and ensure methodologies are appropriate for the questions being asked
  - interpret and share research to support continual improvement and build knowledge
- accessible, practical and actionable by practitioners
- accountable, through the use of contestable, ethical and responsible processes.
The Office of IGEM plays a crucial role in enabling a sector-wide, collaborative approach to research across all elements of disaster management.
The Research and Continuous Improvement Cycle (RCIC) is an ongoing process to support and promote collaboration and the development, translation and use of research by the disaster management sector. The RCIC is based on four stages.

Both the Queensland Government and Federal Cabinet Implementation Unit have a range of resources and templates designed to assist agencies with the process of planning, managing, implementing and evaluating projects. These can also be used to help scope and plan research questions and outcomes.
The first stage of planning is to identify the problem and questions around which a research project could be developed.
A Strategic Assessment of Service Requirement facilitates the strategic business decision of whether a project (or research) response is required to address an identified service need.
This can involve the following activities:
- Describing the service need or opportunity to be addressed (including its context, background, and the nature of the service gap and need) - why is this research important? How will this add to the knowledge in this area? What are the research questions?
- A basic review of existing research around the problem/question. There are many online resources to assist with reviewing existing literature and research:
- QUT Cite Write
- UQ Reviewing Literature - A Short Guide for Research Students
- Explaining how addressing the service need or opportunity is congruent with, and will contribute to, the agency's mission and vision and government priorities
- Defining the outcome sought/targeted, e.g. to influence change to a policy, plan or program; modify behaviour/action; or encourage investment. Why is this outcome being sought, and who is currently affected? Where within the agency does this outcome belong, who are the owners, and who is impacted (positively as well as negatively)? Outcome Mapping can assist with this process.
- Identifying and reporting any business or political sensitivities associated with doing the research and/or implementing change.
When developing a proposal for research, also consider:
- The capabilities and capacities required for the project: do these exist within the agency, or would external subject-matter experts be required?
- How and when to collaborate - with other agency units, external agencies, regional partners, existing networks and the tertiary sector. Consider co-investigator roles with academics, where agency representatives participate in the research process as investigators. Is there an opportunity to use students for subject-based projects or internships? The Inspector-General Emergency Management can assist with developing collaborations between government and the tertiary sector.
- How funding will be sought/obtained - internal agency budgets, external grants? Can the agency provide in-kind contributions (data, expertise, facilitation of meetings and seminars etc.) as well as cash? Values should be calculated based on the most likely actual cost, e.g. current market or internal provider rates/valuations/rentals/charges for labour, work spaces, equipment and databases.8 There are online resources available to help calculate in-kind contributions - In-Kind Contribution Guide
- What governance framework is required? How will updates be provided and to whom? What are the approval processes (internal agency and external)?
- Is ethics clearance required and what are the processes?
- To whom does the research belong and what about intellectual property rights? Does legal advice need to be sought for contract development?
- How will the outcomes and outputs be evaluated? What are the measures of success?
Once completed, the research proposal/strategic assessment of need documentation can be submitted through internal review and approval processes. If successful, a full business case and project plan will be required. Templates and guides to assist the development of a full research project plan/business case can be found at: OnQ Project Management Framework
In addition to the information provided in the initial proposal, a detailed research project plan should also consider/include:
- What's in and out of scope - project boundaries
- Any links to the Standard for Disaster Management and Accountabilities?
- Methods and a data collection plan (including a research frame) for example - Chapter 3 The Lessons Management System
- A detailed budget and timeline
- Risk identification and management
- The development of any contracts and Memoranda of Understanding
- A stakeholder engagement plan
- A communications plan, including how, where and when will results be published/shared
- A monitoring and evaluation plan
Finding the literature
The definition of a research question will include the identification of key words and topics that can be used to find relevant literature and existing studies. There are a number of sector-accessible database/subscription services that can be used to obtain research publications and literature.9 In the disaster management sector, research is not the only evidence that can be used to support decision making: other key sources of data/evidence include results from scheduled reviews, debriefings, post-incident reviews, audits, and external reviews and inquiries. The Australian Institute for Disaster Resilience Knowledge Hub provides a dashboard with links to industry-specific resources. Google Scholar provides access to a broader range of academic publications. Industry associations such as the Australasian Fire and Emergency Service Authorities Council (AFAC) and Natural Hazards Research Australia (NHRA) also hold a mix of technical and academic resources, accessible via free online subscriptions.
Key points to consider when reviewing existing work/literature/research/evidence include:
- What were the conclusions, including any implications for implementation?
- How was it carried out/data collected and analysed?
- What are the gaps, next steps? (this will help situate the current question/problem)
Reference management programs, such as EndNote, provide systematic storage of relevant literature/research/articles, as well as assisting with referencing, citation and bibliography generation during report writing. Many of these programs also have online/cloud storage options that can be used without additional software installations.
Designing the Research Framework
A research design outlines the 'framework in which research is undertaken', including data collection and analysis. Research can be qualitative, quantitative or a combination of both, also known as a mixed methods approach. While there is a range of theory underpinning research approaches, the United Kingdom Department for International Development notes that it is “more important - and easier - to understand the assumptions that underpin these ways of doing research”,10 with three broad groups of research design:
- Some research designs are better suited to demonstrating the presence of a causal relationship, such as experimental and quasi-experimental designs
- Others are more appropriate for explaining such causal relationships
- Some designs are more useful for describing political, social and environmental contexts, such as observational studies.
Collecting the data
There are multiple ways of collecting data. Quantitative (numerical) data collection techniques include:
- Observing and recording well-defined events (e.g. counting the number of Emergency Alert campaigns)
- Surveys with closed-ended questions (e.g. face-to-face and telephone interviews, questionnaires etc.)
- Factual surveys: used to collect descriptive information e.g. census
- Attitude surveys - e.g. an opinion poll that measures responses on a scale of 1 to 5 etc.
The Queensland Government Statistician's Office (QGSO) provides a range of data sets for Queensland, as well as guidance on survey/question design, collection and analysis. The QGSO can also provide statistical advisory services for agencies collecting and analysing quantitative information.
A range of government data sets are now accessible online via http://data.gov.au/. In Queensland, each agency has a strategy to support data sharing; these can be found online at https://data.qld.gov.au/article/department-strategies
Data sharing allows maximum use of data for statistical purposes, thus enhancing the decision-making capability of governments and communities. It is an important ingredient for supporting evidence-based policy and decision-making.
Qualitative data is represented in a verbal or narrative format. Qualitative data collection methods include:
- in-depth interviews and surveys with open ended questions
- observations
- event debriefs
- social, print and electronic media
- documentation review and data mining
- self-reporting
- focus groups and facilitated discussions
- action research.
It is important to note that data collected during research projects must be appropriately stored for specified periods of time (dependent on the type and source). The Queensland Government has a range of guidance on records/information management, storage and retention. If the project is being undertaken in collaboration with a university, there will also be institution-specific guidelines for data storage and privacy.
Once data collection techniques have been selected, it is necessary to establish how much data needs to be collected and how often. Sampling techniques provide frames that identify how collection should occur and give a level of confidence that results can be generalised back to a population. Sampling techniques can be broadly grouped as either:
- Probability sampling - when each member of the population has a known probability of being selected
- Non-probability sampling - when the probability of selection is not known, as in convenience or voluntary response surveys
The National Statistical Service provides online tools and resources that can assist with sampling methods, including a random sampling calculator.
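As a minimal illustration of the probability sampling described above, the calculation behind a random sampling calculator can be sketched in Python. The population, confidence level and margin of error here are hypothetical, and the formula used is Cochran's standard sample-size formula with a finite-population correction:

```python
import math
import random

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample-size formula with finite-population correction.

    z=1.96 corresponds to a 95% confidence level; p=0.5 is the most
    conservative assumption about response variability.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical sampling frame: 10,000 registered volunteers
frame = [f"volunteer_{i}" for i in range(10_000)]

n = sample_size(len(frame))        # 370 for 95% confidence, ±5% margin
sample = random.sample(frame, n)   # simple random sample, without replacement
```

Because `random.sample` draws without replacement and every member of the frame is equally likely to be chosen, this is a probability sample in the sense used above.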
Analysing the data
Analysis involves techniques and processes that assist with identifying and interpreting the information held within quantitative and qualitative data sets.
Quantitative data
There are two basic types of quantitative data analysis:
- Summary measures or descriptive statistics, for example frequency tables
- Variance measures or inferential statistics that examine differences between responses, ranges of outcomes and relationships
There is a range of online guides and resources that can support quantitative data analysis, including:
- Statsoft textbook
- Online statistics
- Basic statistics
- Introduction to statistical analysis
- Microsoft Excel includes a basic package (the Analysis ToolPak) for analysing quantitative statistics
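The two types of quantitative analysis above can also be illustrated with Python's standard library. The attitude-survey responses here are hypothetical; the first block produces summary (descriptive) measures, and the confidence interval at the end is a simple inferential measure:

```python
import math
import statistics
from collections import Counter

# Hypothetical attitude-survey responses on a 1-5 scale
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 1, 4]

# Summary measures (descriptive statistics)
freq_table = Counter(responses)        # frequency table, e.g. how many 4s
mean = statistics.mean(responses)
median = statistics.median(responses)
stdev = statistics.stdev(responses)    # sample standard deviation

# A simple inferential measure: approximate 95% confidence interval
# for the mean, using the normal approximation
half_width = 1.96 * stdev / math.sqrt(len(responses))
ci = (mean - half_width, mean + half_width)
```

The descriptive measures summarise the responses collected, while the confidence interval expresses how far the sample mean is likely to sit from the population mean.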
Qualitative data
There is a range of techniques available for analysing qualitative data:
- Content analysis - content analysis involves coding and classifying data, also referred to as categorising and indexing. The aim of content analysis is to make sense of the data collected and to highlight the important messages, features or findings.
- Grounded analysis - grounded analysis does not start from a defined point. Instead, themes emerge from discussions and conversations
- Social network analysis - this form of analysis examines the links between individuals as a way of understanding what motivates behaviours
- Discourse analysis - this approach not only analyses conversation, but also takes into account the social context in which the conversation occurs
- Narrative analysis - this looks at the way in which stories are told within an organisation or society to try to understand more about the way in which people think and are organised within groups.
- Conversation analysis - conversation analysis requires a detailed examination of the data, including exactly which words are used, in what order, whether speakers overlap their speech, and where the emphasis is placed. There are therefore detailed conventions used in transcribing for conversation analysis.
Online resources to assist with qualitative analysis include:
- http://www.leaptraining.leeds.ac.uk/research-methods-resources/
- https://www.antioch.edu/new-england/degrees-programs/psychology-degree/clinical-psychology-psyd/qr/
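A minimal sketch of the coding step in content analysis, the first technique listed above, can be given in Python. The categories, keyword lists and responses here are hypothetical; in practice, coding schemes are developed from the data itself and checked for inter-coder reliability:

```python
from collections import Counter

# Hypothetical coding scheme: category -> indicative keywords
codes = {
    "warnings": ["alert", "warning", "siren"],
    "evacuation": ["evacuate", "shelter", "relocate"],
    "communication": ["radio", "social media", "doorknock"],
}

def code_response(text, scheme):
    """Return the set of categories whose keywords appear in the text."""
    text = text.lower()
    return {cat for cat, kws in scheme.items() if any(kw in text for kw in kws)}

# Hypothetical open-ended survey responses
survey_responses = [
    "We heard the siren and decided to evacuate early.",
    "Updates came through on the radio and social media.",
    "No warning reached us until the doorknock.",
]

# Tally how often each category is coded across all responses
tallies = Counter(cat for r in survey_responses for cat in code_response(r, codes))
```

Counting coded categories in this way highlights the features and messages that recur across responses, which is the aim of content analysis described above.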
The QGSO also provides a guide for writing reports for quantitative data analysis and generating data tables and graphs. More general research report templates and writing guidance can be found at: https://www.dlsweb.rmit.edu.au/lsu/content/2_assessmenttasks/assess_pdf/research_report.pdf
The next step in the cycle is to contextualise and apply the research findings. Translating research and embedding it into practice involves two parts:
- Presenting the results and key findings
- Providing additional information that will demonstrate where/how changes to existing practice/behaviour may occur - placing the research results/key findings into the local/ agency context.
Research can be embedded into the policy/decision making cycle through conversations, networks, research synthesis (key messages in plain language that clearly link to protocols and procedures), incentives linked to milestones in contracts and education/training.
There are many frameworks that can support the translation of research into practice, with common elements including:
- Contextualising/localising the outcomes/outputs to policies, procedures, programs, agency objectives etc.
- Identifying where, and how, they can impact policy/program/plan development, processes and delivery, e.g. links to the Key Outcomes of the Standard for Disaster Management
- Identification of potential barriers to implementation and, where possible, including these in the development of solutions
The Knowledge-to-Action (KTA) Framework (CEBI), developed within the health sector, consists of two cycles:
- Knowledge creation - knowledge synthesis, development of tools/products/outcomes
- Action - adapting the knowledge to the local context, assessing barriers to use, developing implementation initiatives, monitoring the use of knowledge, evaluating outcomes and sustaining knowledge use.
The KTA framework is underpinned by “sustained interactivity” between researchers, policy makers and practitioners, to support ongoing exchange, provide opportunities for personal two-way communication, and facilitate partnership approaches to research-policy initiatives.
Evaluation and validation are critical processes that ensure the effectiveness, efficiency and accountability of programs, policies and interventions. They assess whether a program is achieving its intended outcomes and making a positive impact, as highlighted by outcome evaluations. By identifying strengths and areas for improvement, evaluation fosters continuous learning and enhancement, supporting better performance. It provides evidence-based insights that inform the design and implementation of future programs, ensuring they are grounded in solid evidence.
Evaluation ensures accountability and transparency by demonstrating how resources are used and what outcomes are achieved. Building evaluation capacity is also crucial, as it equips organisations with the skills and knowledge needed for effective evaluation.
The online resources below can assist with the evaluation and validation process:
- Australian Centre for Evaluation (ACE)
- Queensland Government: Program Evaluation Guidelines, Second Edition 2020
- Australian Government, The Treasury: Templates, tools and resources
- Global Evaluation Initiative: BetterEvaluation platform
- Eval Community: Understanding outcome evaluation: Definition, benefits and best practices
- World Bank Group: Impact Evaluation in Practice - Second Edition
- IDinsight Impact Measurement Guide
- Australian Institute of Family Studies: Process evaluation