Search results for: law enforcement agency
134 An Exploration of Policy-related Documents on District Heating and Cooling in Flanders: A Slow and Bottom-up Process
Authors: Isaura Bonneux
Abstract:
District heating and cooling (DHC) is increasingly recognized as a viable path towards sustainable heating and cooling. While some countries, like Sweden and Denmark, have a longstanding tradition of DHC, Belgium is lagging behind. The northern part of Belgium, Flanders, had a total of only 95 heating networks in July 2023. Nevertheless, it is increasingly exploring possibilities to expand the scope of DHC. DHC is a complex energy system, requiring extensive collaboration between various stakeholders at various levels. It is therefore of interest to look closely at policy-related documents at the Flemish (regional) level, as these policies set the scene for DHC development in the Flemish region. This kind of analysis has not been undertaken so far. This paper addresses the following research question: “Who talks about DHC, and in which way and context is DHC discussed in Flemish policy-related documents?” To answer this question, the Overton policy database was used to search for and retrieve relevant policy-related documents. Overton retrieves data from governments, think tanks, NGOs, and IGOs. In total, out of the 244 original results, 117 documents published between 2009 and 2023 were analyzed. Every selected document included theme keywords, policymaking department(s), date, and document type. These elements were used for quantitative data description and visualization. Further, qualitative content analysis revealed patterns and main themes regarding DHC in Flanders. Four main conclusions can be drawn. First, it is obvious from the timeframe that DHC is a new topic in Flanders that still receives limited attention; 2014, 2016, and 2017 were the years with the most documents, yet even then the count was only 12 documents. In addition, many documents mentioned DHC but not in much depth, painting it as a future scenario surrounded by a lot of uncertainty. Most of the issuing government departments had a link either to energy or climate (e.g. 
Flemish Environmental Agency) or to policy (e.g. Socio-Economic Council of Flanders). Second, DHC is mentioned most often within an ‘Environment and Sustainability’ context, followed by ‘General Policy and Regulation’. This is intuitive, as DHC is perceived as a sustainable heating and cooling technique and this analysis comprises policy-related documents. Third, Flanders seems mostly interested in using waste or residual heat as a heat source for DHC. Harbors and waste incineration plants are identified as potential and promising supply sources. This approach tries to reconcile environmental and economic incentives. Last, local councils are assigned a central role, and the initiative is mostly taken by them. The policy documents and policy advice demonstrate that Flanders opts for a bottom-up organization. As DHC is very dependent on local conditions, this seems a logical step. Nevertheless, it can hinder smaller councils in creating DHC networks and slow down the systematic and fast implementation of DHC throughout Flanders.
Keywords: district heating and cooling, flanders, overton database, policy analysis
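The quantitative description step the abstract refers to (tallying documents by year, department, and theme keyword) can be sketched as follows. The records below are hypothetical stand-ins for the Overton metadata, not data from the study:

```python
from collections import Counter

# Hypothetical records in the shape the paper describes: each retrieved
# document carries a year, an issuing department, and theme keywords.
documents = [
    {"year": 2014, "department": "Flemish Environmental Agency",
     "themes": ["Environment and Sustainability"]},
    {"year": 2016, "department": "Socio-Economic Council of Flanders",
     "themes": ["General Policy and Regulation"]},
    {"year": 2014, "department": "Flemish Environmental Agency",
     "themes": ["Environment and Sustainability", "General Policy and Regulation"]},
]

# Documents per year (basis for the timeframe conclusion)
docs_per_year = Counter(d["year"] for d in documents)
# Theme frequencies (basis for the context conclusion)
theme_counts = Counter(t for d in documents for t in d["themes"])
```

The same two counters, applied to the full set of 117 documents, would yield the per-year and per-theme distributions the paper visualizes.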
Procedia PDF Downloads 44
133 Reconceptualising the Voice of Children in Child Protection
Authors: Sharon Jackson, Lynn Kelly
Abstract:
This paper proposes a conceptual review of the interdisciplinary literature that has theorised the concept of ‘children’s voices’. The primary aim is to identify and consider the theoretical relevance of conceptual thought on ‘children’s voices’ for research and practice in child protection contexts. Attending to the ‘voice of the child’ has become a core principle of social work practice in contemporary child protection contexts. Discourses of voice permeate the legislative, policy, and practice frameworks of child protection practice within the UK and internationally. Voice is positioned within a ‘child-centred’ moral imperative to ‘hear the voices’ of children and take their preferences and perspectives into account, a practice now considered central to working in a child-centred way. The genesis of this call to voice is revealed through sociological analysis of twentieth-century child welfare reform as rooted, inter alia, in intersecting political, social, and cultural discourses that have situated children and childhood as sites of state intervention, as enshrined in the 1989 United Nations Convention on the Rights of the Child, ratified by the UK government in 1991, and more specifically in Article 12 of the convention. From a policy and practice perspective, the professional ‘capturing’ of children’s voices has come to saturate child protection practice. This has incited a stream of directives, resources, advisory publications, and ‘how-to’ guides that attempt to articulate practice methods to ‘listen’, ‘hear’, and above all ‘capture’ the ‘voice of the child’. The idiom ‘capturing the voice of the child’ is frequently invoked within the literature to express the requirements of the child-centred practice task to be accomplished. 
Despite the centrality of voice, and an obsession with ‘capturing’ voices, evidence from research, inspection processes, serious case reviews, and child abuse and death inquiries has consistently highlighted professional neglect of ‘the voice of the child’. Notable research studies have highlighted the relative absence of the child’s voice in social work assessment practices, a troubling lack of meaningful engagement with children, and the need to examine communicative practices in child protection contexts more thoroughly. As a consequence, the project of capturing ‘the voice of the child’ has intensified, and there has been an increasing focus on developing methods and professional skills to attend to voice. This has been guided by a recognition that professionals often lack the skills and training to engage with children in age-appropriate ways. We argue, however, that the problem with ‘capturing’ and [re]presenting ‘voice’ in child protection contexts is, more fundamentally, a failure to adequately theorise the concept of ‘voice’ in ‘the voice of the child’. For the most part, ‘the voice of the child’ incorporates psychological conceptions of child development. While these concepts are useful in the context of direct work with children, they fail to consider other strands of sociological thought, which position ‘the voice of the child’ within an agentic paradigm to emphasise the active agency of the child.
Keywords: child-centered, child protection, views of the child, voice of the child
Procedia PDF Downloads 136
132 A Study on Characteristics of Runoff Analysis Methods at the Time of Rainfall in Rural Area, Okinawa Prefecture Part 2: A Case of Kohatu River in South Central Part of Okinawa Pref
Authors: Kazuki Kohama, Hiroko Ono
Abstract:
Rainfall in Japan is gradually increasing every year, according to the Japan Meteorological Agency and the Intergovernmental Panel on Climate Change Fifth Assessment Report. This means that the difference in rainfall between the rainy season and the dry season is increasing. In addition, a clear increasing trend in short-duration heavy rain has appeared. In recent years, natural disasters have caused enormous human injury in various parts of Japan. Regarding water disasters, local heavy rain and floods of large rivers occur frequently, and a policy was adopted to promote both structural and non-structural emergency disaster prevention measures under a water disaster prevention awareness and social reconstruction vision. Okinawa prefecture, in the subtropical region, experiences torrential rain and water disasters such as river floods several times a year; these floods occur in specific rivers among the prefecture's 97 rivers. Rivers in Okinawa are also characteristically narrow and short of capacity, and thus flood easily in heavy rain. This study focuses on the Kohatu River, one of these specific rivers. In fact, its water level rises well over the river levee almost once a year, though usually without damage to the surrounding buildings. In some cases, however, the water level has reached the ground floor height of nearby houses; this has happened nine times to date. The purpose of this research is to figure out the relationship between precipitation, surface outflow, and the total treated water quantity of the Kohatu River. For this purpose, we performed hydrological analysis, which, although complicated and demanding of detailed data, was carried out mainly using Geographic Information System (GIS) software and an outflow analysis system. First, we extracted the watershed and divided it into 23 catchment areas to understand how much surface outflow reaches the runoff point in each 10-minute interval. Second, we created a unit hydrograph indicating the surface outflow over flow area and time. 
This index shows the maximum amount of surface outflow at 2400 to 3000 seconds. Lastly, we compared the value estimated from the unit hydrograph to the measured value. We found that the measured value is usually lower than the estimated value because of losses such as evaporation and transpiration. In this study, hydrograph analysis was performed using GIS software and an outflow analysis system. Based on these, we could clarify the flood time and the amount of surface outflow.
Keywords: disaster prevention, water disaster, river flood, GIS software
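The unit hydrograph estimation described above amounts to convolving effective rainfall with unit hydrograph ordinates. A minimal sketch follows; the rainfall series and ordinates are hypothetical illustrations, not the Kohatu River data:

```python
def runoff_from_unit_hydrograph(rain_excess, uh_ordinates):
    """Estimate direct runoff by convolving effective rainfall
    (one value per 10-minute interval) with unit hydrograph
    ordinates, per classical unit hydrograph theory."""
    n = len(rain_excess) + len(uh_ordinates) - 1
    runoff = [0.0] * n
    for i, p in enumerate(rain_excess):
        for j, u in enumerate(uh_ordinates):
            runoff[i + j] += p * u  # superposition of lagged responses
    return runoff

# Hypothetical 10-minute rainfall excess (mm) and UH ordinates (m^3/s per mm)
rain = [2.0, 5.0, 1.0]
uh = [0.1, 0.4, 0.3, 0.2]
hydrograph = runoff_from_unit_hydrograph(rain, uh)
peak = max(hydrograph)  # peak discharge of the estimated hydrograph
```

The estimate from such a convolution omits evaporation and transpiration, which is consistent with the measured runoff being lower than the estimated value.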
Procedia PDF Downloads 137
131 Alternate Approaches to Quality Measurement: An Exploratory Study in Differentiation of “Quality” Characteristics in Services and Supports
Authors: Caitlin Bailey, Marian Frattarola Saulino, Beth Steinberg
Abstract:
Today, virtually all programs offered to people with intellectual and developmental disabilities tout themselves as person-centered, community-based, and inclusive, yet there is a vast range in the type and quality of services that use these similar descriptors. The issue is exacerbated by the field’s measurement practices around quality, inclusion, independent living, choice, and person-centered outcomes. For instance, community inclusion for people with disabilities is often measured by the number of times a person steps into his or her community. These measurement approaches set the standard for quality so low that agencies supporting group home residents to go bowling every week can report the same outcomes as an agency that supports one person to join a book club that includes people based on their literary interests rather than disability labels. Ultimately, this lack of delineation in measurement contributes to the confusion between face-value “quality” and truly high-quality services and supports for many people with disabilities and their families. This exploratory study adopts alternative approaches to quality measurement, including co-production methods and a systems-theoretical framework, in order to identify the factors that 1) lead to high-quality supports and 2) differentiate high-quality services. The project researchers partnered with community practitioners who are all committed to providing quality services and supports but vary in the degree to which they are actually able to provide them. The study includes two parts: first, an online survey distributed to more than 500 agencies that have demonstrated a commitment to providing high-quality services; and second, four in-depth case studies with agencies, three in the United States and one in Israel, providing a variety of supports to children and adults with disabilities. Results from both the survey and the in-depth case studies were thematically analyzed and coded. 
Results show that there are specific factors that differentiate service quality; however, meaningful quality measurement practices also require that researchers explore the contextual factors that contribute to quality. These include not only direct services and interactions but also the characteristics of service users and their environments, as well as of the organizations providing services, such as management and funding structures, culture, and leadership. Findings from this study challenge researchers, policy makers, and practitioners to examine existing quality service standards and measurements and to adopt alternative methodologies and solutions to differentiate and scale up evidence-based quality practices, so that all people with disabilities have access to services that support them to live, work, and enjoy where and with whom they choose.
Keywords: co-production, inclusion, independent living, quality measurement, quality supports
Procedia PDF Downloads 399
130 The Scenario Analysis of Shale Gas Development in China by Applying Natural Gas Pipeline Optimization Model
Authors: Meng Xu, Alexis K. H. Lau, Ming Xu, Bill Barron, Narges Shahraki
Abstract:
As an emerging unconventional energy source, shale gas has been an economically viable step towards a cleaner energy future in the U.S. China also has shale resources, estimated to be potentially the largest in the world. In addition, China has enormous unmet demand for a clean alternative to substitute for coal. Nonetheless, the geological complexity of China’s shale basins and issues of water scarcity potentially impose serious constraints on shale gas development in China. Further, even if China could replicate the U.S. shale gas boom to a significant degree, it faces the problem of transporting the gas efficiently overland, given the limited throughput capacity and coverage of its pipeline network. The aim of this study is to identify the potential bottlenecks in China’s gas transmission network, as well as to examine how shale gas development affects particular supply locations and demand centers. We examine this through three scenarios for projected domestic shale gas supply by 2020, optimistic, medium, and conservative, taking as references the International Energy Agency’s (IEA) projections and China’s shale gas development plans. Separately, we project gas demand at the provincial level, since shale gas will have a more significant impact regionally than nationally. To quantitatively assess each shale gas development scenario, we formulated a gas pipeline optimization model. We used ArcGIS to generate the connectivity parameters and pipeline segment lengths. Other parameters were collected from provincial “twelfth five-year” plans and the “China Oil and Gas Pipeline Atlas”. The multi-objective optimization model, implemented in GAMS and MATLAB, aims to minimize the demand that cannot be met while simultaneously minimizing total gas supply and transmission costs. 
The results indicate that, even when the primary objective is to meet the projected gas demand rather than to minimize cost, there is a shortfall of 9% in meeting total demand under the medium scenario. Comparing the results between the optimistic and medium supply scenarios, almost half of the shale gas produced in Sichuan province and Chongqing cannot be transmitted out by pipeline. On the demand side, the gas demand gaps of Henan province and Shanghai could be filled by as much as 82% and 39%, respectively, with increased shale gas supply. To conclude, the pipeline network in China is currently not sufficient to meet the projected natural gas demand in 2020 under the medium and optimistic scenarios, indicating the need for substantial capacity expansion of some of the existing network and the importance of constructing new pipelines from particular supply sites to demand sites. If the pipeline constraint is overcome, the gas demand gaps of Beijing, Shanghai, Jiangsu, and Henan could potentially be filled, and China could thereby reduce its dependency on LNG imports by almost 25% under the optimistic scenario.
Keywords: energy policy, energy systematic analysis, scenario analysis, shale gas in China
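The core mechanism the abstract reports, pipeline capacity rather than supply being the binding constraint, can be illustrated with a toy greedy allocation. This is a made-up miniature with hypothetical figures, standing in for the paper's GAMS/MATLAB multi-objective optimization model:

```python
def unmet_demand(supplies, demands, edges):
    """Greedy sketch of capacitated gas allocation: push supply toward
    demand centers along pipeline edges (source, destination, capacity)
    and report the residual demand that cannot be met."""
    s = dict(supplies)   # remaining supply per source
    d = dict(demands)    # remaining demand per center
    for src, dst, cap in edges:
        flow = min(s[src], d[dst], cap)  # pipeline capacity may bind
        s[src] -= flow
        d[dst] -= flow
    return d  # residual (unmet) demand per center

# Hypothetical units (bcm/yr): one producing basin, two demand centers.
supplies = {"Sichuan": 30}
demands = {"Shanghai": 20, "Henan": 15}
edges = [("Sichuan", "Shanghai", 12), ("Sichuan", "Henan", 10)]
gap = unmet_demand(supplies, demands, edges)
```

Here the basin could cover both demands, yet the corridors' capacities leave both centers short, which is the qualitative pattern found for Sichuan and Chongqing gas.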
Procedia PDF Downloads 288
129 Creative Radio Advertising in Turkey
Authors: Mehmet Sinan Erguven
Abstract:
A number of authorities argue that radio is an outdated medium for advertising and does not have the same impact on consumers as it did in the past. This grim outlook on the future of radio has its basis in the audio-visual world that consumers now live in and in the popularity of Internet-based marketing tools among advertising professionals. Nonetheless, consumers still appear to overwhelmingly prefer radio as an entertainment tool. Today, in Canada, 90% of all adults (18+) tune into the radio on a weekly basis, listening for 17 hours. Teens are the most challenging audience for radio to capture, but still, almost 75% tune in weekly. One online radio station reaches more than 250 million registered listeners worldwide, and revenues from radio advertising in Australia are expected to grow at an annual rate of 3% for the foreseeable future. Radio is also starting to become popular again in Turkey, with a 5% increase in listening rates compared to 2014. A matter of constant concern in radio advertising is creativity. As radio generally serves as a background medium for listeners, the creativity of radio commercials is important for attracting the attention of listeners and directing their focus to the advertising message. This cannot be achieved simply by using audio tools like sound effects and jingles. This study aims to identify the creative elements (execution formats, appeals, and approaches) and creativity factors of radio commercials in Turkey. As part of the study, all of the award-winning radio commercials produced throughout the history of the Kristal Elma Advertising Festival were analyzed using the content analysis technique. Two judges (an advertising agency copywriter and an academic) coded the commercials. Reliability was measured as proportional agreement. 
The results showed that sound effects, jingles, testimonials, slices of life, and announcements were the most common execution formats in creative Turkish radio ads. Humor and excitement were the most commonly used creative appeals, while award-winning ads featured various approaches, such as surprise musical performances, audio wallpaper, product voice, and theater of the mind. Some ads, however, were found to contain no creativity factors. In order to be accepted as creative, an ad must have at least one divergence factor, such as originality, flexibility, an unusual or empathic perspective, or provocative questions. These findings, as well as others from the study, hold great value for the history of creative radio advertising in Turkey. Today, the nature of radio and its listeners is changing. As more and more people tune into online radio channels, brands will need to focus more on this relatively cheap advertising medium in the very near future. This development will require advertising agencies to focus their attention on creativity in order to produce radio commercials that differentiate their customers from competitors.
Keywords: advertising, creativity, radio, Turkey
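The inter-coder reliability measure the study reports, proportional agreement, is simply the share of items the two judges coded identically. A minimal sketch, with hypothetical execution-format codes rather than the study's actual data:

```python
def proportional_agreement(coder_a, coder_b):
    """Inter-coder reliability as simple proportional agreement:
    the fraction of items both coders assigned the same code."""
    assert len(coder_a) == len(coder_b)
    matches = sum(1 for x, y in zip(coder_a, coder_b) if x == y)
    return matches / len(coder_a)

# Hypothetical execution-format codes assigned by the two judges
judge_1 = ["jingle", "testimonial", "announcement", "slice_of_life", "jingle"]
judge_2 = ["jingle", "testimonial", "jingle", "slice_of_life", "jingle"]
reliability = proportional_agreement(judge_1, judge_2)  # 4 of 5 codes agree
```

Proportional agreement is easy to compute but does not correct for chance agreement, which is why measures such as Cohen's kappa are often reported alongside it.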
Procedia PDF Downloads 395
128 Effect of Antimony on Microorganisms in Aerobic and Anaerobic Environments
Authors: Barrera C. Monserrat, Sierra-Alvarez Reyes, Pat-Espadas Aurora, Moreno Andrade Ivan
Abstract:
Antimony is a toxic and carcinogenic metalloid considered a pollutant of priority interest by the United States Environmental Protection Agency. It is present in the environment in two oxidation states: antimonite (Sb(III)) and antimonate (Sb(V)). Sb(III) is toxic to several aquatic organisms, but the potential inhibitory effect of Sb species on microorganisms has not been extensively evaluated. The fate and possible toxic impact of antimony on aerobic and anaerobic wastewater treatment systems are unknown. For this reason, the objective of this study was to evaluate the microbial toxicity of Sb(V) and Sb(III) in aerobic and anaerobic environments. Sb(V) and Sb(III) were supplied as potassium hexahydroxoantimonate(V) and potassium antimony tartrate, respectively (Sigma-Aldrich). The toxic effect of both Sb species in anaerobic environments was evaluated on the methanogenic activity and on the inhibition of hydrogen production of microorganisms from a wastewater treatment bioreactor. For the methanogenic activity, batch experiments were carried out in 160 mL serological bottles; each bottle contained basal mineral medium (100 mL), inoculum (1.5 g VSS/L), acetate (2.56 g/L) as the substrate, and variable concentrations of Sb(V) or Sb(III). Duplicate bioassays were incubated at 30 ± 2°C on an orbital shaker (105 rpm) in the dark. Methane production was monitored by gas chromatography. The hydrogen production inhibition tests were carried out in glass bottles with a working volume of 0.36 L, with glucose (50 g/L) as the substrate, pretreated inoculum (5 g VSS/L), mineral medium, and varying concentrations of the two antimony species. The bottles were kept stirred at a temperature of 35°C in an AMPTS II device that recorded hydrogen production. The toxicity of Sb to aerobic microorganisms (from the activated sludge of a wastewater treatment plant) was tested with the standardized Microtox toxicity test and by respirometry. 
Results showed that Sb(III) is more toxic than Sb(V) to methanogenic microorganisms. Sb(V) caused a 50% decrease in methanogenic activity at 250 mg/L. In contrast, exposure to Sb(III) resulted in 50% inhibition at a concentration of only 11 mg/L, and almost complete inhibition (95%) at 25 mg/L. For hydrogen-producing microorganisms, Sb(III) and Sb(V) caused 50% inhibition of hydrogen production at 12.6 mg/L and 87.7 mg/L, respectively. The results for aerobic environments showed that 500 mg/L of Sb(V) inhibits neither the activity of Aliivibrio fischeri (Microtox) nor the specific oxygen uptake rate of activated sludge. Sb(III), by contrast, reduced the respiration of the microorganisms by 50% at concentrations below 40 mg/L. These results indicate that the toxicity of antimony depends on the speciation of this metalloid and that Sb(III) has a significantly higher inhibitory potential than Sb(V). It was also shown that anaerobic microorganisms can reduce Sb(V) to Sb(III). Acknowledgments: This work was funded in part by grants from the UA-CONACYT Binational Consortium for the Regional Scientific Development and Innovation (CAZMEX), the National Institutes of Health (NIH ES-04940), and PAPIIT-DGAPA-UNAM (IN105220).
Keywords: aerobic inhibition, antimony reduction, hydrogen inhibition, methanogenic toxicity
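The 50%-inhibition concentrations reported above are typically estimated by interpolating between the two tested doses that bracket 50% inhibition. A generic dose-response sketch follows, using log-linear interpolation and a hypothetical Sb(III) data set, not the authors' actual measurements or protocol:

```python
import math

def ic50(concentrations, inhibition):
    """Estimate the concentration causing 50% inhibition by
    log-linear interpolation between the two bracketing doses.
    Inputs must be sorted by increasing concentration."""
    points = list(zip(concentrations, inhibition))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 < 50 <= i2:  # 50% crossing lies between these doses
            f = (50 - i1) / (i2 - i1)
            log_c = math.log10(c1) + f * (math.log10(c2) - math.log10(c1))
            return 10 ** log_c
    raise ValueError("50% inhibition is not bracketed by the data")

# Hypothetical dose-response data (concentration in mg/L, % inhibition)
conc = [1, 5, 10, 25]
inhib = [5, 30, 48, 95]
ic50_est = ic50(conc, inhib)
```

More rigorous analyses fit a full sigmoidal (e.g. four-parameter logistic) model to the dose-response curve, but bracketing interpolation is a common first estimate.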
Procedia PDF Downloads 167
127 Spray Nebulisation Drying: Alternative Method to Produce Microparticulated Proteins
Authors: Josef Drahorad, Milos Beran, Ondrej Vltavsky, Marian Urban, Martin Fronek, Jiri Sova
Abstract:
Engineering efforts by researchers of the Food Research Institute Prague and the Czech Technical University in spray drying technologies led to the introduction of the demonstrator ATOMIZER and a new technology, Carbon Dioxide-Assisted Spray Nebulization Drying (CASND). The equipment combines spray drying technology, in which the liquid to be dried is atomized by a rotary atomizer, with the Carbon Dioxide Assisted Nebulization - Bubble Dryer (CAN-BD) process in an original way. A solution, emulsion, or suspension is saturated with carbon dioxide at pressures of up to 80 bar before the drying process. The atomization process takes place in two steps. In the first step, primary droplets are produced at the outlet of a rotary atomizer of special construction. In the second step, the primary droplets are divided into secondary droplets by the expansion of CO2 from inside the primary droplets. The secondary droplets, usually in the form of microbubbles, are rapidly dried by a warm air stream at temperatures of up to 60°C, and solid particles form in a drying chamber. Powder particles are separated from the drying air stream in a high-efficiency fine powder separator. The product is frequently in the form of submicron hollow spheres. The CASND technology has been used to produce microparticulated protein concentrates for human nutrition from alternative plant sources: hemp and canola seed filtration cakes. Alkali extraction was used to extract the proteins from the filtration cakes. The protein solutions after alkali extraction were dried with the demonstrator ATOMIZER. Aerosol particle size distribution and concentration in the drying chamber were determined by two different on-line aerosol spectrometers, an SMPS (Scanning Mobility Particle Sizer) and an APS (Aerodynamic Particle Sizer). The protein powders were in the form of hollow spheres with an average particle diameter of about 600 nm. The particles were characterized by SEM. 
The functional properties of the microparticulated protein concentrates were compared with those of the same protein concentrates dried by the conventional spray drying process. The microparticulated proteins proved to have improved foaming and emulsifying properties and water and oil absorption capacities, and they formed long-term stable water dispersions. This work was supported by research grant TH03010019 of the Technology Agency of the Czech Republic.
Keywords: carbon dioxide-assisted spray nebulization drying, canola seed, hemp seed, microparticulated proteins
Procedia PDF Downloads 168
126 Towards a Strategic Framework for State-Level Epistemological Functions
Authors: Mark Darius Juszczak
Abstract:
While epistemology, as a sub-field of philosophy, is generally concerned with theoretical questions about the nature of knowledge, the explosion in digital media technologies has resulted in an exponential increase in the storage and transmission of human information. That increase has produced a particular non-linear dynamic: digital epistemological functions are radically altering how and what we know. Neither the rate of that change nor its consequences have been well studied or taken into account in developing state-level strategies for epistemological functions. At the current time, US federal policy, like that of virtually all other countries, maintains, at the national level, clearly defined boundaries between various epistemological agencies, that is, agencies that, in one way or another, mediate the functional use of knowledge. These agencies take the form of patent and trademark offices, national library and archive systems, departments of education, agencies such as the FTC, university systems and regulations, military research systems such as DARPA, federal scientific research agencies, medical and pharmaceutical accreditation agencies, federal funding for scientific research, and the legislative committees and subcommittees that attempt to alter the laws governing epistemological functions. All of these agencies are constantly creating, analyzing, and regulating knowledge. Those processes are, at the most general level, epistemological functions: they act upon and define what knowledge is. At the same time, however, there are no high-level strategic epistemological directives or frameworks that define those functions. The only time in US history when a proxy state-level epistemological strategy existed was between 1961 and 1969, when the Kennedy Administration committed the United States to the Apollo program. 
While that program had a singular technical objective as its outcome, that objective was so technologically advanced for its day and so complex that it required a massive redirection of state-level epistemological functions; in essence, a broad and diverse set of state-level agencies suddenly found themselves working together towards a common epistemological goal. This paper does not call for a repeat of the Apollo program. Rather, its purpose is to investigate the minimum structural requirements for a national state-level epistemological strategy in the United States. In addition, this paper seeks to analyze how the epistemological work of the multitude of national agencies within the United States would be affected by such a high-level framework. This paper is an exploratory study of this type of framework. The primary hypothesis of the author is that such a framework is possible but would require extensive re-framing and reclassification of traditional epistemological functions at the respective agency level. In much the same way that, for example, the DHS (Department of Homeland Security) evolved to respond to a new type of security threat to the United States, it is theorized that a lack of coordination and alignment in epistemological functions will equally result in a strategic threat to the United States.
Keywords: strategic security, epistemological functions, epistemological agencies, Apollo program
Procedia PDF Downloads 77
125 Biodegradable Self-Supporting Nanofiber Membranes Prepared by Centrifugal Spinning
Authors: Milos Beran, Josef Drahorad, Ondrej Vltavsky, Martin Fronek, Jiri Sova
Abstract:
While most nanofibers are produced using electrospinning, this technique suffers from several drawbacks, such as the requirement for specialized equipment, high electrical potential, and electrically conductive targets. Consequently, recent years have seen the emergence of novel strategies for generating nanofibers at larger scale and higher throughput. Centrifugal spinning is a simple, cheap, and highly productive technology for nanofiber production. In principle, the drawing of a solution filament into nanofibers using centrifugal spinning is achieved through the controlled manipulation of the centrifugal force, viscoelasticity, and mass transfer characteristics of the spinning solutions. Engineering efforts by researchers of the Food Research Institute Prague and the Czech Technical University in the field of centrifugal nozzleless spinning led to the introduction of the pilot plant demonstrator NANOCENT. The main advantages of the demonstrator are lower investment cost (thanks to simpler construction compared to widely used electrospinning equipment), higher production speed, new application possibilities, and easy maintenance. Centrifugal nozzleless spinning is especially suitable for producing submicron fibers from polymeric solutions in highly volatile solvents, such as chloroform, DCM, THF, or acetone. To date, submicron fibers have been prepared from PS, PUR, and biodegradable polyesters, such as PHB, PLA, PCL, or PBS. The products are in the form of 3D structures or nanofiber membranes. Unique self-supporting nanofiber membranes were prepared from the biodegradable polyesters in different mixtures. The nanofiber membranes have been tested for different applications, and filtration efficiencies for water solutions and for aerosols in air were evaluated. 
Different active inserts were added to the solutions before the spinning process, such as inorganic nanoparticles, organic precursors of metal oxides, antimicrobial and wound healing compounds, or photocatalytic phthalocyanines. Sintering can subsequently be carried out to remove the polymeric material and convert the organic precursors to metal oxides, such as SiO2, or photocatalytic ZnO2 and TiO2, to obtain inorganic nanofibers. Electrospinning is a more suitable technology than centrifugal nozzleless spinning for producing filtration membranes because it forms more homogeneous nanofiber layers and fibers with smaller diameters. The self-supporting nanofiber membranes prepared from the biodegradable polyesters are especially suitable for medical applications, such as wound or burn healing dressings or tissue engineering scaffolds. This work was supported by research grant TH03020466 of the Technology Agency of the Czech Republic.
Keywords: polymeric nanofibers, self-supporting nanofiber membranes, biodegradable polyesters, active inserts
Procedia PDF Downloads 165
124 A Qualitative Study Identifying the Complexities of Early Childhood Professionals' Use and Production of Data
Authors: Sara Bonetti
Abstract:
The use of quantitative data to support policies and justify investments has become imperative in many fields, including education. However, the topic of data literacy has only marginally touched the early care and education (ECE) field. In California, within the ECE workforce, there is a group of professionals working in policy and advocacy who use quantitative data regularly and whose educational and professional experiences have been neglected by existing research. This study aimed to analyze these experiences in accessing, using, and producing quantitative data. The study utilized semi-structured interviews to capture differences in educational and professional backgrounds, policy contexts, and power relations. The participants were three key professionals from county-level organizations and one working at a state department, to allow for a broader systems-level perspective. The study followed Núñez’s multilevel model of intersectionality. The key to Núñez’s model is the intersection of multiple levels of analysis and influence, from the individual to the system level, and the identification of institutional power dynamics that perpetuate the marginalization of certain groups within society. In a similar manner, this study looked at the dynamic interaction of different influences at the individual, organizational, and system levels that might intersect and affect ECE professionals’ experiences with quantitative data. At the individual level, an important element identified was the participants’ educational background, as it was possible to observe a relationship between that background and their positionality, both with respect to working with data and with respect to their power within an organization and at the policy table. 
For example, those with a background in child development were aware of how their formal education had failed to train them in the skills necessary to work in policy and advocacy, and especially to work with quantitative data, compared to those with a background in administration and/or business. At the organizational level, the interviews showed a connection between the participants' position within the organization, their organization's position with respect to others, and their degree of access to quantitative data. This in turn affected their sense of empowerment and agency in dealing with data, such as shaping what data is collected and available. These differences were reflected in the interviewees' perceptions and expectations for the ECE workforce. For example, one of the interviewees pointed out that many ECE professionals happen to use data out of the necessity of the moment. This lack of intentionality is a cause of, and at the same time translates into, missed training opportunities. Another interviewee pointed out issues related to the professionalism of the ECE workforce by remarking on the inadequacy of ECE students' training in working with data. In conclusion, Núñez's model helped in understanding the different elements that affect ECE professionals' experiences with quantitative data. In particular, it was clear that these professionals are not being provided with the necessary support and that the field is not being intentional in creating data literacy skills for them, despite what is asked of them and their work.
Keywords: data literacy, early childhood professionals, intersectionality, quantitative data
Procedia PDF Downloads 253
123 Challenges in Environmental Governance: A Case Study of Risk Perceptions of Environmental Agencies Involved in Flood Management in the Hawkesbury-Nepean Region, Australia
Authors: S. Masud, J. Merson, D. F. Robinson
Abstract:
The management of environmental resources requires the engagement of a range of stakeholders, including public and private agencies and different community groups, to implement sustainable conservation practices. A challenge that is often ignored is the analysis of the agencies involved and their power relations. One of the barriers identified is the difference in risk perceptions among the agencies involved, which leads to disjointed efforts in assessing and managing risks. Wood et al. (2012) explain that it is important to have an integrated approach to risk management in which decision makers address stakeholder perspectives; this is critical for an effective risk management policy. This abstract is part of PhD research that looks into barriers to flood management under a changing climate and intends to identify bottlenecks that create maladaptation. Experiences are drawn from international practice in the UK and examined in the Australian context by exploring flood governance in a highly flood-prone region, the Hawkesbury-Nepean catchment, as a case study. This research explores several aspects of governance and management: (i) the complexities created by the way different agencies are involved in assessing flood risks; (ii) different perceptions of an acceptable flood risk level; (iii) perceptions of community engagement in defining the acceptable flood risk level; (iv) views on a holistic flood risk management approach; and (v) the challenges of a centralised information system. The study concludes that the complexity of managing a large catchment is exacerbated by differences in the way professionals perceive the problem.
This has led to: (a) different standards for acceptable risks; (b) inconsistent attempts to set up a regional-scale flood management plan beyond jurisdictional boundaries; (c) the absence of a regional-scale agency with a licence to share and update information; and (d) a lack of forums for dialogue with insurance companies to ensure an integrated approach to flood management. The research takes the Hawkesbury-Nepean catchment as a case example and draws on literary evidence from around the world. In addition, conclusions were extrapolated from eighteen semi-structured interviews with agencies involved in flood risk management in the Hawkesbury-Nepean catchment of NSW, Australia. The outcome of this research is a better understanding of the complexity of assessing risks against a rapidly changing climate, contributing towards the development of effective risk communication strategies and thus enabling better management of floods and an increased level of support from insurance companies, real-estate agencies, state and regional risk managers, and the affected communities.
Keywords: adaptive governance, flood management, flood risk communication, stakeholder risk perceptions
Procedia PDF Downloads 286
122 Precancerous Lesions Related to Human Papillomavirus: Importance of Cervicography as a Complementary Diagnostic Method
Authors: Denise De Fátima Fernandes Barbosa, Tyane Mayara Ferreira Oliveira, Diego Jorge Maia Lima, Paula Renata Amorim Lessa, Ana Karina Bezerra Pinheiro, Cintia Gondim Pereira Calou, Glauberto Da Silva Quirino, Hellen Lívia Oliveira Catunda, Tatiana Gomes Guedes, Nicolau Da Costa
Abstract:
The aim of this study is to evaluate the use of Digital Cervicography (DC) in the diagnosis of precancerous lesions related to Human Papillomavirus (HPV). This is a cross-sectional, evaluative study with a quantitative approach, held in a health unit linked to the Pro Dean of Extension of the Federal University of Ceará from July to August 2015, with a sample of 33 women. Data were collected through interviews using a structured instrument. The technique used for DC was standardized by Franco (2005). Polymerase Chain Reaction (PCR) was performed to identify high-risk HPV genotypes. The DC images were evaluated and classified by three judges. The results of DC and PCR were classified as positive, negative, or inconclusive. The data from the collection instruments were compiled and analyzed in the Statistical Package for the Social Sciences (SPSS) with descriptive statistics and cross-tabulations. Sociodemographic, sexual, and reproductive variables were analyzed through absolute frequencies (N) and their respective percentages (%). The kappa coefficient (κ) was applied to determine agreement between the judges' DC reports and the PCR results, and also among the judges regarding the DC results. Pearson's chi-square test was used to analyze the sociodemographic, sexual, and reproductive variables against the PCR reports, with p<0.05 considered statistically significant. Ethical aspects of research involving human beings were respected, according to Resolution 466/2012. Regarding the sociodemographic profile, the most prevalent age groups, in equal proportion (24.2% each), were 21-30 and 41-50 years old. Brown skin color predominated (84.8%), and 96.9% had completed, or were attending, primary or secondary school. 51.5% were married, 72.7% Catholic, 54.5% employed, and 48.5% had an income between one and two minimum wages.
As for the sexual and reproductive characteristics, heterosexual women prevailed (93.9%), and most did not use condoms during sexual intercourse (72.7%). 51.5% had a previous history of a Sexually Transmitted Infection (STI), with HPV the most prevalent STI (76.5%). 57.6% did not use contraception, 78.8% had undergone the uterine cervical cancer prevention examination (PCCU) within an interval of one year or less, 72.7% had no cases of cervical cancer in the family, 63.6% were multiparous, and 97% were not vaccinated against HPV. DC showed a good level of agreement between raters (κ=0.542) and had a specificity of 77.8% and a sensitivity of 25% when its results were compared with PCR. Only the variable race showed a statistically significant association with PCR (p=0.042). DC had 100% acceptance among the women in the sample, suggesting further trials of this method to establish it as a viable technique. The DC positivity criteria were developed by nurses, and these professionals also perform the PCCU in Brazil, which means that DC can be an important complementary diagnostic method for enhancing the quality of these professionals' examinations.
Keywords: gynecological examination, human papillomavirus, nursing, papillomavirus infections, uterine neoplasms
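The agreement and accuracy statistics this abstract reports (kappa, sensitivity, specificity) follow standard definitions. A minimal sketch, using hypothetical rating data rather than the study's:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

def sensitivity_specificity(test, reference):
    """Sensitivity and specificity of a binary test against a reference standard
    (here, DC results against PCR)."""
    tp = sum(t and r for t, r in zip(test, reference))
    tn = sum((not t) and (not r) for t, r in zip(test, reference))
    fp = sum(t and (not r) for t, r in zip(test, reference))
    fn = sum((not t) and r for t, r in zip(test, reference))
    return tp / (tp + fn), tn / (tn + fp)
```

With the study's full rating tables, the same two functions would reproduce the reported κ=0.542, specificity of 77.8%, and sensitivity of 25%.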
Procedia PDF Downloads 300
121 Whistleblowing: A Contemporary Topic Concerning Businesses
Authors: Andreas Kapardis, Maria Krambia-Kapardis, Sofia Michaelides-Mateou
Abstract:
Corruption and economic crime are serious problems affecting the sustainability of businesses in the 21st century. Nowadays, many corruption or fraud cases come to light thanks to whistleblowers. This article first discusses the concept of whistleblowing as well as some relevant legislation enacted around the world. Secondly, it discusses the findings of a survey of whistleblowers or could-have-been whistleblowers. Finally, suggestions for the development of a comprehensive whistleblowing framework are considered. Whistleblowing can be described as expressing a concern about wrongdoing within an organization, such as a corporation, an association, an institution, or a union. Such a concern must be in the public interest and in good faith, and should relate to matters, or their cover-up, that could potentially result in a miscarriage of justice, a crime or criminal offence, or threats to health and safety. Whistleblowing has proven to be an effective anti-corruption mechanism and a powerful tool that helps deter fraud, violations, and malpractice within organizations, corporations, and the public sector. Research in the field of whistleblowing has concentrated on the reasons for whistleblowing and financial bounties; the effectiveness of whistleblowing; whistleblowing as prosocial behavior from a psychological perspective, and its consequences; and whistleblowing as a tool for protecting shareholders, saving lives, and saving billions of dollars of public funds. However, no previous study of whistleblowing has surveyed whistleblowers or intended whistleblowers themselves. The study reported in the current paper analyses the responses of 74 whistleblowers or intended whistleblowers: the reasons behind their decision to blow the whistle, or not to proceed to blow the whistle, and any regrets they may have had. In addition, a profile of the whistleblower is developed concerning age, gender, marital and family status, and position in an organization.
Lessons learned from the intended whistleblowers, and their responses to the question of whether they would be willing to blow the whistle again, show that enacting legislation to protect the whistleblower is not enough. Similarly, rewarding the whistleblower does not appear to provide an incentive, since the majority noted that "work ethics is more important than financial rewards". We recommend the development of a comprehensive and holistic framework for the protection of the whistleblower, ensuring that remedial actions are taken immediately once a whistleblower comes forward. The suggested framework comprises: (a) hard legislation ensuring that whistleblowers follow certain principles when blowing the whistle and, in return, are protected for a period of five years from being fired, dismissed, bullied, or harassed; (b) soft legislation establishing an agency, firstly, to ensure that psychological and legal advice is provided to whistleblowers and, secondly, that any required remedial action is taken immediately to avert the undesirable events reported by a whistleblower; and, finally, (c) mechanisms to ensure the coordination of the actions taken.
Keywords: whistleblowing, business ethics, legislation, business
Procedia PDF Downloads 309
120 Targeting Violent Extremist Narratives: Applying Network Targeting Techniques to the Communication Functions of Terrorist Groups
Authors: John Hardy
Abstract:
Over the last decade, the increasing utility of extremist narratives to the operational effectiveness of terrorist organizations has been evidenced by the proliferation of inspired or affiliated attacks across the world. Famous examples such as regional al-Qaeda affiliates and the self-styled “Islamic State” demonstrate the effectiveness of leveraging communication technologies to disseminate propaganda, recruit members, and orchestrate attacks. Terrorist organizations with the capacity to harness the communicative power offered by digital communication technologies and effective political narratives have held an advantage over their targets in recent years. Terrorists have leveraged the perceived legitimacy of grass-roots actors to appeal to a global audience of potential supporters and enemies alike, and have wielded a proficiency in profile-raising which remains unmatched by counter terrorism narratives around the world. In contrast, many attempts at propagating official counter-narratives have been received by target audiences as illegitimate, top-down and impersonally bureaucratic. However, the benefits provided by widespread communication and extremist narratives have come at an operational cost. Terrorist organizations now face a significant challenge in protecting their access to communications technologies and authority over the content they create and endorse. The dissemination of effective narratives has emerged as a core function of terrorist organizations with international reach via inspired or affiliated attacks. As such, it has become a critical function which can be targeted by intelligence and security forces. This study applies network targeting principles which have been used by coalition forces against a range of non-state actors in the Middle East and South Asia to the communicative function of terrorist organizations. 
This illustrates both a conceptual link between functional targeting and operational disruption in the abstract, and a tangible impact on the operational effectiveness of terrorists through the degradation of communicative ability and legitimacy. Two case studies highlight the utility of applying functional targeting against terrorist organizations. The first case is the targeted killing of Anwar al-Awlaki, an al-Qaeda propagandist who crafted a permissive narrative and effective propaganda videos to attract recruits who committed inspired terrorist attacks in the US and overseas. The second is a series of operations against Islamic State propagandists in Syria, which resulted in the capture or death of a cadre of high-profile members, including Junaid Hussain, Abu Mohammad al-Adnani, Neil Prakash, and Rachid Kassim. This group of Islamic State propagandists was linked to a significant rise in affiliated and enabled terrorist attacks and was subsequently targeted by law enforcement and military agencies. In both cases, the disruption of communication between the terrorist organization and recruits degraded both communicative and operational functions. The effect of functional targeting on member recruitment and operational tempo suggests that narratives are a critical function which can be leveraged against terrorist organizations. Further application of network targeting methods to terrorist narratives may enhance the efficacy of a range of counter-terrorism techniques employed by security and intelligence agencies.
Keywords: countering violent extremism, counter terrorism, intelligence, terrorism, violent extremism
Procedia PDF Downloads 291
119 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence
Authors: Gus Calderon, Richard McCreight, Tammy Schwartz
Abstract:
Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. 
To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners' understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch's defensible space maps were combined with Black Swan's patented approach, which uses 39 other risk characteristics, into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk
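The spectral vegetation index maps described above are typically derived from the near-infrared and red bands of the multispectral imagery. FireWatch's classification algorithms are proprietary, so the sketch below shows only a generic normalized difference vegetation index (NDVI) computation with an assumed vegetation threshold of 0.3, not the company's actual method:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index per pixel: (NIR - Red) / (NIR + Red).

    nir, red: arrays of band reflectances; eps avoids division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def vegetation_mask(nir, red, threshold=0.3):
    """Boolean mask of pixels whose NDVI exceeds the (assumed) vegetation threshold."""
    return ndvi(nir, red) > threshold
```

In a workflow like the one described, such a mask would then be intersected with buffers around building footprints to flag vegetative fuels inside the defensible space zone of each parcel.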
Procedia PDF Downloads 108
118 An Integrated Framework for Wind-Wave Study in Lakes
Authors: Moien Mojabi, Aurelien Hospital, Daniel Potts, Chris Young, Albert Leung
Abstract:
Wave analysis is an integral part of the hydrotechnical assessment carried out during the permitting and design phases for coastal structures, such as marinas. This analysis aims at quantifying: i) the suitability of the coastal structure design against the Small Craft Harbour wave tranquility safety criterion; ii) the potential environmental impacts of the structure (e.g., effects on waves, flow, and sediment transport); iii) mooring and dock design; and iv) requirements set by regulatory agencies (e.g., a WSA Section 11 application). While a complex three-dimensional hydrodynamic modelling approach can be applied to large-scale projects, the need was identified for an efficient and reliable wave analysis method suitable for smaller-scale marina projects. As a result, Tetra Tech has developed and applied an integrated analysis framework (hereafter the TT approach), which takes advantage of state-of-the-art numerical models while preserving a level of simplicity that fits smaller-scale projects. The present paper aims to describe the TT approach and highlight the key advantages of using this integrated framework in lake marina projects. The core of this methodology integrates wind, water level, bathymetry, and structure geometry data. To respond to the needs of specific projects, several add-on modules have been added to this core. The main advantages of this method over simplified analytical approaches are: i) accounting for the proper physics of the lake by modelling the entire lake (capturing the real lake geometry) instead of using a simplified fetch approach; ii) providing a more realistic representation of the waves by modelling random waves instead of monochromatic waves; iii) modelling wave-structure interaction (e.g., wave transmission/reflection for floating structures and piles, amongst others); iv) accounting for wave interaction with the lakebed (e.g.
bottom friction, refraction, and breaking); v) providing the inputs for flow and sediment transport assessment at the project site; vi) taking into consideration historical and geographical variations of the wind field; and vii) independence from the scale of the reservoir under study. Overall, in comparison with simplified analytical approaches, this integrated framework provides a more realistic and reliable estimation of wave parameters (and their spatial distribution) in lake marinas, leading to a realistic hydrotechnical assessment accessible to any project size, from the development of a new marina to marina expansion and pile replacement. Tetra Tech has successfully utilized this approach for many years in the Okanagan area.
Keywords: wave modelling, wind-wave, extreme value analysis, marina
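For contrast, the simplified fetch approach that the integrated framework improves upon can be sketched with the widely used deep-water JONSWAP fetch-growth relations. The coefficients below come from those published relations, not from the TT approach, and the sketch deliberately ignores everything the full model captures (real lake geometry, random seas, refraction, wave-structure interaction):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def fetch_limited_waves(wind_speed, fetch):
    """Deep-water fetch-limited wave estimate from the JONSWAP growth relations.

    wind_speed: 10-m wind speed U10 (m/s); fetch: straight-line fetch (m).
    Returns (significant wave height Hs in m, peak period Tp in s)."""
    x_star = G * fetch / wind_speed**2                    # dimensionless fetch
    hs = 1.6e-3 * math.sqrt(x_star) * wind_speed**2 / G   # g*Hs/U^2 = 1.6e-3 * x*^(1/2)
    tp = 0.286 * x_star**(1 / 3) * wind_speed / G         # g*Tp/U  = 0.286  * x*^(1/3)
    return hs, tp
```

For a 10 m/s wind over a 5 km fetch, this gives roughly Hs of 0.36 m and Tp of 2.3 s; because a single straight-line fetch cannot represent an irregular lake shoreline, estimates like these motivate the whole-lake modelling described above.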
Procedia PDF Downloads 84
117 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients
Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho
Abstract:
Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold-standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for subsequent analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation have been extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI of MS patients. We used MRI scans, each with 30 slices, from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and quantification of the brain volume. The first image processing step was to perform brain extraction by skull stripping from the original image. In the skull stripper for MRI images of the brain, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (the edge of the brain-skull border with dedicated expansion, curvature, and advection terms).
In the second step, the brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for the brain extraction process and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper
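The voxel-counting step in this second stage is a direct computation once the segmentation mask and the scan's voxel spacing are known. A minimal sketch (the voxel dimensions below are illustrative, not those of the study's scans):

```python
import numpy as np

def brain_volume_cc(mask, voxel_dims_mm):
    """Brain volume in cubic centimeters from a binary segmentation mask.

    mask: 3D array (slices x rows x cols), nonzero inside the brain.
    voxel_dims_mm: (dz, dy, dx) voxel spacing in millimetres."""
    voxel_mm3 = float(np.prod(voxel_dims_mm))   # volume of one voxel in mm^3
    n_voxels = int(np.count_nonzero(mask))      # voxels inside the brain mask
    return n_voxels * voxel_mm3 / 1000.0        # 1 cc = 1000 mm^3
```

Applied to each patient's skull-stripped mask, averaging the five results would yield the kind of mean volume reported above (1469.5 cc).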
Procedia PDF Downloads 146
116 Prospects of Agroforestry Products in the Emergency Situation: A Case Study of Earthquake of 2015 in Central Nepal
Authors: Raju Chhetri
Abstract:
Agroforestry is one of the main sources of livelihood among the people of Nepal. In particular, it is the sole mode of livelihood among the Chepangs. The massive earthquake (Mw 7.3) that hit the country on 25 April 2015 and many of its aftershocks had devastating effects. As a result, not only did big structures collapse; the earthquake also incurred great losses on fabrication and collection centers, schools, markets, and other necessary service centers. Although there were a large number of aftershocks after the main earthquake, the most devastating took place on 12 May 2015, measuring 6.3 on the Richter scale. Consequently, it destroyed more houses and brought further calamity and disruption to people's lives and public life. This study was mainly carried out to assess the food security and market situation for agroforestry products in the Chepang community in Raksirang VDC (one of the severely affected VDCs of Makwanpur district) after the earthquake. A total of 40 households (12 percent) were randomly selected as a sample in ward number 7 only. Questionnaires and focus groups were used to gather primary data. Additionally, two Focus Group Discussions (FGDs) were convened in the study area to obtain descriptive information for this study. An estimated 370 hectares of land under agroforestry plantation were ruptured by the earthquake. This caused severe damage to households and a serious loss of food stock, up to 60-80 percent (maize, millet, and rice). In place of regular cereal intake, banana (Musa paradisiaca) consumption rose markedly during the emergency period. The market price of rice rose from 37 to 44 NRS/kg, an increase of 18.9 percent. Some difference in the income range before and after the earthquake was observed. Before the earthquake, sales of both agroforestry and livestock products were ongoing, but after the earthquake, agroforestry product sales became the only means of livelihood among the Chepangs.
Agroforestry production of banana (Musa paradisiaca), citrus (Citrus limon), pineapple (Ananas comosus), and broom grass (Thysanolaena maxima) declined by nearly 50-60 percent, leaving only the cash income from the residual produce. Heavy demand for the agroforestry products mentioned above kept farm gate prices high (50-100 percent above normal), helping the surveyed community to continue earning a livelihood from their sale. Of the sampled households, 30 (75 percent) had migrated to safer locations due to land rupture, ongoing aftershocks, and landslides. The overall food security situation in this community is acute and will remain challenging in the days to come. Both immediate and long-term responses from relief agencies, concerning food, shelter, and the safe stocking of agroforestry products, are required to secure livelihoods in the Chepang community.
Keywords: earthquake, rupture, agroforestry, livelihood, indigenous, food security
Procedia PDF Downloads 322
115 From Clients to Colleagues: Supporting the Professional Development of Survivor Social Work Students
Authors: Stephanie Jo Marchese
Abstract:
This oral presentation is a reflective piece regarding current social work teaching methods that value and devalue the lived experiences of survivor students. This presentation grounds the term ‘survivor’ in feminist frameworks. A survivor-defined approach to feminist advocacy assumes an individual’s agency, considers each case and needs independent of generalizations, and provides resources and support to empower victims. Feminist ideologies are ripe arenas to update and influence the rapport-building schools of social work have with these students. Survivor-based frameworks are rooted in nuanced understandings of intersectional realities, staunchly combat both conscious and unconscious deficit lenses wielded against victims, elevate lived experiences to the realm of experiential expertise, and offer alternatives to traditional power structures and knowledge exchanges. Actively importing a survivor framework into the methodology of social work teaching breaks open barriers many survivor students have faced in institutional settings, this author included. The profession of social work is at an important crux of change, both in the United States and globally. The United States is currently undergoing a radical change in its citizenry and outlier communities have taken to the streets again in opposition to their othered-ness. New waves of students are entering this field, emboldened by their survival of personal and systemic oppressions- heavily influenced by third-wave feminism, critical race theory, queer theory, among other post-structuralist ideologies. Traditional models of sociological and psychological studies are actively being challenged. The profession of social work was not founded on the diagnosis of disorders but rather a grassroots-level activism that heralded and demanded resources for oppressed communities. 
Institutional and classroom acceptance and celebration of survivor narratives can catapult the resurgence of these values needed in the profession's service-delivery models and put social workers back in the driver's seat of social change (a combined advocacy and policy perspective), moving away from outsider-based intervention models. Survivor students should be viewed as agents of change, not solely former victims and clients. The ideas of this presentation proposal are supported through various qualitative interviews, as well as reviews of 'best practices' in the field of education that incorporate feminist methods of inclusion and empowerment. Curriculum and policy recommendations are also offered.
Keywords: deficit lens bias, empowerment theory, feminist praxis, inclusive teaching models, strengths-based approaches, social work teaching methods
Procedia PDF Downloads 289
114 Stochastic Nuisance Flood Risk for Coastal Areas
Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong
Abstract:
The U.S. Federal Emergency Management Agency (FEMA) developed flood maps based on experts' experience and estimates of the probability of flooding. Current flood-risk models evaluate flood risk with regional and subjective measures, without accounting for torrential rain and nuisance flooding at the neighborhood level. Nuisance flooding occurs in small areas of the community, where a few streets or blocks are routinely impacted. This type of flooding event occurs when a torrential rainstorm, combined with high tide and sea level rise, temporarily pushes water levels above a given threshold. In South Florida, this threshold is 1.7 ft above Mean Higher High Water (MHHW). The National Weather Service defines torrential rain as rain falling at a rate greater than 0.3 inches per hour, or three inches in a single day. Data from the Florida Climate Center for 1970 to 2020 show 371 events with more than three inches of rain in a day across 612 months. The purpose of this research is to develop a data-driven method to determine comprehensive analytical damage-avoidance criteria that account for nuisance flood events at the single-family home level. The method developed uses the Failure Mode and Effects Analysis (FMEA) method of the American Society for Quality (ASQ) to estimate the Damage Avoidance (DA) preparation for a 1-day, 100-year storm. The Consequence of Nuisance Flooding (CoNF) is estimated from community mitigation efforts to prevent nuisance flooding damage. The Probability of Nuisance Flooding (PoNF) is derived from the frequency and duration of torrential rainfall causing delays, disruptions to daily transportation, human illness, and property damage. Urbanization and population changes are related to the U.S. Census Bureau's annual population estimates.
Data collected by the United States Department of Agriculture (USDA) Natural Resources Conservation Service’s National Resources Inventory (NRI), and locally by the South Florida Water Management District (SFWMD), track development and land use/land cover changes over time. The intent is to include temporal trends in population density growth and their impact on land development. Results from this investigation provide the risk of nuisance flooding as a function of CoNF and PoNF for coastal areas of South Florida. The data-based criterion raises local municipalities' awareness of their flood-risk assessment and gives insight into flood management actions and watershed development. Keywords: flood risk, nuisance flooding, urban flooding, FMEA
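The risk formulation in this abstract, nuisance-flood risk as a function of CoNF and PoNF, can be illustrated with a minimal FMEA-style calculation. The 1-10 rating scales and the multiplicative combination below are assumptions borrowed from classic FMEA risk-priority-number practice, not the paper's exact definitions.

```python
# Hypothetical FMEA-style combination of Consequence (CoNF) and
# Probability (PoNF) ratings into a single risk number. The 1-10
# scales are an assumption in the spirit of classic FMEA RPNs.

def nuisance_flood_risk(conf: int, ponf: int) -> int:
    """Multiply CoNF and PoNF ratings (each on a 1-10 scale)."""
    for rating in (conf, ponf):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings are expected on a 1-10 scale")
    return conf * ponf

# A neighborhood with strong mitigation (low CoNF) but frequent
# tidal/torrential-rain flooding (high PoNF):
print(nuisance_flood_risk(3, 8))  # -> 24
```

Under this sketch, the highest-risk neighborhoods are those where both community consequences and flooding frequency rate high, which is the prioritization FMEA is designed to surface.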
Procedia PDF Downloads 99
113 Characterization of Thin Woven Composites Used in Printed Circuit Boards by Combining Numerical and Experimental Approaches
Authors: Gautier Girard, Marion Martiny, Sebastien Mercier, Mohamad Jrad, Mohamed-Slim Bahi, Laurent Bodin, Francois Lechleiter, David Nevo, Sophie Dareys
Abstract:
Reliability of electronic devices has always been of the highest interest for Aero-MIL and space applications. In any electronic device, the Printed Circuit Board (PCB), providing interconnection between components, is key to reliability. During the last decades, PCB technologies evolved to sustain and/or fulfill increased original equipment manufacturer (OEM) requirements and specifications: higher densities and better performance, faster time to market and longer lifetime, newer materials and mixed buildups. From the very beginning of the PCB industry until recently, qualification, experiments, and trial and error were the most popular methods to assess system (PCB) reliability. Nowadays, OEMs, PCB manufacturers, and scientists are working together in a close relationship in order to develop predictive models for PCB reliability and lifetime. To achieve that goal, it is fundamental to characterize the base materials precisely (laminates, electrolytic copper, …) in order to understand failure mechanisms and simulate PCB aging under environmental constraints, for example by means of the finite element method. The laminates are woven composites and thus have an orthotropic behaviour. The in-plane properties can be measured by combining classical uniaxial testing and digital image correlation. Nevertheless, the out-of-plane properties cannot be evaluated directly because of the thickness of the laminate (a few hundred microns). It should be noted that knowledge of the out-of-plane properties is fundamental to investigating the lifetime of high-density printed circuit boards. A homogenization method combining analytical and numerical approaches has been developed in order to obtain the complete elastic orthotropic behaviour of a woven composite from its precise 3D internal structure and its experimentally measured in-plane elastic properties. Since the mechanical properties of the resin surrounding the fibres are unknown, an inverse method is proposed to estimate them. 
The methodology has been applied to one laminate used in hyperfrequency spatial applications in order to obtain its elastic orthotropic behaviour at different temperatures in the range [-55°C; +125°C]. Next, numerical simulations of a plated through hole in a double-sided PCB are performed. Results show the major influence of the out-of-plane properties, and of their temperature dependency, on the lifetime of a printed circuit board. Acknowledgements: The support of the French ANR agency through the Labcom program ANR-14-LAB7-0003-01, and the support of CNES, Thales Alenia Space, and Cimulec, is acknowledged. Keywords: homogenization, orthotropic behaviour, printed circuit board, woven composites
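The inverse idea described above, adjusting an unknown resin modulus until a homogenization model reproduces the measured in-plane laminate stiffness, can be sketched in a few lines. The paper uses a full 3D woven-geometry homogenization; the simple Voigt rule of mixtures, the fibre properties, and the measured value below are all stand-in assumptions for illustration only.

```python
# Hedged sketch: estimate an unknown resin modulus so that a
# (deliberately simplified) rule-of-mixtures homogenization reproduces
# a measured in-plane laminate modulus. All numbers are hypothetical.

def in_plane_modulus(e_resin, e_fibre=70e9, v_fibre=0.45):
    """Voigt (parallel) estimate of the in-plane modulus [Pa]."""
    return v_fibre * e_fibre + (1 - v_fibre) * e_resin

def estimate_resin_modulus(e_measured, lo=1e8, hi=2e10, tol=1e3):
    """Bisection on the monotone forward model."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if in_plane_modulus(mid) < e_measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

e_meas = 35e9  # hypothetical measured in-plane modulus, 35 GPa
e_resin = estimate_resin_modulus(e_meas)
```

Because the forward model increases monotonically with the resin modulus, bisection always converges; the actual inverse problem in the paper replaces the one-line forward model with a finite element homogenization of the woven unit cell.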
Procedia PDF Downloads 204
112 A Brazilian Study Applied to the Regulatory Environmental Issues of Nanomaterials
Authors: Luciana S. Almeida
Abstract:
Nanotechnology has revolutionized the world of science and technology, bringing great expectations due to its potential for application in the most varied industrial sectors. However, the same characteristics that make nanoparticles interesting from the point of view of technological application may be undesirable when they are released into the environment. The small size of nanoparticles facilitates their diffusion and transport in the atmosphere, water, and soil, and facilitates their entry into and accumulation in living cells. The main objective of this study is to evaluate the environmental regulatory process of nanomaterials in the Brazilian scenario. Three specific objectives were outlined. The first is to carry out a global scientometric study, on a research platform, with the purpose of identifying the main lines of study of nanomaterials in the environmental area. The second is to verify, by means of a bibliographic review, how environmental agencies in other countries have been working on this issue. The third is to carry out an assessment of the Brazilian Nanotechnology Draft Law 6741/2013 with the state environmental agencies, with the aim of identifying the agencies' knowledge of the subject and the resources available in the country for the implementation of the Policy. A questionnaire will be used as the evaluation tool to identify the operational elements and build indicators, administered through the Environment of Evaluation Application, a computational application developed for building questionnaires. At the end, the need to propose changes to the Draft Law of the National Nanotechnology Policy will be assessed. Initial studies, relating to the first specific objective, have already identified that Brazil stands out in the production of scientific publications in the area of nanotechnology, although only a minority are studies focused on environmental impacts. 
Regarding the general panorama of other countries, some findings have also emerged. The United States has included the nanoforms of substances in an existing EPA (Environmental Protection Agency) program, the TSCA (Toxic Substances Control Act). The European Union issued a draft document amending Regulation 1907/2006 of the European Parliament and Council to cover the nanoforms of substances. Both programs are based on the study and identification of environmental risks associated with nanomaterials, taking the product life cycle into consideration. In relation to Brazil, regarding the third specific objective, it is notable that the country does not have any regulations applicable to nanostructures, although a Draft Law is in progress. In this document, it is possible to identify some requirements related to the environment, such as environmental inspection and licensing; industrial waste management; notification of accidents; and application of sanctions. However, it is not known whether these requirements are sufficient for the prevention of environmental impacts, nor whether national environmental agencies will know how to apply them correctly. This study is intended to serve as a basis for future actions regarding environmental management applied to the use of nanotechnology in Brazil. Keywords: environment, management, nanotechnology, politics
Procedia PDF Downloads 122
111 Processes Controlling Release of Phosphorus (P) from Catchment Soils and the Relationship between Total Phosphorus (TP) and Humic Substances (HS) in Scottish Loch Waters
Authors: Xiaoyun Hui, Fiona Gentle, Clemens Engelke, Margaret C. Graham
Abstract:
Although past work has shown that phosphorus (P), an important nutrient, may form complexes with aqueous humic substances (HS), the principal component of natural organic matter, the nature of such interactions is poorly understood. Humic complexation may not only enhance P concentrations but may also change P bioavailability within such waters and, in addition, influence P transport within catchment settings. This project is examining the relationships and associations of P, HS, and iron (Fe) in Loch Meadie, Sutherland, North Scotland, a mesohumic freshwater loch which has been assessed as being at reference condition with respect to P. The aim is to identify characteristic spectroscopic parameters which can enhance the performance of the model currently used to predict reference condition TP levels for highly-coloured Scottish lochs under the Water Framework Directive. In addition to Loch Meadie, samples from other reference condition lochs in north Scotland and Shetland were analysed. Including different types of reference condition lochs (clear water, mesohumic, and polyhumic) allowed the relationship between total phosphorus (TP) and HS to be more fully explored. The pH, [TP], [Fe], UV/Vis absorbance spectra, [TOC], and [DOC] for loch water samples were obtained using accredited methods. Loch waters were neutral to slightly acidic/alkaline (pH 6-8). [TP] in loch waters was below 50 µg L-1, and in Loch Meadie waters was typically <10 µg L-1. [Fe] in loch waters was mainly <0.6 mg L-1, but for some loch water samples [Fe] was in the range 1.0-1.8 mg L-1, and there was a positive correlation with [TOC] (R2 = 0.61). Lochs were classified as clear water, mesohumic, or polyhumic based on water colour. The ranges of colour values of sampled lochs in each category were 0.2–0.3, 0.2–0.5, and 0.5–0.8 a.u. (10 mm pathlength), respectively. There was also a strong positive correlation between [DOC] and water colour (R2 = 0.84). 
The UV/Vis spectra (200-700 nm) of the water samples were featureless, with only a slight “shoulder” observed in the 270–290 nm region. Ultrafiltration was then used to separate colloidal and truly dissolved components from the loch waters and, since it contained the majority of aqueous P and Fe, the colloidal component was fractionated by gel filtration chromatography. The fractionation of the colloids revealed two brown-coloured bands with distinctive UV/Vis spectral features. The first eluting band contained larger and more aromatic HS molecules than the second band, and both P and Fe were primarily associated with the larger, more aromatic HS. This result demonstrated that P was able to form complexes with Fe-rich components of HS, and thus provided a scientific basis for the significant correlation between [Fe] and [TP] observed in previous monitoring data of reference condition lochs from the Scottish Environment Protection Agency (SEPA). The distinctive features of the HS will be used as the basis for an improved spectroscopic tool. Keywords: total phosphorus, humic substances, Scottish loch water, WFD model
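The strength of the [DOC]–colour relationship reported in this abstract (R2 = 0.84) is a coefficient of determination from a linear fit; a minimal version of that calculation is sketched below. The data points are invented for demonstration, and only the method (Pearson's r squared for a linear relationship) mirrors the analysis.

```python
# Illustrative computation of the coefficient of determination between
# [DOC] and water colour. The five data points are hypothetical.

def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)  # square of Pearson's r

doc = [2.1, 4.5, 6.0, 9.8, 14.2]         # hypothetical [DOC], mg L-1
colour = [0.21, 0.28, 0.35, 0.52, 0.74]  # hypothetical colour, a.u.
print(round(r_squared(doc, colour), 2))
```

A value close to 1 indicates, as in the study, that water colour is an effective proxy for dissolved organic carbon in these lochs.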
Procedia PDF Downloads 546
110 Alternative Fuel Production from Sewage Sludge
Authors: Jaroslav Knapek, Kamila Vavrova, Tomas Kralik, Tereza Humesova
Abstract:
The treatment and disposal of sewage sludge is one of the most important and critical problems of waste water treatment plants. Currently, 180 thousand tonnes of sludge dry matter are produced in the Czech Republic per year, which corresponds to approximately 17.8 kg of stabilized sludge dry matter per inhabitant of the Czech Republic per year. Because sewage sludge contains a large amount of substances that are harmful to human health, the conditions for sludge management will be significantly tightened in the Czech Republic from 2023. One of the tested methods of sludge disposal is the production of alternative fuel from the sludge of sewage treatment plants and paper production. The paper presents an analysis of the economic efficiency of alternative fuel production from sludge and its use in a fluidized bed boiler with a nominal consumption of 5 t of fuel per hour. The evaluation methodology covers the entire logistics chain, from sludge extraction, through mechanical moisture reduction to about 40% and transport to the pelletizing line, to drying for pelleting and the pelleting itself. For the economic analysis of sludge pellet production, a time horizon of 10 years is chosen, corresponding to the expected lifetime of the critical components of the pelletizing line. The economic analysis of pelleting projects is based on a detailed analysis of reference pelleting technologies suitable for sludge pelleting. The analysis of the economic efficiency of pellets is based on the simulation of the cash flows associated with the implementation of the project over its lifetime. For a given required return on the invested capital, the price of the resulting product (in EUR/GJ or in EUR/t) is sought that makes the net present value of the project zero over the project lifetime. The investor then realizes a return on the investment equal to the discount rate used to calculate the net present value. 
The calculations take place in a real business environment (taxes, tax depreciation, inflation, etc.), and the inputs use market prices. At the same time, the opportunity cost principle is respected; the value of using waste as alternative fuel includes the avoided costs of waste disposal. The methodology also accounts for the emission allowances saved due to the displacement of coal by the alternative (bio)fuel. Preliminary results of testing pellet production from sludge show that, after suitable modifications of the pelletizer, it is possible to produce pellets of sufficiently high quality from sludge. A mixture of sludge and paper waste has proved to be a more suitable material for pelleting. At the same time, preliminary results of the analysis of the economic efficiency of this sludge disposal method show that, despite the relatively low calorific value of the fuel produced (about 10-11 MJ/kg), this sludge disposal method is economically competitive. This work has been supported by the Czech Technology Agency within the project TN01000048 Biorefining as circulation technology. Keywords: alternative fuel, economic analysis, pelleting, sewage sludge
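The pricing logic described above, searching for the product price at which the project's net present value is zero for a required return, can be sketched as follows. Every cash-flow figure below (investment, throughput, operating cost, avoided disposal cost, discount rate) is an invented placeholder, not a number from the study.

```python
# Hedged sketch of the NPV-zero price search. All inputs are
# hypothetical; the study's model additionally handles taxes, tax
# depreciation, inflation, and saved emission allowances.

def npv(price_eur_t, rate=0.08, years=10,
        tonnes_per_year=8000.0, invest=2.0e6, opex=450e3,
        avoided_disposal=120e3):
    """NPV of the pelleting project at a given pellet price [EUR/t]."""
    annual = price_eur_t * tonnes_per_year + avoided_disposal - opex
    return -invest + sum(annual / (1 + rate) ** t
                         for t in range(1, years + 1))

def breakeven_price(lo=0.0, hi=500.0, tol=1e-6):
    """Bisection for the price where NPV crosses zero."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

price = breakeven_price()
```

At this break-even price the investor earns exactly the required return embedded in the discount rate, which is the interpretation the abstract gives.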
Procedia PDF Downloads 135
109 The Politics of Identity and Retributive Genocidal Massacre against Chena Amhara under International Humanitarian Law
Authors: Gashaw Sisay Zenebe
Abstract:
The Northern-Ethiopian conflict that broke out on 04 November 2020 between the central government and the TPLF caused destruction beyond imagination in all aspects; millions of people have been killed, including civilians, mainly women and children. Civilians have been indiscriminately attacked simply because of their ethnic or religious identity. The warring parties committed serious crimes of international concern contrary to International Humanitarian Law (IHL). The House of People's Representatives (HPR) declared that the terrorist Tigrean Defense Force (TDF), encompassing all segments of its people, waged war against North Gondar through human flooding. On 30 August 2021, after midnight, the TDF launched a surprise attack against the Chena people, who were drunk and deeply asleep following the annual festivity. Unlike in the lowlands, however, the ENDF joined with the local people to fight the TDF in these highland areas. This research examines identity politics and the consequent genocidal massacre of Chena, including the human and physical destruction that occurred as a result of the armed conflict. As such, the study could benefit international entities by helping them develop a better understanding of what happened in Chena and by triggering interest in ensuring accountability and the enforcement of IHL in the future. Preserving fresh evidence will also serve as a starting point on the road to achieving justice, either nationally or internationally. To study the Chena case against the rules of IHL, a combination of qualitative and doctrinal research methodology has been employed. The study follows a single case-study design, using primary data tools such as observation, interviews, key-informant interviews, FGDs, and battlefield notes. 
To supplement these, secondary sources, including books, journal articles, domestic laws, international conventions, reports, and media broadcasts, were used to give meaning to what happened on the ground in light of international law. The study found that the war was waged to separate Tigray from Ethiopia. While undertaking military operations to achieve this goal, mass killings, genocidal acts, and war crimes were committed in Chena and nearby sites in the Dabat district of North Gondar. Hundreds of people lost their lives to the brutality of the mass killings, hundreds were subjected to forcible disappearance, and tens of thousands were forced into displacement. Furthermore, harsh beatings, forced labor, slavery, torture, rape, and gang rape have been reported, and people were generally subjected to cruel, inhuman, and degrading treatment and punishment. What is so unique is that animals were killed indiscriminately, making the environment unsafe for human survival because of pollution and bad smells and consequent diseases such as cholera, flu, and diarrhea. In addition to the TDF, the ENDF's shelling caused destruction to farmers' houses and claimed lives. According to humanitarian principles, acts that can establish MACs and war crimes were perpetrated. Generally, the war in this direction has shown an absolute disrespect for the norms of international law. Keywords: genocide, war crimes, Tigray Defense Force, Chena, IHL
Procedia PDF Downloads 71
108 Spatial Accessibility Analysis of Kabul City Public Transport
Authors: Mohammad Idrees Yusofzai, Hirobata Yasuhiro, Matsuo Kojiro
Abstract:
Kabul is the capital of Afghanistan and its educational and industrial focal point. The population of Kabul has grown recently and will keep increasing because of the return of refugees and the movement of people from other provinces to Kabul city. With this increase in population, issues of urban congestion and other related problems of urban transportation in Kabul city arise. One of these problems is the public transport (large bus) service, which needs to be modified and enhanced, especially the large bus routes operating in each of the 22 zones of Kabul city. To achieve the above-mentioned goal of improving public transport, spatial accessibility analysis is one of the important attributes for assessing the effectiveness of the transportation system and the urban transport policy of a city, because accessibility indicators are an alternative tool to support public policy aimed at reinforcing sustainable urban space. The case study of this research compares the present model (present bus routes) and a modified model of public transport. In the present model, the bus routes in most zones are active but have low frequency and unpublished schedules, and accessibility is analyzed in four cases based on the accessibility variables. In the modified model, all zones of Kabul are taken into consideration, with specified origins and high frequency. The number of frequencies is kept high; however, this number is based on the number of buses the Millie Bus Enterprise Authority (MBEA) owns. The same four cases are applied to the modified model to determine its best accessibility. Indeed, the modified model has a positive impact on the congestion level in Kabul city. Besides, person trips and trip distribution have also been analyzed to show how people move in the study area by each mode of transportation. 
So, the general aims of this research are to assess the present movement of people, identify zones in need of public transport, and assess the equity level of accessibility in Kabul city. The methodological framework used in this research is based on the gravity model of accessibility; besides, the generalized cost (time) of travel by each travel mode is calculated. The main data come from a person trip survey, socio-economic characteristics, and demographic data from the Japan International Cooperation Agency's 2008 study of Kabul city, as well as from previous research on travel patterns; the remaining data regarding present bus lines and routes come from the MBEA. In conclusion, this research identifies zones where public transport accessibility is high and where it is low. It was found that in both models the downtown area, or central zones, of Kabul city has a high level of accessibility. Moreover, the present model is the least favorable compared with the modified model based on the accessibility analysis. Keywords: accessibility, bus generalized cost, gravity model, public transportation network
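The gravity model of accessibility that the framework is based on can be sketched in a few lines: each origin zone's accessibility is the sum of opportunities in all destination zones, discounted by an impedance function of the generalized travel cost. The negative-exponential impedance and all zone figures below are illustrative assumptions, not Kabul survey data.

```python
# Minimal gravity-accessibility sketch: A_i = sum_j O_j * exp(-beta * c_ij),
# where c_ij is the generalized travel cost (time). Zone data invented.

import math

def accessibility(opportunities, cost_row, beta=0.1):
    """Gravity accessibility of one origin zone.

    opportunities[j]: opportunities (e.g. jobs) in zone j
    cost_row[j]:      generalized travel cost from the origin to j, minutes
    """
    return sum(o * math.exp(-beta * c)
               for o, c in zip(opportunities, cost_row))

opportunities = [1200, 800, 400]     # hypothetical zone sizes
cost_from_zone0 = [5.0, 20.0, 45.0]  # hypothetical travel times
a0 = accessibility(opportunities, cost_from_zone0)
```

Raising bus frequency lowers waiting time, hence the generalized cost, which is how the modified model improves the accessibility scores of outlying zones relative to the present model.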
Procedia PDF Downloads 193
107 A Conceptual Framework of Integrated Evaluation Methodology for Aquaculture Lakes
Authors: Robby Y. Tallar, Nikodemus L., Yuri S., Jian P. Suen
Abstract:
Research in ecological water resources management is full of non-trivial questions, and it seems today to be one branch of science that can strongly contribute to the study of complexity (physical, biological, ecological, socio-economic, environmental, and other aspects). Much of the existing literature on the different facets of these studies is technical and targeted at specific users. This study combined all these aspects in an evaluation methodology for aquaculture lakes, with a paradigm referring to hierarchical theory and to the effects of the specific spatial arrangement of an object within a space or local area. The process of developing the conceptual framework therefore draws the most integrated and applicable concepts from grounded theory. A design of an integrated evaluation methodology for aquaculture lakes is presented. The method is based on the identification of a series of attributes which can be used to describe the status of aquaculture lakes, using indicators from the aquaculture water quality index (AWQI), the aesthetic aquaculture lake index (AALI), and the rapid appraisal for fisheries index (RAPFISH). The preliminary preparation was accomplished as follows: first, the study area was characterized at different spatial scales. Second, an inventory of core data resources was compiled, including the city master plan, water quality reports from the environmental agency, and related government regulations. Third, a ground-checking survey was completed to validate the on-site condition of the study area. To design the integrated evaluation methodology for aquaculture lakes, we finally integrated and developed a rating score system called the Integrated Aquaculture Lake Index (IALI). The development of the IALI reflects a compromise among all aspects, and it responds to the need for concise information about the current status of aquaculture lakes through a comprehensive approach. 
The IALI was elaborated as a decision-aid tool for stakeholders to evaluate the impact and contribution of anthropogenic activities on the aquaculture lake environment. The conclusion was that, while there is no denying the fact that aquaculture lakes are under great threat from the pressure of increasing human activities, one must realize that no evaluation methodology for aquaculture lakes can succeed by insisting on keeping them in pristine condition. The IALI developed in this work can be used as an effective, low-cost evaluation methodology of aquaculture lakes for developing countries, because it emphasizes simplicity and understandability: it must communicate to decision makers and experts alike. Moreover, stakeholders need to be helped to perceive their lakes so that sites can be accepted and valued by local people. For lake development, the accessibility and planning designation of the site are of decisive importance: local people want to know whether the lake's condition is safe and whether it can be used. Keywords: aesthetic value, AHP, aquaculture lakes, integrated lakes, RAPFISH
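The rating-score combination behind the IALI can be sketched as a weighted average of the three sub-indices, with AHP-style weights. The weights and sub-index scores below are illustrative assumptions; the study derives its own weighting and rating scales.

```python
# Hedged sketch of combining sub-index scores (AWQI, AALI, RAPFISH)
# into one integrated index with AHP-style weights. Numbers invented.

def iali(scores, weights):
    """Weighted average of sub-index scores on a common 0-100 scale."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(weights[k] * scores[k] for k in weights)

weights = {"AWQI": 0.5, "AALI": 0.2, "RAPFISH": 0.3}    # hypothetical AHP weights
scores = {"AWQI": 72.0, "AALI": 55.0, "RAPFISH": 64.0}  # hypothetical scores
index = iali(scores, weights)
```

With these illustrative numbers the index works out to 0.5*72 + 0.2*55 + 0.3*64 = 66.2; a single number of this kind is exactly the kind of concise, communicable summary the abstract argues decision makers need.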
Procedia PDF Downloads 237
106 Recovery of Food Waste: Production of Dog Food
Authors: K. Nazan Turhan, Tuğçe Ersan
Abstract:
The population of the world is approximately 8 billion and is increasing rapidly, leading to an increase in consumption. This situation causes crucial problems, and food waste is one of them. The Food and Agriculture Organization of the United Nations (FAO) defines food waste as the discarding or alternative utilization of food that is safe and nutritious for human consumption along the entire food supply chain, from primary production down to the household consumer level. In addition, according to FAO estimates, one-third of all food produced for human consumption is lost or wasted worldwide every year. Wasting food endangers natural resources and causes hunger. For instance, excessive amounts of food waste cause greenhouse gas emissions, contributing to global warming. Therefore, waste management has been gaining significance in the last few decades at both local and global levels due to the expected scarcity of resources for the increasing population of the world. There are several ways to recover food waste. According to the United States Environmental Protection Agency's Food Recovery Hierarchy, the food waste recovery routes are, from most to least preferred: source reduction, feeding hungry people, feeding animals, industrial uses, composting, and landfill/incineration. Bioethanol, biodiesel, biogas, agricultural fertilizer, and animal feed can be obtained from the food waste generated by different food industries. In this project, feeding animals was selected as the food waste recovery method, and the food waste of a single plant was used to provide ingredient uniformity. Grasshoppers were used as a protein source. In other words, the project developed a dog food product by recovering the plant's food waste through the following steps: the collected food waste and purchased grasshoppers were sterilized, dried, and pulverized, and then all mixed with 60 g of agar-agar solution (4% w/v). 
Three different aromas were added separately to the samples to enhance flavour quality. Since the required amounts differ among dogs, fulfilling all nutritional needs is one of the challenges. In other words, there is a wide range of nutritional needs in terms of carbohydrates, protein, fat, sodium, calcium, and so on, and the requirements differ depending on age, gender, weight, height, and breed. Therefore, the product that was developed contains average amounts of each substance so as not to cause any deficiency or surplus. On the other hand, it contains more protein than similar products on the market. The product was evaluated in terms of contamination and nutritional content. For contamination risk, E. coli and Salmonella detection tests were performed, and the results were negative. For the nutritional value test, protein content analysis was done; the protein contents of different samples ranged between 26.07% and 33.68%. In addition, water activity analysis was performed, and the water activity (aw) values of different samples ranged between 0.2456 and 0.4145. Keywords: food waste, dog food, animal nutrition, food waste recovery
Procedia PDF Downloads 63
105 Technology Management for Early Stage Technologies
Authors: Ming Zhou, Taeho Park
Abstract:
Early stage technologies have been particularly challenging to manage due to the high degree of uncertainty surrounding them. Most results coming directly out of a research lab tend to be at an early, if not infant, stage. A long and uncertain commercialization process awaits these lab results. The majority of such lab technologies go nowhere and never get commercialized, for various reasons; any effort or financial resources put into managing them turn out fruitless. The high stakes naturally call for better outcomes, which makes the patenting decision harder, since a good and well-protected patent goes a long way toward commercialization of a technology. Our preliminary research showed that there was no simple yet productive procedure for such valuation. Most studies to date have been theoretical and overly comprehensive, with practical suggestions non-existent. Hence, we attempted to develop a simple and highly implementable procedure for efficient and scalable valuation. We thoroughly reviewed existing research, interviewed practitioners in the Silicon Valley area, and surveyed university technology offices. Instead of presenting another theoretical and exhaustive study, we aimed at developing practical guidance that a government agency and/or university office could easily deploy to get things moving to the later steps of managing early stage technologies. We provide a procedure to value a technology thriftily and make the patenting decision. A patenting index was developed using survey data and expert opinions. We identified the most important factors to be used in the patenting decision using survey ratings; the ratings then assisted us in generating good relative weights for the later scoring and weighted-averaging step. More importantly, we validated our procedure by testing it with our practitioner contacts, whose inputs produced a general yet highly practical cut schedule. 
Such a schedule of realistic practices has yet to be witnessed in current research. Although a technology office may choose to deviate from our cuts, what we offer here at least provides a simple and meaningful starting point. This procedure was welcomed by practitioners in our expert panel and by university officers in our interview group. This research contributes to the current understanding and practice of managing early stage technologies by instating a heuristically simple yet theoretically solid method for the patenting decision. Our findings generated the top decision factors, decision processes, and decision thresholds of key parameters. This research offers a more practical perspective which further completes the extant knowledge. Our results could be affected by our sample size and even biased a little by our focus on the Silicon Valley area. Future research, blessed with bigger data and more insights, may want to further train and validate our parameter values in order to obtain more consistent results and to analyze the decision factors for different industries. Keywords: technology management, early stage technology, patent, decision
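The scoring and weighted-averaging step, followed by a cut on the resulting index, can be sketched as below. The factor names, weights, and the 3.5 threshold are hypothetical stand-ins; the paper's actual factors and weights come from its survey ratings and expert panel.

```python
# Hedged sketch of a weighted-average patenting index with a cut
# schedule. Factors, weights, and threshold are invented placeholders.

FACTOR_WEIGHTS = {          # hypothetical relative weights, sum to 1
    "market_potential": 0.35,
    "technical_maturity": 0.25,
    "novelty": 0.25,
    "freedom_to_operate": 0.15,
}

def patent_index(ratings, weights=FACTOR_WEIGHTS):
    """Weighted average of 1-5 expert ratings per factor."""
    return sum(weights[f] * ratings[f] for f in weights)

def patent_decision(ratings, threshold=3.5):
    """'file' if the index clears the cut, else 'defer'."""
    return "file" if patent_index(ratings) >= threshold else "defer"

ratings = {"market_potential": 4, "technical_maturity": 3,
           "novelty": 5, "freedom_to_operate": 2}
decision = patent_decision(ratings)
```

The appeal of this shape is exactly what the abstract claims for its procedure: a technology office can re-weight the factors or move the threshold without changing the mechanics.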
Procedia PDF Downloads 342