Search results for: forward modeling
1802 The Effect of Information Technology on the Quality of Accounting Information
Authors: Mohammad Hadi Khorashadi Zadeh, Amin Karkon, Hamid Golnari
Abstract:
This study investigated the impact of information technology on the quality of accounting information; it was carried out in 2014. From a population of 425 executives of companies listed on the Tehran Stock Exchange, a sample of 84 managers was selected using Cochran's formula and simple random sampling. Data were collected by questionnaire: the information technology items were drawn from standardized questionnaires, and the remaining questions were designed according to the existing components. After distribution and collection of the questionnaires, data analysis and hypothesis testing were performed with structural equation modeling in SmartPLS 2 software, in two parts: the measurement model and the structural model. In the first part, the technical characteristics of the questionnaire, including reliability and convergent and divergent validity, were checked for PLS; in the second part, significance coefficients were used to test the research hypotheses. The results showed that information technology and its dimensions (timeliness, relevance, accuracy, adequacy, and transfer speed) affect the quality of accounting information of companies listed on the Tehran Stock Exchange.
Keywords: information technology, information quality, accounting, transfer speed
Procedia PDF Downloads 277
1801 Modeling of Surge Corona Using Type94 in Overhead Power Lines
Authors: Zahira Anane, Abdelhafid Bayadi
Abstract:
Corona in HV overhead transmission lines is an important source of attenuation and distortion of overvoltage surges. This distortion, which is superimposed on the distortion due to the skin effect, is caused by the dissipation of energy through the injection of space charges around the conductor; this process takes place as soon as the instantaneous voltage exceeds the corona threshold voltage of the conductors. This paper presents a mathematical model to determine the corona inception voltage, the critical electric field, and the corona radius, in order to predict the capacitive changes at the conductor of a transmission line due to corona. The model has been incorporated into the Alternative Transients Program version of the Electromagnetic Transients Program (ATP/EMTP) as a user-defined component, using the MODELS interface with the NORTON Type94 component of this program and a foreign subroutine. The displacement of the corona charge shell is computed with the dichotomy (bisection) method. The present corona model can be used to compute the distortion and attenuation of transient overvoltage waves propagating along a very high voltage transmission line.
Keywords: high voltage, corona, Type94 NORTON, dichotomy, ATP/EMTP, MODELS, distortion, foreign model
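The dichotomy method mentioned for locating the corona charge boundary is ordinary bisection. A minimal sketch, using an assumed coaxial geometry and an assumed critical field rather than the paper's full corona model: the corona radius is taken as the distance at which the radial field of a cylindrical conductor falls to the critical value.

```python
import math

def bisect(f, a, b, tol=1e-9, max_iter=200):
    """Dichotomy (bisection): find a root of f in [a, b], assuming a sign change."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f must change sign on [a, b]"
    for _ in range(max_iter):
        m = 0.5 * (a + b)
        fm = f(m)
        if abs(fm) < tol or (b - a) < tol:
            return m
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

# Illustrative coaxial geometry (all values assumed, not from the paper)
r0, D = 0.01, 10.0   # conductor radius and return-path distance, m
V = 500e3            # applied voltage, V
Ec = 2.2e6           # assumed corona-sustaining field, V/m

# Radial field of a cylindrical conductor: E(r) = V / (r * ln(D / r0))
E = lambda r: V / (r * math.log(D / r0))

# Corona radius: distance at which the field falls to Ec
rc = bisect(lambda r: E(r) - Ec, r0, D)
print(round(rc, 4))
```

The same root-finding loop applies unchanged when E(r) is replaced by a space-charge-corrected field expression.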
Procedia PDF Downloads 625
1800 Emoji, the Language of the Future: An Analysis of the Usage and Understanding of Emoji across User-Groups
Authors: Sakshi Bhalla
Abstract:
On the one hand, given their seemingly simplistic, near-universal usage and understanding, emoji are dismissed as a potential step back in the evolution of communication. On the other, their effectiveness, pervasiveness, and adaptability across and within contexts are undeniable. In this study, the responses of 40 people (categorized by age) were recorded using a uniform two-part questionnaire in which they were required to (a) identify the meaning of 15 emoji placed in isolation, and (b) interpret the meaning of the same 15 emoji placed in a context-defining posting on Twitter. The context-based responses were analysed for deviation both from the same participants' responses to the emoji in isolation and from the originally intended meaning ascribed to each emoji. The analysis showed that each of the five age categories uses, understands, and perceives emoji differently, which could be attributed to their degree of exposure. For example, the youngest category (aged under 20) was the least accurate at identifying emoji in isolation (~55%), and its proclivity to change responses with respect to context was also the lowest (~31%). However, an analysis of their individual responses showed that these first-borns of social media seem to have reached a point where emoji no longer suggest their most literal meanings to them: the meaning and implication of these emoji have evolved to their context-derived meanings, even when the emoji are placed in isolation. These trends carry forward meaningfully for the other four groups as well. In the case of the oldest category (aged over 35), however, the trends indicated inaccuracy and, therefore, a higher proclivity to change responses. Studied as a continuum, the responses indicate that, slowly and steadily, emoji are evolving from pictograms to ideograms.
That is to say, they do not merely express a one-to-one relation between a singular form and a singular meaning; they communicate increasingly complicated ideas. This is much like the evolution of ancient hieroglyphics on papyrus reed, or of cuneiform on Sumerian clay tablets, from simple pictograms to progressively more complex ideograms. This evolution within communication is parallel to, and contingent on, the simultaneous evolution of its media. What is astounding is the capacity of humans to leverage different platforms to facilitate such changes. Twitterese, as it is now called, is one instance of language adapting to the demands of the digital world. That it has no spoken component and no ostensible grammar, and that it lacks standardization of use and meaning, may seem, as some suggest, impediments to qualifying it as the 'language' of the digital world. However, that kind of declaration remains a function of time, and time alone.
Keywords: communication, emoji, language, Twitter
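The two metrics reported (accuracy in isolation, proclivity to change with context) can be computed directly from coded responses. A toy sketch with invented labels for a single hypothetical participant:

```python
# Hypothetical coded responses for one participant across five emoji:
# the intended meaning, the response when shown in isolation, and the
# response when shown in a tweet (all labels invented for illustration)
intended   = ["joy", "sarcasm", "grief", "love", "anger"]
isolation  = ["joy", "joy",     "grief", "love", "fear"]
in_context = ["joy", "sarcasm", "grief", "love", "anger"]

# Accuracy in isolation: agreement with the intended meaning
accuracy_isolated = sum(a == b for a, b in zip(isolation, intended)) / len(intended)

# Proclivity to change: how often the context shifted the response
change_rate = sum(a != b for a, b in zip(isolation, in_context)) / len(intended)

print(accuracy_isolated, change_rate)
```

Aggregating these two rates per age band yields figures comparable to the ~55% and ~31% quoted for the youngest group.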
Procedia PDF Downloads 95
1799 Unbreakable Obedience of Safety Regulation: The Study of Authoritarian Leadership and Safety Performance
Authors: Hong-Yi Kuo
Abstract:
Leadership is a key factor in improving workplace safety, and there is an abundance of studies in Western academia supporting the positive effects of appropriate leadership on employee safety performance. However, little safety research focuses on Chinese leadership styles such as paternalistic leadership. To fill this gap, the present study examines the relationship between authoritarian leadership (one of the three components of paternalistic leadership) and safety outcomes. The study hypothesizes effects at different levels. First, at the group level, when an authoritarian leader treats safety as the most important task, there should be a positive effect on group safety outcomes through the strengthening of group safety norms via the emphasis on etiquette. Second, at the cross level, when a leader with an authoritarian style gives high priority to safety, employees may obey the safety rules more out of fear, owing to the emphasis on the leader's absolute authority; employees may therefore show more safety performance, increasing individual safety outcomes. Survey data will be collected from 50 manufacturing groups (each with more than 5 members and a leader), and a hierarchical linear modeling analysis will be conducted to test the hypotheses. Beyond the predicted results, the study aims to become a cornerstone of safety leadership research in Chinese academia and practice.
Keywords: safety leadership, authoritarian leadership, group norms, safety behavior, supervisor safety priority
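The cross-level design (50 groups of 5+ members, a leader-level predictor, member-level outcomes) can be illustrated with a small two-level simulation. A full analysis would use dedicated mixed-model software; here only the level-2 slope is recovered with plain NumPy, and every coefficient is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_groups, n_members = 50, 5  # design described in the abstract

# Level-2 predictor: leader's authoritarian safety priority (assumed scale)
leader_priority = rng.normal(0, 1, n_groups)

# Random intercepts: group safety norms strengthened by the leader's priority
# (the 0.6 effect size and noise levels are assumptions, not study results)
group_norms = 0.6 * leader_priority + rng.normal(0, 0.3, n_groups)

# Level-1 outcome: individual safety performance = group norm + personal noise
perf = np.repeat(group_norms, n_members) + rng.normal(0, 0.5, n_groups * n_members)

# Level-2 slope: regress group-mean performance on the leader predictor
group_means = perf.reshape(n_groups, n_members).mean(axis=1)
slope = np.polyfit(leader_priority, group_means, 1)[0]
print(round(slope, 2))
```

A hierarchical linear model would additionally partition the within-group and between-group variance rather than collapsing to group means.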
Procedia PDF Downloads 233
1798 Large Scale Method to Assess the Seismic Vulnerability of Heritage Buildings: Modal Updating of Numerical Models and Vulnerability Curves
Authors: Claire Limoge Schraen, Philippe Gueguen, Cedric Giry, Cedric Desprez, Frédéric Ragueneau
Abstract:
The Mediterranean area is characterized by numerous monumental or vernacular masonry structures illustrating old ways of building and living. These precious buildings are often poorly documented, present complex shapes and loadings, and are protected by the States, leading to legal constraints. The area also presents moderate to high seismic activity, and even moderate earthquakes can be magnified by local site effects and cause collapse or significant damage. Moreover, the structural resistance of masonry buildings, especially the less famous ones or those located in rural zones, has generally been lowered by many factors: poor maintenance, unsuitable restoration, ambient pollution, and previous earthquakes. Recent earthquakes prove that any damage to these architectural witnesses to our past is irreversible, making preventive action necessary. This means providing preventive assessments for hundreds of structures with no or few documents. In this context, we propose a general method, based on hierarchized numerical models, to provide preliminary structural diagnoses at a regional scale, indicating whether more precise investigations and models are necessary for each building. To this aim, we adapt different tools, some under development, such as photogrammetry, and some to be created, such as a preprocessor that builds meshes for FEM software starting from pictures, in order to allow dynamic studies of the buildings of the panel. We made an inventory of 198 baroque chapels and churches situated in the French Alps. Their structural characteristics were then determined through field surveys and the MicMac photogrammetric software. Using structural criteria, we defined eight types of churches and seven types of chapels. We studied their dynamic behavior with CAST3M, using the EC8 spectrum and accelerograms of the studied zone. This allowed us to quantify the effect of the needed simplifications in the most sensitive zones and to choose the most effective ones.
We also proposed threshold criteria based on the damage observed in the in situ surveys, in old pictures, and in the Italian code; these are relevant for linear models. To validate the structural types, we carried out a vibration measurement campaign using ambient vibratory noise and velocimeters. This also allowed us to validate the method on old masonry and to identify the modal characteristics of 20 churches. We then proceeded to a dynamic identification between numerical and experimental modes, and updated the linear models through material and geometrical parameters that are often unknown because of the complexity of the structures and materials. The numerically optimized values were verified against the measurements we made on the masonry components in situ and in the laboratory. We are now working on non-linear models redistributing the strains, in order to validate the damage threshold criteria used to compute the vulnerability curves of each defined structural type. Our current results show a good correlation between experimental and numerical data, validating the final modeling simplifications and the global method. We now plan to use non-linear analysis in the critical zones in order to test reinforcement solutions.
Keywords: heritage structures, masonry numerical modeling, seismic vulnerability assessment, vibratory measure
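The dynamic identification between numerical and experimental modes is typically scored with the Modal Assurance Criterion (MAC), which compares paired mode shapes; a value near 1 indicates a good match and guides the model updating. A sketch with hypothetical mode-shape vectors (the abstract does not publish its shapes, so these values are invented):

```python
import numpy as np

def mac(phi_num, phi_exp):
    """Modal Assurance Criterion between two mode-shape vectors (0..1)."""
    num = abs(phi_num @ phi_exp) ** 2
    return num / ((phi_num @ phi_num) * (phi_exp @ phi_exp))

# Hypothetical first bending mode sampled at five measurement points
phi_fem  = np.array([0.0, 0.31, 0.59, 0.81, 1.00])  # FEM prediction
phi_meas = np.array([0.0, 0.29, 0.62, 0.79, 0.98])  # ambient-noise identification

print(round(mac(phi_fem, phi_meas), 3))
```

Material and geometrical parameters of the linear model are then adjusted until the MAC values (and frequency errors) across the identified modes are acceptable.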
Procedia PDF Downloads 492
1797 Transparency Obligations under the AI Act Proposal: A Critical Legal Analysis
Authors: Michael Lognoul
Abstract:
In April 2021, the European Commission released its AI Act Proposal, the first policy proposal at the European Union level to target AI systems comprehensively, in a horizontal manner. The Proposal notably aims to achieve an ecosystem of trust in the European Union regarding AI, based on respect for fundamental rights. Among many other requirements, the AI Act Proposal imposes several generic transparency obligations on all AI systems for the benefit of the natural persons facing those systems (e.g. information on the AI nature of a system in case of interaction with a human). The Proposal also provides for more stringent transparency obligations, specific to AI systems that qualify as high-risk, for the benefit of their users, notably on the characteristics, capabilities, and limitations of the AI systems they use. Against that background, this research first presents all such transparency requirements in turn, as well as related obligations, such as the proposed obligations on record keeping. Secondly, it offers a legal analysis of their scope of application, of the content of the obligations, and of their practical implications. On the scope of the transparency obligations tailored for high-risk AI systems, the research notes that it seems relatively narrow, given the proposed legal definition of the notion of users of AI systems. Hence, where end-users do not qualify as users, they may receive only very limited information; this element might raise concern regarding the objective of the Proposal. On the content of the transparency obligations, the research highlights that the information that should benefit users of high-risk AI systems is both very broad and, from a technical perspective, specific. Therefore, the information required under those obligations seems to create, prima facie, an adequate framework to ensure trust for users of high-risk AI systems.
However, on the practical implications of these transparency obligations, concern arises from the potential technical illiteracy of users of high-risk AI systems. They might not have sufficient technical expertise to fully understand the information provided to them, despite the wording of the Proposal, which requires that information be comprehensible to its recipients (i.e. users). On this matter, the research points out that there could be, more broadly, an important divergence between the level of detail of the information required by the Proposal and the level of expertise of users of high-risk AI systems. As a conclusion, the research provides policy recommendations to tackle (part of) the issues highlighted. It notably recommends broadening the scope of the transparency requirements for high-risk AI systems to encompass end-users. It also suggests that principles of explanation, as put forward in the Guidelines for Trustworthy AI of the High-Level Expert Group, be included in the Proposal in addition to the transparency obligations.
Keywords: AI Act Proposal, explainability of AI, high-risk AI systems, transparency requirements
Procedia PDF Downloads 315
1796 Modeling Competition Between Subpopulations with Variable DNA Content in Resource-Limited Microenvironments
Authors: Parag Katira, Frederika Rentzeperis, Zuzanna Nowicka, Giada Fiandaca, Thomas Veith, Jack Farinhas, Noemi Andor
Abstract:
Resource limitations shape the outcome of competition between genetically heterogeneous pre-malignant cells. One example of such heterogeneity is the ploidy (DNA content) of pre-malignant cells. A whole-genome duplication (WGD) transforms a diploid cell into a tetraploid one and has been detected in 28-56% of human cancers. If a tetraploid subclone expands, it consistently does so early in tumor evolution, when cell density is still low and competition for nutrients is comparatively weak, an observation confirmed for several tumor types. WGD+ cells need more resources to synthesize increasing amounts of DNA, RNA, and proteins. To quantify resource limitations and how they relate to ploidy, we performed a pan-cancer analysis of WGD, PET/CT, and MRI scans. Segmentation of >20 different organs from >900 PET/CT scans was performed with MOOSE. We observed a strong correlation between organ-wide population-average estimates of oxygen and the average ploidy of cancers growing in the respective organ (Pearson R = 0.66; P = 0.001). In-vitro experiments using near-diploid and near-tetraploid lineages derived from a breast cancer cell line supported the hypothesis that DNA content influences glucose- and oxygen-dependent proliferation, death, and migration rates. To model how subpopulations with variable DNA content compete in the resource-limited environment of the human brain, we developed a stochastic state-space model of the brain (S3MB). The model discretizes the brain into voxels, whereby the state of each voxel is defined by 8+ variables that are updated over time: stiffness, oxygen, phosphate, glucose, vasculature, dead cells, migrating and proliferating cells of various DNA content, and treatment conditions such as radiotherapy and chemotherapy. Well-established Fokker-Planck partial differential equations govern the distribution of resources and cells across voxels. We applied S3MB to sequencing and imaging data obtained from a primary GBM patient.
We performed whole-genome sequencing (WGS) of four surgical specimens collected during the first and second surgeries of the GBM and used HATCHET to quantify its clonal composition and how it changed between the two surgeries. HATCHET identified two aneuploid subpopulations of ploidy 1.98 and 2.29, respectively. The low-ploidy clone was dominant at the time of the first surgery and became even more dominant upon recurrence. MRI images were available before and after each surgery and were registered to MNI space. The S3MB domain was initiated from 4 mm³ voxels of the MNI space. T1-post and T2-FLAIR scans acquired after the first surgery informed tumor cell densities per voxel; magnetic resonance elastography scans and PET/CT scans informed stiffness and glucose access per voxel. We performed a parameter search to recapitulate the GBM's tumor cell density and ploidy composition before the second surgery. Results suggest that the high-ploidy subpopulation had a higher glucose-dependent proliferation rate (0.70 vs. 0.49) but a lower glucose-dependent death rate (0.47 vs. 1.42). These differences resulted in spatial differences in the distribution of the two subpopulations. Our results contribute to a better understanding of how genomics and microenvironments interact to shape cell fate decisions and could help pave the way toward therapeutic strategies that mimic prognostically favorable environments.
Keywords: tumor evolution, intra-tumor heterogeneity, whole-genome doubling, mathematical modeling
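The voxel update loop of a model like S3MB can be sketched in one dimension: a resource diffuses between voxels while two subpopulations proliferate and die at resource-dependent rates. All rates below are assumed for illustration; they are not the fitted values reported above, and the real model tracks many more state variables per voxel.

```python
import numpy as np

nx, dt, D = 50, 0.01, 0.1       # voxels, time step, diffusion coefficient
glucose = np.ones(nx)           # resource field, initially uniform
low  = np.zeros(nx); low[20]  = 0.01   # low-ploidy clone seed
high = np.zeros(nx); high[30] = 0.01   # high-ploidy clone seed

def step(g, lo, hi):
    # Discrete Laplacian with periodic boundary, for simplicity
    lap = np.roll(g, 1) - 2 * g + np.roll(g, -1)
    g = g + dt * (D * lap - 0.5 * g * (lo + hi))   # diffusion minus consumption
    lo = lo + dt * lo * (0.49 * g - 0.20)          # birth minus death (assumed)
    hi = hi + dt * hi * (0.70 * g - 0.30)          # faster birth, higher need
    return g, lo, hi

for _ in range(1000):
    glucose, low, high = step(glucose, low, high)

print(round(glucose.mean(), 3), round(low.sum(), 4), round(high.sum(), 4))
```

With abundant glucose the higher-proliferation clone outgrows the other; as the resource is drawn down, the balance of glucose-dependent birth and death rates decides which clone persists, which is the competition the parameter search explores.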
Procedia PDF Downloads 73
1795 Experimental Measurements of Mean and Turbulence Quantities behind the Circular Cylinder by Attaching Different Number of Tripping Wires
Authors: Amir Bak Khoshnevis, Mahdieh Khodadadi, Aghil Lotfi
Abstract:
For a bluff body, roughness elements trigger a turbulent boundary layer, leading to delayed flow separation, a smaller wake, and lower form drag. In the present work, the flow past a circular cylinder fitted with tripping wires is studied experimentally. The wind tunnel used to produce the free stream is of the open-circuit blower type (maximum speed 30 m/s, maximum free-stream turbulence 0.1%). The Reynolds number was held constant for all tests (Re = 25000). The circular cylinder selected for this experiment is 20 mm in diameter and 400 mm in length. The aim of this research is to find the optimal operating configuration. Tripping wires 1 mm in diameter were installed in varying numbers on the circular cylinder, and the wake characteristics of the cylinder were studied. Results for 6, 8, and 10 wires showed that the optimal arrangement of 1 mm tripping wires on the cylinder is 6 wires at an angular spacing of 60°. The Strouhal number reached its maximum value for the cylinder with 1 mm tripping wires at the 60° angular position.
Keywords: wake of circular cylinder, trip wire, velocity defect, Strouhal number
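The Strouhal number reported is St = f·d/U, with the free-stream velocity fixed by the stated Reynolds number. A sketch using the experiment's cylinder diameter and Re, with an assumed shedding frequency (the abstract does not quote one):

```python
# Strouhal number from a measured vortex-shedding frequency
d = 0.020        # cylinder diameter, m (from the experiment)
nu = 1.5e-5      # kinematic viscosity of air, m^2/s (standard value)
Re = 25000       # Reynolds number held constant in all tests

U = Re * nu / d  # free-stream velocity implied by the Reynolds number
f = 197.0        # hypothetical shedding frequency from a hot-wire spectrum, Hz

St = f * d / U
print(round(U, 2), round(St, 3))
```

For a smooth circular cylinder at this Re, St is typically near 0.2; the experiment tracks how the tripping wires shift this value.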
Procedia PDF Downloads 402
1794 Control of Grid Connected PMSG-Based Wind Turbine System with Back-To-Back Converter Topology Using Resonant Controller
Authors: Fekkak Bouazza, Menaa Mohamed, Loukriz Abdelhamid, Krim Mohamed L.
Abstract:
This paper presents the modeling and control strategy for a grid-connected wind turbine system based on a Permanent Magnet Synchronous Generator (PMSG). The considered system uses a back-to-back converter topology. The Grid Side Converter (GSC) achieves DC bus voltage control and unity power factor, while the Machine Side Converter (MSC) ensures PMSG speed control. The PMSG is used as a variable-speed generator and is connected directly to the turbine without a gearbox; pitch angle control is not considered in this study. Further, an Optimal Tip Speed Ratio (OTSR) based MPPT control strategy is used to extract the maximum energy whatever the wind speed variations. A filter (L) is placed between the GSC and the grid to reduce current ripple and improve the quality of the injected power. The proposed grid-connected wind system is built in the MATLAB/Simulink environment. The simulation results show the feasibility of the proposed topology and the performance of its control strategies.
Keywords: wind, grid, PMSG, MPPT, OTSR
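OTSR-based MPPT reduces to commanding the rotor speed that keeps the tip speed ratio at its optimum, ω_ref = λ_opt·v/R, so the turbine always operates at its maximum power coefficient. A sketch with assumed turbine parameters (the abstract does not publish its blade radius or λ_opt):

```python
# Optimal Tip Speed Ratio MPPT: speed set-point tracks the wind speed
R = 2.0            # blade radius, m (assumed)
lambda_opt = 8.1   # tip speed ratio maximising the power coefficient (assumed)

def speed_reference(v_wind):
    """Rotor speed set-point (rad/s) for a given wind speed (m/s)."""
    return lambda_opt * v_wind / R

for v in (6.0, 9.0, 12.0):
    print(v, round(speed_reference(v), 2))
```

The MSC speed loop then regulates the PMSG to this reference, while the GSC independently holds the DC bus voltage.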
Procedia PDF Downloads 362
1793 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution
Authors: Haiyan Wu, Ying Liu, Shaoyun Shi
Abstract:
Authorship attribution is the task of extracting features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or by transparent machine learning methods gives a portrait of an author's writing style, but these methods do not capture syntactic (e.g., dependency relationships) or semantic (e.g., topic) information. In recent years, some researchers have modeled syntactic trees or latent semantic information with neural networks. However, few works combine the two. Besides, predictions by neural networks are difficult to explain, which is vital in authorship attribution tasks. In this paper, we not only utilize statistical style and content features but also take advantage of both syntactic and semantic features. Unlike an end-to-end neural model, our method separates feature selection and prediction into two steps: an attentive n-gram network is used to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features improve on the state-of-the-art methods on three benchmark datasets.
Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction
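The two-step pipeline (feature extraction, then logistic regression) can be approximated with scikit-learn. The attentive n-gram network is replaced here by a plain TF-IDF character n-gram vectorizer, so this is only a simplified stand-in for the paper's method, on an invented toy corpus:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for the benchmark datasets (texts are invented)
texts = ["the old harbour slept beneath a pale mist",
         "we argue that the model overfits the data",
         "a pale moon slept over the quiet harbour",
         "the data suggest the model generalises poorly"]
authors = ["novelist", "scientist", "novelist", "scientist"]

# Character n-grams capture style; the linear model's weights stay inspectable
clf = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 3)),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, authors)
pred = clf.predict(["the mist slept over the old moon"])[0]
print(pred)
```

Because the final stage is logistic regression, the per-feature coefficients give the "understandable representation of writing style" the abstract emphasises, which an end-to-end neural classifier would not.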
Procedia PDF Downloads 136
1792 Re-Reading the Impossibility of Identity: Modeling Gender Pluralism in Curriculum and Instruction
Authors: A. K. O’Loughlin
Abstract:
Identity does not exist in the discrete categories by which it is defined. Kevin Kumashiro develops the phrase 'an impossibility of identity' in Troubling Education (2000), an investigation of the intersections of culture and gender and of the impact of erasure on queer POC identity. This underscores how the insistence on an essential insider or outsider identity produces the appearance of 'contradiction', the impossibility of these identities. The contradictions between us as subjects in our own stories and in the stories of others are often silenced. This silencing of complex, 'contradicting' identity has unmistakable implications in the classroom; the developing student in question is done a serious disservice, from which they may never recover. There is no more important point of contact than the teacher, for the willingness to encounter a developing person as they are, not as we already think they are, or 'know' them to be, or think they should be. To decide how to regard them based on our own unilateral identity and its associated exhortations and injunctions is, as Hannah Arendt writes in The Origins of Totalitarianism (1951), to sell off our ability to rise, human-like, to the challenge of investigating things as they are. A re-reading of Kumashiro's impossibility of identity becomes possible through the investigation of pluralism. Identities become possible and un-paradoxical through the notion that contradictions are not problems: an individual is not unilateral, but plural. In this paper, we investigate how philosophies of pluralism can inform our understanding of the impossibility of identity in classroom curriculum and pedagogy.
Keywords: identity, gender, culture, pluralism, education, philosophy of education, queer theory, philosophy of mind, adolescent development
Procedia PDF Downloads 299
1791 Diverse High-Performing Teams: An Interview Study on the Balance of Demands and Resources
Authors: Alana E. Jansen
Abstract:
With such a large proportion of organisations relying on the use of team-based structures, it is surprising that so few teams would be classified as high-performance teams. While the impact of team composition on performance has been researched frequently, findings on its effects have been conflicting, particularly when team composition is examined alongside other team factors. To broaden the theoretical perspectives on this topic, and potentially to explain some of the inconsistencies left open by various models of team effectiveness and high-performing teams, the present study uses the Job Demands-Resources model, typically applied to burnout and engagement, as a framework to examine how team composition factors (particularly diversity in team member characteristics) can facilitate or hamper team effectiveness. The study used a virtual interview design in which participants were asked to rate and describe their experiences, in one high-performing and one low-performing team, across several factors relating to demands, resources, team composition, and team effectiveness. A semi-structured interview protocol was developed, combining Likert-style and exploratory questions. A semi-targeted sampling approach was used to invite participants ranging in age, gender, and ethnic appearance (common surface-level diversity characteristics) and from different specialties, roles, and educational and industry backgrounds (deep-level diversity characteristics). While the final stages of data analysis are still underway, thematic analysis using a grounded theory approach was conducted concurrently with data collection to identify the point of thematic saturation, resulting in 35 completed interviews. The analysis examines differences in perceptions of demands and resources as they relate to perceived team diversity.
Preliminary results suggest that high-performing and low-performing teams differ in their perceptions of the type and range of both demands and resources. The current research is likely to offer contributions to both theory and practice. The preliminary findings suggest there is a range of demands and resources that vary between high- and low-performing teams, factors which may play an important role in team effectiveness research going forward. The findings may help explain some of the more complex interactions between factors experienced in the team environment, making further progress towards understanding why only some teams achieve high-performance status.
Keywords: diversity, high-performing teams, job demands and resources, team effectiveness
Procedia PDF Downloads 187
1790 Comparison between the Efficiency of Heterojunction Thin Film InGaP\GaAs\Ge and InGaP\GaAs Solar Cell
Authors: F. Djaafar, B. Hadri, G. Bachir
Abstract:
This paper presents the design parameters for a thin-film triple-junction InGaP/GaAs/Ge solar cell with a simulated maximum efficiency of 32.11%, obtained using TCAD Silvaco. Design parameters include the doping concentrations, molar fractions, layer thicknesses, and tunnel junction characteristics. An initial dual-junction InGaP/GaAs model of a previously published heterojunction cell was simulated in TCAD Silvaco to accurately predict solar cell performance. To improve the solar cell's performance, we fixed the meshing, material properties, models, and numerical methods, while layer thicknesses and doping concentrations were taken as variables. We first simulated the InGaP/GaAs dual-junction cell, where changing the doping concentrations and thicknesses increased the efficiency. Next, a triple-junction InGaP/GaAs/Ge cell was modeled by adding a Ge layer to the previous dual-junction InGaP/GaAs model through an InGaP/GaAs tunnel junction.
Keywords: heterojunction, modeling, simulation, thin film, TCAD Silvaco
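The reported efficiency follows from the usual one-sun figures of merit via η = Voc·Jsc·FF/Pin. The values below are assumed, chosen only to land near the reported 32.11%; they are not the simulated cell's actual parameters:

```python
# Solar cell efficiency from standard figures of merit (values assumed)
voc = 2.98      # open-circuit voltage, V
jsc = 13.0e-3   # short-circuit current density, A/cm^2
ff = 0.83       # fill factor
p_in = 0.100    # AM1.5 one-sun input power density, W/cm^2

efficiency = voc * jsc * ff / p_in
print(f"{efficiency:.2%}")
```

In a multi-junction stack, Voc is roughly the sum of the subcell voltages, while Jsc is limited by the worst-performing (current-limiting) subcell, which is why the layer thicknesses and dopings are tuned for current matching.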
Procedia PDF Downloads 369
1789 Visual and Chemical Servoing of a Hexapod Robot in a Confined Environment Using Jacobian Estimator
Authors: Guillaume Morin-Duponchelle, Ahmed Nait Chabane, Benoit Zerr, Pierre Schoesetters
Abstract:
Industrial inspection can be achieved through robotic systems allowing visual and chemical servoing. A popular scheme for visual servo-controlled robots is the image-based servoing system. In this paper, an approach for visual and chemical servoing of a hexapod robot using visual and chemical Jacobian matrices is proposed. The basic idea behind the visual Jacobian matrix is to model the differential relationship between the camera system and the robot control system, so as to detect and track points of interest accurately in confined environments. This approach allows the robot to easily detect and navigate to a QR code, or to localize a gas source using the surge-cast algorithm. To track the QR code target, visual servoing based on the Jacobian matrix is used. For chemical servoing, three gas sensors are embedded on the hexapod; a Jacobian matrix applied to the gas concentration measurements allows estimating the direction of the main gas source. The effectiveness of the proposed scheme is first demonstrated in simulation. Finally, a hexapod prototype is designed and built, and the experimental validation of the approach is presented and discussed.
Keywords: chemical servoing, hexapod robot, Jacobian matrix, visual servoing, navigation
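One way to read the chemical-Jacobian idea: with three sensors at known body-frame positions, a local plane fit of concentration against position yields the concentration gradient, whose direction points toward the source. A sketch with an assumed sensor layout and a simulated plume (none of these values come from the paper):

```python
import numpy as np

# Three gas sensors at known body-frame positions (assumed layout, metres)
positions = np.array([[ 0.10,  0.00],
                      [-0.05,  0.08],
                      [-0.05, -0.08]])

# Simulated plume: concentration rises toward an assumed source at (2, 1)
source = np.array([2.0, 1.0])
readings = np.array([1.0 / np.linalg.norm(source - p) for p in positions])

# Least-squares plane fit c(p) ~ gx*x + gy*y + c0 gives the local gradient;
# its direction is the estimated heading toward the gas source
A = np.hstack([positions, np.ones((3, 1))])
gx, gy, _ = np.linalg.lstsq(A, readings, rcond=None)[0]
heading = np.degrees(np.arctan2(gy, gx))
print(round(heading, 1))
```

The true bearing to the source here is about 26.6°; the finite sensor spacing makes the gradient estimate only approximate, which is why a robust search strategy such as surge-cast is layered on top.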
Procedia PDF Downloads 125
1788 Prescription of Maintenance Fluids in the Emergency Department
Authors: Adrian Craig, Jonathan Easaw, Rose Jordan, Ben Hall
Abstract:
The prescription of intravenous fluids is a fundamental component of inpatient management, but it is one which usually lacks thought. Fluids are a drug, which like any other can cause harm when prescribed inappropriately or wrongly. However, it is well recognised that it is poorly done, especially in the acute portals. The National Institute for Health and Care Excellence (NICE) recommends 1mmol/kg of potassium, sodium, and chloride per day. With various options of fluids, clinicians tend to face difficulty in choosing the most appropriate maintenance fluid, and there is a reluctance to prescribe potassium as part of an intravenous maintenance fluid regime. The aim was to prospectively audit the prescription of the first bag of intravenous maintenance fluids, the use of urea and electrolytes results to guide the choice of fluid and the use of fluid prescription charts, in a busy emergency department of a major trauma centre in Stoke-on-Trent, United Kingdom. This was undertaken over a week in early November 2016. Of those prescribed maintenance fluid only 8.9% were prescribed a fluid which was most appropriate for their daily electrolyte requirements. This audit has helped to highlight further the issues that are faced in busy Emergency Departments within hospitals that are stretched and lack capacity for prompt transfer to a ward. It has supported the findings of NICE, that emergency admission portals such as Emergency Departments poorly prescribed intravenous fluid therapy. The findings have enabled simple steps to be taken to educate clinicians about their fluid of choice. This has included: posters to remind clinicians to consider the urea and electrolyte values before prescription, suggesting the inclusion of a suggested intravenous fluid of choice in the prescription chart of the trust and the inclusion of a session within the introduction programme revising intravenous fluid therapy and daily electrolyte requirements. 
Moving forward, once the interventions have been implemented, the data will be re-audited in six months to assess any improvement in maintenance fluid choice. Alongside this, an audit of the rate of intravenous maintenance fluid therapy is proposed to further increase patient safety by avoiding unintentional fluid overload, which may cause unnecessary harm to patients within the hospital. In conclusion, prescription of maintenance fluid therapy was poor within the Emergency Department, and there is a great deal of opportunity for improvement. Therefore, the measures listed above will be implemented and the data re-audited.
Keywords: chloride, electrolyte, emergency department, emergency medicine, fluid, fluid therapy, intravenous, maintenance, major trauma, potassium, sodium, trauma
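The NICE baseline cited in the abstract (roughly 1 mmol/kg/day each of potassium, sodium, and chloride) lends itself to a quick arithmetic check of whether a regime covers a patient's daily requirement. A minimal sketch in Python; the function names and the illustrative bag composition are hypothetical and not taken from the audit:

```python
def daily_electrolyte_requirement_mmol(weight_kg, mmol_per_kg=1.0):
    """NICE baseline: roughly 1 mmol/kg/day each of K+, Na+, and Cl-."""
    return weight_kg * mmol_per_kg

def regime_covers_requirement(bag_mmol_per_litre, litres_per_day, weight_kg):
    """Does a maintenance regime deliver at least the daily requirement
    of a given electrolyte?"""
    delivered = bag_mmol_per_litre * litres_per_day
    return delivered >= daily_electrolyte_requirement_mmol(weight_kg)

# Illustrative example only: a bag with 27 mmol/L of K+ run at 2 L/day
# delivers 54 mmol K+ for a 70 kg patient needing about 70 mmol.
adequate = regime_covers_requirement(27.0, 2.0, 70.0)
```

Such a check is trivially simple, which is why the abstract's proposed interventions (posters, prescription-chart prompts) target awareness rather than calculation.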
Procedia PDF Downloads 322
1787 BIM-Based Construction Noise Management Approach with a Focus on Inner-City Construction
Authors: Nasim Babazadeh
Abstract:
Growing demand for a quieter dwelling environment has turned the attention of construction companies to reducing the noise propagated by their projects. In inner-city construction, the short distance between the construction site and surrounding buildings lessens the efficiency of passive noise control methods. Dwellers of nearby areas may file complaints and lawsuits against construction companies over emitted construction noise, leading to interruption of processes, compensation costs, or even suspension of the project. Therefore, construction noise should be predicted along with the project schedule. The advantage of managing noise in the pre-construction phase is two-fold. Firstly, changes in the time plan and construction methods can be applied more flexibly, so costs related to rescheduling can be avoided. Secondly, noise-related legal problems can be expected to decrease. To implement noise mapping methods for this prediction, the required detailed information (such as the location of each noisy process and the duration of the noisy work) can be exported from the 4D BIM model. The results obtained from the noise maps are then used to help planners define different work scenarios. The proposed approach has been applied to the foundation and earthwork of a site located in a residential area, and the obtained results are discussed.
Keywords: building information modeling, construction noise management, noise mapping, 4D BIM
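The noise-mapping step this approach relies on can be illustrated with the standard free-field point-source relation Lp = Lw - 20·log10(r) - 11 plus an energetic sum over simultaneous processes. A minimal sketch, assuming free-field spherical spreading and ignoring barriers, ground effects, and directivity; the equipment data below are invented stand-ins for a 4D BIM export, not values from the paper:

```python
import math

def spl_at_receiver(lw_db, distance_m):
    """Sound pressure level from a free-field point source:
    Lp = Lw - 20*log10(r) - 11 (spherical spreading only)."""
    return lw_db - 20.0 * math.log10(distance_m) - 11.0

def combine_levels(levels_db):
    """Energetic (incoherent) sum of several simultaneous sources."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels_db))

# Hypothetical 4D-BIM export: (sound power level in dB, distance to receiver in m)
active_equipment = [(110.0, 20.0), (105.0, 35.0)]  # e.g. excavator, dump truck
total_db = combine_levels(
    [spl_at_receiver(lw, r) for lw, r in active_equipment])
```

Evaluating this per receiver point and per schedule interval is essentially what a noise map over a 4D model amounts to; production tools add attenuation terms for screening, ground, and air absorption.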
Procedia PDF Downloads 185
1786 Rohingya Refugees and Bangladesh: Balance of Human Rights and Rationalization
Authors: Kudrat-E-Khuda Babu
Abstract:
Rohingya refugees are among the most marginalized and persecuted people in the world. The heinous brutality of Myanmar has repeatedly forced this Muslim minority community to flee to its neighboring country, Bangladesh. The recent atrocities of the Buddhist-majority country have added insult to injury in the existing crisis. Instead of being protected, the Rohingya community in Myanmar has its rights violated through exclusion from citizenship and a steamroller of persecution. The major influxes of Rohingya refugees into Bangladesh took place in 1978, 1992, 2012, and 2017. At present, around one million Rohingyas are staying at Teknaf and Ukhiya in Cox's Bazar, in the southern part of Bangladesh. The country, despite being poverty-stricken, has shown unprecedented generosity in sheltering the Rohingya people. For sheltering half of the total refugees in 2017, the Prime Minister of Bangladesh, Sheikh Hasina, is now being regarded as a lighthouse of humanity, or the mother of humanity. Though Bangladesh has not ratified the UN Refugee Convention of 1951 or its Additional Protocol of 1967, the country cannot escape its obligations under international human rights jurisprudence. Bangladesh is a party to eight of the nine core human rights instruments and thus has an indirect obligation to protect and promote the rights of refugees. Pressure from international bodies has also obliged Bangladesh to provide refuge to the Rohingya people. Even though the demographic vulnerability and socio-economic condition of the country do not suggest taking on extra responsibility, the principle of non-refoulement, as part of customary international law, reminds us to stand beside those persecuted or believed to have a well-founded fear of persecution. In HM Ershad v. Bangladesh and Others, 7 BLC (AD) 67, it was held that an international treaty or document, after signing or ratification, is not directly enforceable unless and until the parliament enacts a similar statute, however sweet the document is. As per Article 33(2) of the 1951 Refugee Convention, there are exceptions for a state party in cases of serious consequences such as a threat to national security, apprehension of serious crime, and danger to the safety of the state's population. Bangladesh is now at a crossroads of human rights and national interest. The world community should come forward to resolve the crisis of the persecuted Rohingya people through repatriation, resettlement, and reintegration.
Keywords: Rohingya refugees, human rights, Bangladesh, Myanmar
Procedia PDF Downloads 188
1785 Study of the Late Phase of Core Degradation during Reflooding by Safety Injection System for VVER1000 with ASTECv2 Computer Code
Authors: Antoaneta Stefanova, Rositsa Gencheva, Pavlin Groudev
Abstract:
This paper presents the modeling approach for an SBO sequence in VVER 1000 reactors, describes reactor core behavior in the late in-vessel phase in case of late reflooding by the high-pressure injection system (HPIS), and gives preliminary results for ASTECv2 validation. The work focuses on plant behavior during total loss of power and on the operator actions. The main goal of these analyses is to assess the phenomena arising during a station blackout (SBO) followed by primary-side HPIS reflooding of an already damaged reactor core at a very late in-vessel phase. The purpose of the analysis is to define how late switching on of the HPIS can delay the time of vessel failure or possibly avoid vessel failure altogether. For this purpose, an SBO scenario was simulated with injection of cold water by a high-pressure pump (HPP) into the cold leg at different stages of core degradation. The times for HPP injection were chosen based on previously performed investigations.
Keywords: VVER, operator action validation, reflooding of overheated reactor core, ASTEC computer code
Procedia PDF Downloads 415
1784 Hydro-Sedimentological Evaluation in Itajurú Channel–Araruama Lagoon-RJ, Due to Superelevation of the Sea Level by Climate Change
Authors: Paulo José Sigaúque, Paulo Rosman
Abstract:
The Itajurú channel, located on the eastern side of the Araruama lagoon in Rio de Janeiro state, is the connection between the Araruama lagoon and the sea. It is important to understand the local hydrodynamic circulation and the effects of sedimentological processes, and also to estimate how hydrodynamic and sedimentological processes will behave in the future after sea level rise due to climate change. This work presents results of a study of sediment dynamics in the Araruama lagoon, focusing on the Itajurú channel region and considering both the present mean sea level and a foreseen sea level rise of 0.5 meters due to climate change. The study was conducted with the aid of computational modeling of hydrodynamics and morphodynamics in SisBaHiA®. The results indicate that the Araruama lagoon is composed of two hydrodynamic compartments: one dominated by the action of the tide, between the entrance of the channel and the strait of Perynas, and another dominated by the action of the wind, in the narrow region between the strait of Perynas and the western end of the lagoon. With sea level rise, the magnitude of current velocities and flow rates increases; consequently, the sediment transport flux from upstream to downstream of the Itajurú channel increases and has a greater effect at the Feliciano Sodré bridge.
Keywords: hydrodynamic, superelevation, sea level, climate change
Procedia PDF Downloads 305
1783 Tommy: Communication in Education about Disability
Authors: Karen V. Lee
Abstract:
The background and significance of this study involve communication in education, with a faculty advisor exploring story and music that inform others about a disabled teacher. The social issues draw deep reflection about the emotional turmoil. As becoming a teacher is a passionate yet complex endeavor for a musician, the faculty advisor shares a poetic but painful story about a disabled teacher being inducted into the teaching profession. The qualitative research method, as theoretical framework, draws on an autoethnography of music and story in which the faculty advisor approaches a professor for advice. His musicianship shifts her forward, backward, and sideways through feelings that evoke and provoke curriculum to remove communication barriers in education. They discover they do not transfer knowledge from educational method classes. Instead, the autoethnography embeds musical language as a metaphorical conduit for removing communication barriers in teacher education. Sub-themes involve communication barriers and educational technologies to ensure teachers receive social, emotional, physical, spiritual, and intervention disability resources that evoke visceral, emotional responses from the audience. The major findings of the study show how autoethnography of music and story brings the authors to understand the wider political issues of the practicum internship for teachers with disabilities. An epiphany reveals the irony of living in a culture of both uniformity and diversity. The authors explore the constructs of secrecy, ideology, abnormality, and marginalization by evoking visceral and emotional responses from the audience. As the voices harmonize plot, climax, characterization, and denouement, they dramatize meaning that is episodic yet incomplete, highlighting the circumstances surrounding the disabled protagonist's life. In conclusion, the qualitative research method argues for embracing storied experiences that depict communication in education. The scholarly significance embraces personal thoughts and feelings as a way of understanding social phenomena while highlighting the importance of removing communication barriers in education. The circumstance of a teacher with a disability is not uncommon in society. Thus, the authors resolve to remove barriers in education by using stories to transform the personal and cultural influences that provoke new ways of thinking about the curriculum for a disabled teacher.
Keywords: communication in education, communication barriers, autoethnography, teaching
Procedia PDF Downloads 240
1782 Modeling the Elastic Mean Free Path of Electron Collision with Pyrimidine: The Screen Corrected Additivity Rule Method
Authors: Aouina Nabila Yasmina, Chaoui Zine El Abiddine
Abstract:
This study presents a comprehensive investigation of the elastic mean free path (EMFP) of electrons colliding with pyrimidine, a precursor of the pyrimidine bases in DNA, employing the Screen Corrected Additivity Rule (SCAR) method. The SCAR method is introduced as a novel approach that combines classical and quantum mechanical principles to elucidate the interaction of electrons with pyrimidine. One of the most fundamental properties characterizing the propagation of a particle in a medium is its mean free path. Knowledge of the elastic mean free path is essential to accurately predict the effects of radiation on biological matter, as it determines the distances between collisions. Additionally, the mean free path plays a role in the interpretation of almost all experiments in which an excited electron moves through a solid. Pyrimidine has physicochemical properties that make it a compelling molecule to study from a fundamental point of view: a relatively large dipole polarizability and dipole moment, and an electronic charge cloud with a significant spatial extension, which justifies its choice for the present study.
Keywords: elastic mean free path, elastic collision, pyrimidine, SCAR
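The quantity under study obeys the elementary relation λ = 1/(nσ) between mean free path, target number density, and elastic cross section; the SCAR method described above refines the cross section σ itself. A minimal numerical sketch of the relation, with purely illustrative values rather than the paper's computed cross sections:

```python
def elastic_mean_free_path(number_density, cross_section):
    """lambda = 1 / (n * sigma).
    Units must be consistent: e.g. n in nm^-3 and sigma in nm^2
    give lambda in nm."""
    return 1.0 / (number_density * cross_section)

# Illustrative values only (not SCAR results): n = 0.05 nm^-3, sigma = 0.4 nm^2
lam = elastic_mean_free_path(0.05, 0.4)  # mean distance between elastic collisions, nm
```

The inverse dependence makes the point of the abstract concrete: any correction the screening makes to σ translates directly into the collision distances used in radiation-damage track simulations.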
Procedia PDF Downloads 64
1781 Reconceptualizing “Best Practices” in Public Sector
Authors: Eftychia Kessopoulou, Styliani Xanthopoulou, Ypatia Theodorakioglou, George Tsiotras, Katerina Gotzamani
Abstract:
Public sector managers frequently herald that implementing best practices as a set of standards may lead to superior organizational performance. However, recent research questions the objectification of best practices, highlighting: a) the inability of public sector organizations to develop innovative administrative practices, as well as b) the adoption of stereotypical renowned practices inculcated in the public sector by international governance bodies. The process through which organizations construe what a best practice is still remains a black box yet to be investigated, given the trend of continuous changes in public sector performance as well as the burgeoning interest in sharing popular administrative practices put forward by international bodies. This study aims to describe and understand how organizational best practices are constructed by public sector performance management teams, i.e., benchmarkers, during the benchmarking-mediated performance improvement process, and what mechanisms enable this construction. A critical realist action research methodology is employed, starting from a description of various approaches to the nature of best practice when a benchmarking-mediated performance improvement initiative, such as the Common Assessment Framework, is applied. Firstly, we observed the benchmarkers' management of best practices in a public organization so as to map their theories-in-use. As a second step, we contextualized best administrative practices by reflecting the different perspectives that emerged from the previous stage in the design and implementation of an interview protocol. We used this protocol to conduct 30 semi-structured interviews with “best practice” process owners in order to examine their experiences and performance needs. Previous research on best practices has shown that the needs and intentions of benchmarkers cannot be detached from the causal mechanisms of the various contexts in which they work. Such causal mechanisms can be found in: a) process owner capabilities, b) the structural context of the organization, and c) state regulations. Therefore, we developed an interview protocol that is theoretically informed in its first part, to spot causal mechanisms suggested by previous research, and supplemented it with questions regarding the provision of best practice support from the government. Findings of this work include: a) a causal account of the nature of best administrative practices in the Greek public sector that sheds light on their management, b) a description of the various contexts affecting best practice conceptualization, and c) a description of how their interplay changed the organization's best practice management.
Keywords: benchmarking, action research, critical realism, best practices, public sector
Procedia PDF Downloads 127
1780 Entrepreneurial Practice and Corruption in Tourism Sector: A Study of Entrepreneurial Orientation and Organizational Corruption in Nepali Star Hotels
Authors: Prabin Raj Gautam
Abstract:
Entrepreneurship in the tourism sector, particularly hotel entrepreneurship, has contributed to Nepal's gross domestic product (GDP). Tourist-standard and star hotels in developing countries have not only been generating revenue but also providing international hospitality to guests in local areas. To do so, these hotel enterprises must implement various business strategies to enhance and maintain their international business benchmark. Entrepreneurial orientation (EO) is at the core of business strategy making. Meanwhile, corruption is labeled a negative factor for economic development. This paper examines the relationship between the EO of Nepalese star hotels and organizational corruption. The study employed a questionnaire survey as the data collection tool under a quantitative methodology. Five hypotheses were developed and tested. After gathering data from 216 questionnaires distributed to CEOs/managers of the sample hotels, the findings show that, of the five dimensions of EO, autonomy, proactiveness, and innovativeness are not significantly related to organizational corruption, whereas risk-taking and competitive aggressiveness are found to be significant contributors. Descriptive statistics and structural equation modeling are employed to describe the data and fit the model.
Keywords: entrepreneurship, entrepreneurial orientation, organizational corruption, dimensions
Procedia PDF Downloads 318
1779 Driver Behavior Analysis and Inter-Vehicular Collision Simulation Approach
Authors: Lu Zhao, Nadir Farhi, Zoi Christoforou, Nadia Haddadou
Abstract:
Safety testing for the deployment of intelligent connected vehicles (ICVs) on the road network is a critical challenge. Road traffic network simulation can be used to test the functionality of ICVs; it is not only time-saving and less energy-consuming but can also create scenarios with car collisions. However, the relationship between different human driver behaviors and the occurrence of car collisions has not been clearly understood; meanwhile, the procedure for generating car collisions in numerical traffic simulators is not fully integrated. In this paper, we propose an approach that identifies specific driver profiles from real driving data and then replicates them in numerical traffic simulations with the purpose of generating inter-vehicular collisions. We propose three profiles: (i) 'aggressive', with a short time-headway; (ii) 'inattentive', with a long reaction time; and (iii) 'normal', with intermediate values of reaction time and time-headway. These three driver profiles are extracted from the NGSIM dataset and simulated using the intelligent driver model (IDM), with an extension for reaction time. Finally, inter-vehicular collisions are generated by varying the percentages of the different profiles.
Keywords: vehicular collisions, human driving behavior, traffic modeling, car-following models, microscopic traffic simulation
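The car-following core of this setup, the intelligent driver model (IDM), can be sketched directly from its standard acceleration law; the three profiles then differ only in parameters such as the desired time-headway T (shortened for 'aggressive') and an added reaction delay (lengthened for 'inattentive'). A minimal sketch with textbook default parameters, not the values calibrated on NGSIM in the paper:

```python
import math

def idm_acceleration(v, v_lead, gap, v0=33.0, T=1.5, a_max=1.0, b=1.5, s0=2.0):
    """Standard IDM acceleration of the following vehicle.
    v, v_lead in m/s; gap is the bumper-to-bumper distance in m.
    T (desired time-headway) is the knob shortened for the 'aggressive'
    profile; an 'inattentive' profile would instead feed in delayed
    inputs (v, v_lead, gap as observed one reaction time earlier)."""
    dv = v - v_lead  # closing speed
    s_star = s0 + v * T + v * dv / (2.0 * math.sqrt(a_max * b))  # desired gap
    return a_max * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)
```

An Euler step per vehicle (v += a*dt, x += v*dt) then advances the simulation, and a collision event is registered whenever a gap becomes non-positive.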
Procedia PDF Downloads 171
1778 Hybrid Knowledge and Data-Driven Neural Networks for Diffuse Optical Tomography Reconstruction in Medical Imaging
Authors: Paola Causin, Andrea Aspri, Alessandro Benfenati
Abstract:
Diffuse Optical Tomography (DOT) is an emergent medical imaging technique which employs NIR light to estimate the spatial distribution of optical coefficients in biological tissues for diagnostic purposes, in a noninvasive and non-ionizing manner. DOT reconstruction is a severely ill-conditioned problem due to the prevalent scattering of light in the tissue. In this contribution, we present our research on adopting hybrid knowledge-driven/data-driven approaches which exploit well-assessed physical models and build neural networks upon them, integrating the availability of data. Namely, since in this context regularization procedures are mandatory to obtain a reasonable reconstruction [1], we explore the use of neural networks as tools to include prior information on the solution.
2. Materials and Methods
The idea underlying our approach is to leverage neural networks to solve PDE-constrained inverse problems of the form
q* = argmin_q D(y, ỹ), (1)
where D is a loss function which typically contains a discrepancy measure (or data fidelity) term plus other possible ad-hoc designed terms enforcing specific constraints. In inverse problems like (1), one seeks the optimal set of physical parameters q given the set of observations y. Moreover, ỹ is the computable approximation of y, which may be obtained from a neural network but also in a classic way via the resolution of a PDE with given input coefficients (the forward problem, Fig. 1). Due to the severe ill-conditioning of the reconstruction problem, we adopt a two-fold approach: i) we restrict the solutions (optical coefficients) to lie in a lower-dimensional subspace generated by auto-decoder-type networks. This procedure forms priors on the solution (Fig. 1); ii) we use regularization procedures of the type
q̂* = argmin_q D(y, ỹ) + R(q),
where R(q) is a regularization functional depending on regularization parameters which can be fixed a priori or learned via a neural network in a data-driven modality. To further improve the generalizability of the proposed framework, we also infuse physics knowledge via soft penalty constraints (Fig. 1) into the overall optimization procedure.
3. Discussion and Conclusion
DOT reconstruction is severely hindered by ill-conditioning. The combined use of data-driven and knowledge-driven elements is beneficial and allows improved results to be obtained, especially with a restricted dataset and in the presence of variable sources of noise.
Keywords: inverse problem in tomography, deep learning, diffuse optical tomography, regularization
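The classical fixed-penalty instance of the regularized problem q̂* = argmin_q D(y, ỹ) + R(q) is Tikhonov regularization with a linearized forward operator, where D is the squared data misfit and R(q) = α‖q‖². A minimal sketch on a synthetic ill-conditioned operator, as a point of comparison for the learned regularizers described above (the operator and data below are toys, not a DOT forward model):

```python
import numpy as np

def tikhonov_solve(A, y, alpha):
    """Minimise ||A q - y||^2 + alpha * ||q||^2 via the normal equations,
    a fixed-parameter stand-in for the learned regulariser R(q)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

# Toy ill-conditioned linearised forward map (singular values span 1 .. 1e-6)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20)) @ np.diag(np.logspace(0, -6, 20))
q_true = rng.standard_normal(20)
y = A @ q_true + 1e-3 * rng.standard_normal(50)
q_hat = tikhonov_solve(A, y, alpha=1e-4)
```

Increasing α shrinks the solution toward zero, trading data fidelity for stability; the hybrid approach above replaces this generic penalty with priors learned by auto-decoder networks plus physics-based soft constraints.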
Procedia PDF Downloads 74
1777 Impact of Diabetes Mellitus Type 2 on Clinical In-Stent Restenosis in First Elective Percutaneous Coronary Intervention Patients
Authors: Leonard Simoni, Ilir Alimehmeti, Ervina Shirka, Endri Hasimi, Ndricim Kallashi, Verona Beka, Suerta Kabili, Artan Goda
Abstract:
Background: Diabetes mellitus type 2, small vessel caliber, stented vessel length, complex lesion morphology, and prior bypass surgery have been identified as risk factors for in-stent restenosis (ISR). However, there are contradictory results about body mass index (BMI) as a risk factor for ISR. Purpose: We aimed to identify clinical, lesional, and procedural factors that can predict clinical ISR in our patients. Methods: We enrolled 759 patients who underwent first-time elective PCI with bare metal stents (BMS) from September 2011 to December 2013 in our Department of Cardiology and followed them for at least 1.5 years, with a median of 862 days (2 years and 4 months). Only patients re-admitted with ischemic heart disease underwent control coronary angiography; no routine angiographic control was performed. Patients were categorized into ISR and non-ISR groups and compared. Multivariate analysis (binary logistic regression, forward conditional method) was used to identify independent predictive risk factors. P<0.05 was considered statistically significant. Results: ISR compared to non-ISR individuals had a significantly lower BMI (25.7±3.3 vs. 26.9±3.7, p=0.004), higher-risk anatomy (LM + 3-vessel CAD) (23% vs. 14%, p=0.03), a higher number of stents per person (2.1±1.1 vs. 1.75±0.96, p=0.004), a greater stented length per person (39.3±21.6 vs. 33.3±18.5, p=0.01), and lower use of clopidogrel and ASA together (95% vs. 99%, p=0.012). They also had a higher, although not statistically significant, prevalence of diabetes mellitus (42% vs. 32%, p=0.072) and a greater number of treated vessels (1.36±0.5 vs. 1.26±0.5, p=0.08). In the multivariate analysis, diabetes mellitus type 2 and multiple stents were independent predictive risk factors for in-stent restenosis, OR 1.66 [1.03-2.68], p=0.039, and OR 1.44 [1.16-1.78], p=0.001, respectively. On the other hand, higher BMI and the use of clopidogrel and ASA together emerged as protective factors, OR 0.88 [0.81-0.95], p=0.001, and OR 0.2 [0.06-0.72], p=0.013, respectively. Conclusion: Diabetes mellitus and multiple stents are strong predictive risk factors, whereas the use of clopidogrel and ASA together is protective against clinical in-stent restenosis. Paradoxically, high BMI is a protective factor for in-stent restenosis, probably related to the larger diameter of vessels and consequently the larger diameter of stents implanted in these patients. Further studies are needed to clarify this finding.
Keywords: body mass index, diabetes mellitus, in-stent restenosis, percutaneous coronary intervention
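The odds ratios quoted in the abstract come from exponentiating logistic-regression coefficients. A minimal self-contained sketch of that step, fitting an unpenalized logistic model by Newton-Raphson; the cohort below is simulated for illustration and is not the study data:

```python
import numpy as np

def logistic_odds_ratios(X, y, iters=25):
    """Fit logistic regression by Newton-Raphson and return the odds
    ratios exp(beta) for each predictor (intercept dropped).
    OR > 1 marks a risk factor, OR < 1 a protective factor."""
    Xd = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))            # fitted probabilities
        grad = Xd.T @ (y - p)                           # score vector
        hess = (Xd * (p * (1.0 - p))[:, None]).T @ Xd   # information matrix
        beta += np.linalg.solve(hess, grad)
    return np.exp(beta[1:])

# Simulated cohort: column 0 mimics a binary risk factor (e.g. diabetes),
# column 1 a continuous protective one (e.g. standardized BMI)
rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([rng.integers(0, 2, n), rng.normal(0.0, 1.0, n)])
logit = 0.8 * X[:, 0] - 0.4 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
ors = logistic_odds_ratios(X, y)
```

The stepwise "forward conditional" selection used in the study adds or drops predictors between such fits based on likelihood-ratio criteria; the OR computation itself is unchanged.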
Procedia PDF Downloads 210
1776 Optimization of Turbocharged Diesel Engines
Authors: Ebrahim Safarian, Kadir Bilen, Akif Ceviz
Abstract:
Turbochargers and turbocharging have become inherent components of diesel engines, through which critical engine parameters such as BSFC (brake specific fuel consumption), thermal efficiency, fuel consumption, BMEP (brake mean effective pressure), power density output, and emission levels have been improved extensively. In general, the turbocharger can be considered the most complex component of a diesel engine, because it closely interrelates the turbomachinery concepts of turbines and compressors with the thermodynamic fundamentals of internal combustion engines and the stress analysis of all components. In this paper, a wastegate for a conventional single-stage radial turbine is investigated by considering turbocharger operating constraints and engine operating conditions, without any detailed design of the turbine or the compressor. The wastegate opening, which ranges between fully open and fully closed, is determined by limiting the compressor boost pressure ratio. An optimum point with regard to the above-mentioned items is sought using three linked meanline modeling programs: the Turbomatch®, Compal®, and Rital® modules of Concepts NREC®, respectively.
Keywords: turbocharger, wastegate, diesel engine, Concepts NREC programs
Procedia PDF Downloads 243
1775 Scale Effects on the Wake Airflow of a Heavy Truck
Authors: Aude Pérard Lecomte, Georges Fokoua, Amine Mehel, Anne Tanière
Abstract:
Air quality in urban areas is deteriorated by pollution, mainly due to the constant increase in traffic of different types of ground vehicles. In particular, particulate matter pollution, which reaches significant concentrations in urban areas, can cause serious health issues. Characterizing and understanding particle dynamics is therefore essential to establish recommendations for improving air quality in urban areas. To analyze the effects of turbulence on the dispersion of particulate pollutants, the first step is to focus on the single-phase flow structure and turbulence characteristics in the wake of a heavy truck model. To achieve this, Computational Fluid Dynamics (CFD) simulations were conducted with the aim of modeling the wake airflow of a full- and a reduced-scale heavy truck. The Reynolds-Averaged Navier-Stokes (RANS) approach was used, with the Reynolds Stress Model (RSM) as the turbulence closure. The simulations highlight the appearance of a large vortex originating from under the trailer. This vortex belongs to the recirculation region located in the near-wake of the heavy truck. These vortical structures are expected to have a strong influence on the dynamics of particles emitted by the truck.
Keywords: CFD, heavy truck, recirculation region, reduced scale
Procedia PDF Downloads 218
1774 Harmonization of Accreditation Standards in Education of Central Asian Countries: Theoretical Aspect
Authors: Yskak Nabi, Onolkan Umankulova, Ilyas Seitov
Abstract:
The Tempus project “Central Asian Network for Quality Assurance – CANQA” was implemented in 2009-2012. As a result of the project, two accreditation agencies were established: the agency for quality assurance in the field of education “EdNet” in Kyrgyzstan and the Center of Progressive Technologies in Tajikistan. The importance of the project's research is supported by the idea that the creation of a Central Asian network for quality assurance in education is still relevant, and the results of the international forum “Global in Regional: Kazakhstan in the Bologna Process and EU Projects,” held in Nur-Sultan in October 2020, prove this. At the same time, the previous experience of partnership between the accreditation agencies of Central Asia shows that the recommendations elaborated within the CANQA project were not theoretically justified. But there are a number of facts and arguments that prove the practical applicability of these recommendations. In this respect, the joint activities of the accreditation agencies of Kyrgyzstan and Kazakhstan are representative. For example, the Independent Kazakh Agency of Accreditation and Rating successfully conducts accreditation of Kyrgyz universities; based on the memorandum on joint activity between the agency for quality assurance in the field of education “EdNet” (Kyrgyzstan) and the Astana accreditation agency (Kazakhstan), the latter provides its experts for accreditation procedures in EdNet. The exchange of experience among the agencies shows an effective approach to adapting European standards to the realities of the education systems of Central Asia, considering not only the legal framework but also European practice. Therefore, the relevance of the research lies in the fact that there is a practical partnership between the accreditation agencies of Central Asian countries but no theoretical justification of integration processes in the accreditation field. As a result, the following hypothesis was put forward: “if theoretical aspects of the harmonization of accreditation standards are developed, then integration processes will be improved, since the implementation of Bologna process principles will be supported with wider possibilities and, particularly, student and academic mobility will be improved.” Indeed, in Kazakhstan, for example, the total share of foreign students was 5.04% in 2020, most of them coming from Kyrgyzstan, Tajikistan, and Uzbekistan; if integration processes improve, this share can increase.
Keywords: accreditation standards in education, Central Asian countries, pedagogical theory, model
Procedia PDF Downloads 199
1773 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)
Authors: Robert Jacobsen
Abstract:
Flood inundation maps (FIMs) are an essential tool for communicating flood threat scenarios to the public as well as for floodplain governance. With increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, i.e., High Definition. Importantly, today's technology also enables the resolution of local (neighborhood-scale) bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying it to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved by applying RASP™ to the two kernel rasters is evaluated.
Keywords: hydrology, mapping, high-definition, inundation
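While RASP™ itself is the paper's method, the underlying raster arithmetic it implies, shifting a kernel water-surface-elevation (WSE) raster toward a refined scenario profile and recomputing depth against the terrain, can be sketched generically. A toy NumPy stand-in under that assumption, not the published procedure; the function name and sample values are hypothetical:

```python
import numpy as np

def adjust_depth_raster(kernel_wse, dem, profile_correction):
    """Apply a scenario-profile correction (scalar or per-cell, in the
    same vertical datum) to a kernel WSE raster, then recompute
    inundation depth; cells with non-positive depth are marked dry (NaN)."""
    depth = (kernel_wse + profile_correction) - dem
    return np.where(depth > 0.0, depth, np.nan)

# Toy 1x3 rasters: the +0.5 m correction wets the middle cell
wse = np.array([[10.0, 10.0, 10.0]])
dem = np.array([[9.0, 10.2, 11.0]])
depth = adjust_depth_raster(wse, dem, 0.5)
```

In practice the correction would be interpolated along the stream profile rather than applied as a single scalar, which is the refinement the scenario profiles provide.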
Procedia PDF Downloads 77