Search results for: grey code
431 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse-Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset only if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as an iterative method to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
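For reference, Fisher scoring iterates the standard update below until the parameter estimates converge; this is the generic textbook form, not the authors' exact Mathematica implementation:

\[
\hat{\beta}^{(k+1)} \;=\; \hat{\beta}^{(k)} \;+\; I\!\left(\hat{\beta}^{(k)}\right)^{-1} U\!\left(\hat{\beta}^{(k)}\right),
\]

where \(U(\beta)\) is the score vector (gradient of the log-likelihood) and \(I(\beta)\) is the Fisher information matrix; replacing the observed Hessian with its expectation \(I(\beta)\) is what distinguishes Fisher scoring from Newton-Raphson.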
Procedia PDF Downloads 344
430 Thermo-Mechanical Behavior of Steel-Wood Connections of Wooden Structures Under the Effect of a Fire
Authors: Ahmed Alagha, Belkacem Lamri, Abdelhak Kada
Abstract:
Steel-wood assemblies often have complex geometric configurations whose overall behavior under fire is conditioned by the thermal response, since they combine two materials, steel and wood, whose thermal characteristics are greatly influenced by high temperatures. The objective of this work is to study the thermal behavior of a steel-wood connection, with or without insulating material, subjected to the ISO834 standard fire model. The analysis is developed analytically, using the Eurocode, and numerically, by the finite element method, through the ANSYS calculation code. The design of the connections is evaluated at room temperature for the cases of single shear and double shear. The thermal behavior of the connections is simulated in the transient state, taking into account heat transfer by convection and by radiation. The variation of temperature as a function of time is evaluated at different positions in the connections, taking into account the heat produced and the formation of the char layer. The results concern the temperature distributions in the connection elements as a function of the duration of the fire. The results of the thermal analysis show that the temperature increases rapidly and reaches more than 260 °C in the steel material after one hour of exposure to fire. The temperature development in the wood material differs from that in steel because of its thermal properties: wood heats up on the outside and burns, and points on its surface can reach very high temperatures.
Keywords: Eurocode 5, finite elements, ISO834, simple shear, thermal behaviour, wood-steel connection
Procedia PDF Downloads 86
429 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts
Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti
Abstract:
Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-Code file. An Adaptive Mesh Refinement (AMR) based on the octree strategy is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization
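For orientation, under plane stress the Tsai-Wu criterion takes the standard quadratic form below (textbook notation, not coefficients from the paper); failure is predicted when the left-hand side reaches 1:

\[
F_1\sigma_1 + F_2\sigma_2 + F_{11}\sigma_1^2 + F_{22}\sigma_2^2 + F_{66}\sigma_6^2 + 2F_{12}\sigma_1\sigma_2 \;\ge\; 1,
\qquad
F_1 = \frac{1}{X_t} - \frac{1}{X_c},\quad
F_{11} = \frac{1}{X_t X_c},
\]

where \(X_t\) and \(X_c\) are the tensile and compressive strengths along direction 1 (and analogously \(F_2, F_{22}\) with \(Y_t, Y_c\)). The distinct \(X_t\) and \(X_c\) terms are exactly what lets the criterion capture the tension-compression asymmetry mentioned above.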
Procedia PDF Downloads 63
428 Optimization of Fused Deposition Modeling 3D Printing Process via Preprocess Calibration Routine Using Low-Cost Thermal Sensing
Authors: Raz Flieshman, Adam Michael Altenbuchner, Jörg Krüger
Abstract:
This paper presents an approach to optimizing the Fused Deposition Modeling (FDM) 3D printing process through a preprocess calibration routine of printing parameters. The core of this method involves the use of a low-cost thermal sensor capable of measuring temperatures within the range of -20 to 500 degrees Celsius for detailed process observation. The calibration process is conducted by printing a predetermined path while varying the process parameters through machine instructions (g-code). This enables the extraction of critical thermal, dimensional, and surface properties along the printed path. The calibration routine utilizes computer vision models to extract features and metrics from the thermal images, including temperature distribution, layer adhesion quality, surface roughness, and dimensional accuracy and consistency. These extracted properties are then analyzed to optimize the process parameters to achieve the desired qualities of the printed material. A significant benefit of this calibration method is its potential to create printing parameter profiles for new polymer and composite materials, thereby enhancing the versatility and application range of FDM 3D printing. The proposed method demonstrates significant potential in enhancing the precision and reliability of FDM 3D printing, making it a valuable contribution to the field of additive manufacturing.
Keywords: FDM 3D printing, preprocess calibration, thermal sensor, process optimization, additive manufacturing, computer vision, material profiles
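To make the "predetermined path with varying parameters" concrete, a calibration routine of this kind could emit g-code segments whose temperature and feed rate change stepwise, so each segment in the thermal recording maps to one parameter pair. The sketch below is our illustration under assumed ranges and Marlin-style commands (G1, M104), not the authors' routine:

```cpp
#include <cstdio>

// Illustrative sketch only: emits a straight calibration path in which the
// nozzle temperature and feed rate change per segment, so a thermal camera
// can correlate each segment with its printing parameters. The path,
// ranges, and step sizes are assumptions, not taken from the paper.
int main() {
    const double segment_mm = 20.0;      // assumed segment length
    const double extrude_per_mm = 0.05;  // assumed extrusion ratio
    double x = 0.0, e = 0.0;
    std::printf("G28\n");                // home all axes
    std::printf("G1 Z0.2 F600\n");       // assumed first-layer height
    for (int temp = 190; temp <= 230; temp += 10) {        // vary hotend temp (C)
        for (int feed = 1200; feed <= 2400; feed += 600) { // vary feed (mm/min)
            std::printf("M104 S%d\n", temp);               // set nozzle temperature
            x += segment_mm;
            e += segment_mm * extrude_per_mm;
            std::printf("G1 X%.2f E%.3f F%d\n", x, e, feed);
        }
    }
    std::printf("M104 S0\n");            // heater off
    return 0;
}
```

Note the design trade-off: M104 sets the temperature without waiting, so segments print while the hotend is still settling; a routine that needs stabilized temperatures per segment would use M109 instead, at the cost of pauses along the path.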
Procedia PDF Downloads 41
427 Systematic Review of Quantitative Risk Assessment Tools and Their Effect on Racial Disproportionality in Child Welfare Systems
Authors: Bronwen Wade
Abstract:
Over the last half-century, child welfare systems have increasingly relied on quantitative risk assessment tools, such as actuarial or predictive risk tools. These tools are developed by performing statistical analysis of how attributes captured in administrative data are related to future child maltreatment. Some scholars argue that attributes in administrative data can serve as proxies for race and that quantitative risk assessment tools reify racial bias in decision-making. Others argue that these tools provide more “objective” and “scientific” guides for decision-making instead of subjective social worker judgment. This study performs a systematic review of the literature on the impact of quantitative risk assessment tools on racial disproportionality; it examines methodological biases in work on this topic, summarizes key findings, and provides suggestions for further work. A search of CINAHL, PsycINFO, the ProQuest Social Science Premium Collection, and the ProQuest Dissertations and Theses Collection was performed. Academic and grey literature were included. The review includes studies that use quasi-experimental methods, as well as development, validation, or re-validation studies of quantitative risk assessment tools. PROBAST (Prediction model Risk of Bias Assessment Tool) and CHARMS (CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies) were used to assess the risk of bias and guide data extraction for risk development, validation, or re-validation studies. ROBINS-I (Risk of Bias in Non-Randomized Studies of Interventions) was used to assess for bias and guide data extraction for the quasi-experimental studies identified. Due to heterogeneity among papers, a meta-analysis was not feasible, and a narrative synthesis was conducted. Eleven papers met the eligibility criteria, and each has an overall high risk of bias based on the PROBAST and ROBINS-I assessments. This is deeply concerning, as major policy decisions have been made based on a limited number of studies with a high risk of bias. The findings on racial disproportionality have been mixed and depend on the tool and approach used. Authors use various definitions of racial equity, fairness, or disproportionality. These concepts of statistical fairness are connected to theories about the reason for racial disproportionality in child welfare, or to social definitions of fairness, that are usually not stated explicitly. Most findings from these studies are unreliable, given the high degree of bias. However, some of the less biased measures within studies suggest that quantitative risk assessment tools may worsen racial disproportionality, depending on how disproportionality is mathematically defined. Authors vary widely in their approach to defining and addressing racial disproportionality within studies, making it difficult to generalize findings or approaches across studies. This review demonstrates the power of authors to shape policy or discourse around racial justice based on their choice of statistical methods; it also demonstrates the need for improved rigor and transparency in studies of quantitative risk assessment tools. Finally, this review raises concerns about the impact that these tools have on child welfare systems and racial disproportionality.
Keywords: actuarial risk, child welfare, predictive risk, racial disproportionality
Procedia PDF Downloads 54
426 Impact of Audit Committee on Real Earnings Management: Cases of Netherlands
Authors: Sana Masmoudi Mardassi, Yosra Makni Fourati
Abstract:
Regulators highlight the importance of the Audit Committee (AC) as a key internal corporate governance mechanism. One of the most important roles of this committee is to oversee the financial reporting process. The purpose of this paper is to examine the link between the characteristics of an audit committee and financial reporting quality by investigating whether the characteristics of audit committees are associated with improved financial reporting quality, especially reduced Real Earnings Management. In the current study, panel data from 80 nonfinancial companies listed on the Amsterdam Stock Exchange during the period between 2010 and 2017 were used. To measure audit committee characteristics, four proxies have been used: audit committee independence, financial expertise, gender diversity, and AC meetings. A linear regression model was used to identify the influence of this set of audit committee characteristics on real earnings management after controlling for audit committee size, leverage, firm size, loss, growth, and board size. This research provides empirical evidence on the association between audit committee independence, financial expertise, gender diversity, and meetings and Real Earnings Management (REM) as a proxy of financial reporting quality. The study finds that independence and AC gender diversity are strongly related to financial reporting quality; in fact, these two characteristics constrain REM. The results also suggest that AC financial expertise reduces, to some extent, the likelihood of engaging in REM. These conclusions lend support to the audit committee requirements under the Dutch Corporate Governance Code regarding gender diversity and AC meetings.
Keywords: audit committee, financial expertise, independence, real earnings management
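An illustrative specification of the regression described above might read as follows (the variable names are ours, chosen for readability, not the paper's):

\[
REM_{it} = \beta_0 + \beta_1\,ACIND_{it} + \beta_2\,ACEXP_{it} + \beta_3\,ACGEND_{it} + \beta_4\,ACMEET_{it} + \gamma' \mathrm{Controls}_{it} + \varepsilon_{it},
\]

where the controls collect audit committee size, leverage, firm size, loss, growth, and board size, and a negative coefficient on a characteristic indicates that it constrains real earnings management.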
Procedia PDF Downloads 167
425 Earthquake Hazards in Manipur: Casual Factors and Remedial Measures
Authors: Kangujam Monika, Kiranbala Devi Thokchom, Soibam Sandhyarani Devi
Abstract:
Earthquake is a major natural hazard in India. Manipur, located in the North-Eastern Region of India, is one of the most earthquake-prone locations in the region, since it lies where the Indian and Eurasian tectonic plates meet, within seismic Zone V, the most severe intensity zone according to the IS code. Some recent earthquakes recorded in Manipur are the M 6.7 event with epicenter at Tamenglong (January 4, 2016), the M 5.2 event with epicenter at Churachandpur (February 24, 2017), and, most recently, the M 4.4 event with epicenter at Thoubal (June 19, 2017). In these recent earthquakes, some houses and buildings were damaged, and landslides also occurred. A field study was carried out. An overview of the various causal factors involved in triggering earthquake damage in Manipur is discussed. It is found that improper planning, poor design, negligence, structural irregularities, poor quality materials, construction of foundations without proper site soil investigation, non-implementation of remedial measures, etc., are possibly the main causal factors for damage in Manipur during earthquakes. The study also suggests that proper design of structures and foundations along with soil investigation, ground improvement methods, use of modern construction techniques, consultation with engineers, mass awareness, etc., might be effective solutions to control the hazard in many locations. An overview of the analysis pertaining to earthquakes in Manipur, together with an ongoing detailed site-specific geotechnical investigation, is presented.
Keywords: Manipur, earthquake, hazard, structure, soil
Procedia PDF Downloads 212
424 The Culture of Journal Writing among Manobo Senior High School Students
Authors: Jessevel Montes
Abstract:
This study explored the culture of journal writing among Senior High School Manobo students. The purpose of this qualitative morpho-semantic and syntactic study was to discover the morphological, semantic, and syntactic features of the students' written output through the morphological, semantic, and syntactic categories present in their journal writings. Beliefs and practices embedded in their norms, values, and ideologies were also identified. The study was conducted among Manobo students in the Senior High Schools of Central Mindanao, particularly in the Division of North Cotabato. Findings revealed that, morphologically, the features that flourished are the following: subject-verb concordance, tenses, pronouns, prepositions, articles, and the use of adjectives. Semantically, the features are the following: word choice, idiomatic expression, borrowing, and vernacular. Syntactically, the features are the types of sentences according to structure and function, and the dominance of code switching and run-on sentences. Lastly, as to the beliefs and practices embedded in the norms, values, and ideologies of their journal writing, the major themes are: valuing education, family, and friends as treasure; preservation of culture; and emancipation from the bondage of poverty. This study has shed light on the writing capabilities and weaknesses of the Manobo students when it comes to the English language. Further, such an insight into language learning problems is useful to teachers because it provides information on common trouble spots in language learning, which can be used in the preparation of effective teaching materials.
Keywords: applied linguistics, culture, morpho-semantic and syntactic analysis, Manobo Senior High School, Philippines
Procedia PDF Downloads 121
423 Modelling Interactions between Saturated and Unsaturated Zones by Hydrus 1D, Plain of Kairouan, Central Tunisia
Authors: Mariem Saadi, Sabri Kanzari, Adel Zghibi
Abstract:
In semi-arid areas like the Kairouan region, where constant irrigation with saline water and overuse of groundwater resources are common, the salinization of soils and aquifers has become an increasing concern. In this study, a methodology has been developed to evaluate the groundwater contamination risk based on the hydraulic properties of the unsaturated zone. Two soil profiles, one located in the north of the plain and another in the south (each 30 m deep), with different ranges of salinity and both characterized by direct recharge of the aquifer, were chosen. Simulations were conducted with the Hydrus-1D code using measured precipitation data for the period 1998-2003 and calculated evapotranspiration for both chosen profiles. Four combinations of initial conditions of water content and salt concentration were used in the simulation process in order to find the best match between simulated and measured values. The successful calibration of Hydrus-1D allowed the investigation of several scenarios in order to assess the contamination risk under different natural conditions. The aquifer contamination risk is related to the natural conditions: it increases under climate change and rising temperature, and it decreases in the presence of a clay layer in the unsaturated zone. Hydrus-1D proved a useful tool to predict the groundwater level and quality in the case of direct recharge and in the absence of any information about the soil layers except for texture.
Keywords: Hydrus-1D, Kairouan, salinization, semi-arid region, solute transport, unsaturated zone
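For context, Hydrus-1D simulates variably saturated flow and solute movement with the standard one-dimensional Richards and advection-dispersion equations (generic forms, stated here for orientation rather than quoted from the paper):

\[
\frac{\partial \theta(h)}{\partial t} = \frac{\partial}{\partial z}\!\left[K(h)\left(\frac{\partial h}{\partial z} + 1\right)\right] - S,
\qquad
\frac{\partial (\theta c)}{\partial t} = \frac{\partial}{\partial z}\!\left(\theta D \frac{\partial c}{\partial z}\right) - \frac{\partial (q c)}{\partial z},
\]

where \(h\) is the pressure head, \(\theta\) the water content, \(K(h)\) the unsaturated hydraulic conductivity, \(S\) a root-water-uptake sink, \(c\) the solute concentration, \(D\) the dispersion coefficient, and \(q\) the Darcian flux.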
Procedia PDF Downloads 183
422 The Formulation of the Mecelle and Other Codified Laws in the Ottoman Empire: Transformation Overturning the Sharia Principles
Authors: Tianqi Yin
Abstract:
The sharia had been the legislative basis of the Ottoman Empire since its emergence. The authority of sharia was superlative in Islamic society compared to the power of the sulta, the nominal ruler of the nation, regulating essentially every aspect of people's lives according to an ethical code. In modernity, however, as European sovereignty employed force to re-engineer the Islamic world to be more like its own, a society ruled by a state, the Ottoman legislative system faced the great challenge of adopting codified laws to replace sharia, with the formulation of the Mecelle being a prominent case. Interpretations of this transformation have been contentious, with the key debate revolving around whether these codified laws are authentic representations of sharia or alien legal formulations authorized by the modern nation-state under heavy European colonial influence. Because of the differences in methodology among the diverse theories, a universal conclusion on this issue remains elusive. This paper argues that the formulation of the Mecelle and other codified laws is a discontinuity of sharia caused by the influence of European modernity, and that the emphasis on elements of Islamic law is a tactic employed to promote this process. These codified laws signal a complete social transformation from an Islamic society ruled by sharia to a replication of the European society ruled by the comprehensive ruling system of the modern state. In addition to advancing the discussion on the characterization of the codification movement in the Ottoman Empire in modernity, the research also contributes to determining the nature of the modern codification movement globally.
Keywords: codification, Mecelle, modernity, sharia, Ottoman Empire
Procedia PDF Downloads 91
421 Timber Urbanism: Assessing the Carbon Footprint of Mass-Timber, Steel, and Concrete Structural Prototypes for Peri-Urban Densification in the Hudson Valley’s Urban Fringe
Authors: Eleni Stefania Kalapoda
Abstract:
The current fossil-fuel-based urbanization pattern and the estimated human population growth are increasing the environmental footprint on our planet's precious resources. To mitigate the projected surge in greenhouse gas emissions associated with the construction of new cities and infrastructure over the next 50 years, we need a radical rethink of our approach to construction to deliver a net-zero built environment. This paper assesses the carbon footprint of a mass-timber, a steel, and a concrete structural alternative for peri-urban densification in the Hudson Valley's urban fringe, along with examining the updated policy and building code adjustments that support synergies between timber construction in city making and sustainable management of timber forests. The carbon footprint of a structural prototype is quantified for four different material assemblies: a concrete (post-tensioned), a mass-timber, a steel (composite), and a hybrid (timber/steel/concrete) assembly, applicable to the three updated building typologies of the IBC 2021 (Type IV-A, Type IV-B, Type IV-C) and ranging from nine to eighteen stories. By scaling that structural prototype up to the size of a neighborhood district, the paper presents a quantitative and a qualitative approach for a forest-based construction economy, as well as a resilient and more just supply chain framework that ensures the well-being of both the forest and its inhabitants.
Keywords: mass-timber innovation, concrete structure, carbon footprint, densification
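Carbon footprint comparisons of this kind typically reduce to an embodied carbon sum of the form below (a standard life-cycle assessment relation, not a formula quoted from the paper):

\[
EC = \sum_i m_i \cdot EF_i,
\]

where \(m_i\) is the quantity of material \(i\) in the assembly (e.g., kg of concrete, steel, or timber) and \(EF_i\) its cradle-to-gate emission factor in kg CO2e per kg. Biogenic carbon stored in timber is usually reported separately, which is what drives the contrast among the four assemblies.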
Procedia PDF Downloads 108
420 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy to meet cost-reduction targets. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine, and it controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of nibble data. The tick period generator derives tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffers store transmitted/received data. The designed SENT interface can send or receive digital data at 25-65 kbps with a 3 µs tick. Synthesized in a 0.18 µm fabrication technology, it occupies about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, Verilog HDL
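As a rough illustration of the pulse-level encoding the frame encoder must produce, the sketch below converts nibbles into pulse widths using commonly cited SENT timing (a 56-tick sync pulse and nibble pulses of 12 + value ticks) and a 4-bit CRC with polynomial x⁴+x³+x²+1 and seed 0b0101. These constants are our reading of SAE J2716, not taken from the paper, and the actual interface is RTL, not software:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// 4-bit CRC over the data nibbles, polynomial x^4+x^3+x^2+1, seed 0b0101.
// Parameters are an assumption based on common SENT descriptions.
static uint8_t sent_crc4(const std::vector<uint8_t>& nibbles) {
    uint8_t crc = 0x5;
    for (uint8_t n : nibbles) {
        crc ^= (n & 0xF);
        for (int b = 0; b < 4; ++b)
            crc = (crc & 0x8) ? (((crc << 1) ^ 0x1D) & 0xF)
                              : ((crc << 1) & 0xF);
    }
    return crc;
}

// Convert status + data nibbles into a sequence of pulse widths in ticks:
// 56-tick sync, then (12 + value) ticks per nibble, then the CRC nibble.
static std::vector<int> sent_frame_ticks(uint8_t status,
                                         const std::vector<uint8_t>& data) {
    std::vector<int> ticks{56};               // sync/calibration pulse
    ticks.push_back(12 + (status & 0xF));     // status & communication nibble
    for (uint8_t n : data) ticks.push_back(12 + (n & 0xF));
    ticks.push_back(12 + sent_crc4(data));    // CRC nibble
    return ticks;
}

int main() {
    std::vector<uint8_t> data{0x3, 0xA, 0x7, 0x1, 0x2, 0x4}; // six data nibbles
    for (int t : sent_frame_ticks(/*status=*/0x0, data))
        std::printf("%d ", t);                // at a 3 us tick, 56 ticks = 168 us
    std::printf("\n");
    return 0;
}
```

Because each nibble carries 4 bits in a 12-27-tick pulse, the achievable data rate follows directly from the configured tick length, which is consistent with the 25-65 kbps range reported above.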
Procedia PDF Downloads 302
419 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor
Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira
Abstract:
Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, the installation of a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000 including manpower. A cost-effective alternative, however, is to leverage an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device. The NT facility offers a slightly colder thermal flux and provides shielding for user protection. The key additional requirement is to design detector shielding that mitigates the high gamma-ray background and safeguards the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation from nuclear fission. The aim is to achieve a focused prompt-gamma signal while shielding ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis
Procedia PDF Downloads 75
418 Towards Sustainable Evolution of Bioeconomy: The Role of Technology and Innovation Management
Authors: Ronald Orth, Johanna Haunschild, Sara Tsog
Abstract:
The bioeconomy is an inter- and cross-disciplinary field covering a large number and wide scope of existing and emerging technologies. It has great potential to contribute to the transformation of the industrial landscape and ultimately drive the economy towards sustainability. However, the bioeconomy per se is not necessarily sustainable, and technology should be seen as an enabler rather than a panacea for all our ecological, social, and economic issues. Therefore, to draw and maximize benefits from the bioeconomy in terms of sustainability, we propose that innovative activities should encompass not only novel technologies and bio-based new materials but also multifocal innovations. In multifocal innovation endeavors, innovation management plays a substantial role, as any innovation emerges in a complex iterative process in which communication and knowledge exchange among relevant stakeholders are pivotal. Although knowledge generation and innovation are at the core of the transition towards a more sustainable bio-based economy, to date there is a significant lack of concepts and models that approach the bioeconomy from an innovation management perspective. The aim of this paper is therefore two-fold. First, it inspects the role of a transformative approach in the adoption of a bioeconomy that contributes to environmental, ecological, social, and economic sustainability. Second, it elaborates the importance of technology and innovation management as a tool for a smooth, prompt, and effective transition of firms to the bioeconomy. We conduct a qualitative literature study on the sustainability challenges the bioeconomy entails, using the Science Citation Index and grey literature, as major economies (e.g., the EU, USA, China, and Brazil) have pledged to adopt the bioeconomy and have released extensive publications on the topic. We draw an example from the forest-based business sector, which is transforming towards the new green economy more rapidly than expected, although this sector has a long-established conventional business culture with a consolidated and fully fledged industry. Based on our analysis, we found that a successful transition to a sustainable bioeconomy is conditioned on heterogeneous and contested factors in terms of stakeholders, activities, and modes of innovation. In addition, multifocal innovations occur when actors from interdisciplinary fields engage in intensive and continuous interaction, where the focus of innovation is allocated to a field of mutually evolving socio-technical practices that correspond to the aims of the novel paradigm of transformative innovation policy. By adopting an integrated systems approach, as well as tapping into various innovation networks and joining global innovation clusters, firms have a better chance of creating entire new chains of value-added products and services. This requires professionals with capabilities and skills such as foresight for future markets, the ability to deal with complex issues, the ability to guide responsible R&D, the ability to make strategic decisions, and the ability to manage in-depth innovation system analyses, including value chain analysis. Policy makers, on the other hand, need to acknowledge the essential role of firms in the transformative innovation policy paradigm.
Keywords: bioeconomy, innovation and technology management, multifocal innovation, sustainability, transformative innovation policy
Procedia PDF Downloads 125
417 International Classification of Primary Care as a Reference for Coding the Demand for Care in Primary Health Care
Authors: Souhir Chelly, Chahida Harizi, Aicha Hechaichi, Sihem Aissaoui, Leila Ben Ayed, Maha Bergaoui, Mohamed Kouni Chahed
Abstract:
Introduction: The International Classification of Primary Care (ICPC) is part of the morbidity classification system. It has 17 chapters, and each reason for encounter is coded by an alphanumeric code: the letter corresponds to the chapter, and the number to a rubric within the chapter. The objective of this study is to show the utility of this classification in coding the reasons for demand for care in Primary Health Care (PHC), along with its advantages and limits. Methods: This is a cross-sectional descriptive study conducted in four PHC facilities in the Ariana district. Data on the demand for care during two days of the same week were collected. The coding of the information was done according to the ICPC. The data were entered and analyzed with the Epi Info 7 software. Results: A total of 523 demands for care were investigated. The patients who came for consultation were predominantly female (62.72%). Most of the consultants were young, with an average age of 35 ± 26 years. In the ICPC coding, 'infections' was the most common rubric with 49.9%, followed by 'other diagnoses' with 40.2%, 'symptoms and complaints' with 5.5%, 'trauma' with 2.1%, 'procedures' with 2.1%, and 'neoplasm' with 0.3%. The main advantage of the ICPC is that it is a standardized tool. It is well suited to classifying the reasons for demand for care in PHC according to their specificity and can be used in a computerized PHC medical file. Its current limitations are related to the difficulty of classifying some reasons for demand for care. Conclusion: The ICPC has been developed to provide primary health care with a coding reference that takes its specificity into account. The ICD, by comparison, is in its 10th revision; the ICPC would likewise gain, from revision to revision, in becoming efficient enough to be generalized and used by PHC teams.
Keywords: international classification of primary care, medical file, primary health care, Tunisia
Procedia PDF Downloads 267
416 Numerical Investigation of Pressure Drop and Erosion Wear by Computational Fluid Dynamics Simulation
Authors: Praveen Kumar, Nitin Kumar, Hemant Kumar
Abstract:
Advances in computer technology and commercial computational fluid dynamics (CFD) simulation now provide more detailed results than experimental investigation techniques. CFD techniques are widely used in different fields due to their flexibility and performance. Pipeline erosion is a complex phenomenon to evaluate by numerical arithmetic techniques, whereas CFD simulation is an accessible tool for this type of problem. Erosion wear behavior due to a solid-liquid mixture in a slurry pipeline has been investigated using the commercial CFD code FLUENT. A multiphase Euler-Lagrange model was adopted to predict solid-particle erosion wear in a 22.5° pipe bend for the flow of a bottom ash-water suspension. The present study addresses erosion prediction in a three-dimensional 22.5° pipe bend for two-phase (solid and liquid) flow using the finite volume method with the standard k-ε turbulence model and the discrete phase model, and evaluates the erosion wear rate for velocities varying from 2 to 4 m/s. The results show that the velocity of the solid-liquid mixture is the dominating parameter, compared to solid concentration, density, and particle size. At low velocity, settling takes place in the pipe bend due to the low inertia and the gravitational effect on the solid particles, which leads to high erosion on the bottom side of the pipeline.
Keywords: computational fluid dynamics (CFD), erosion, slurry transportation, k-ε model
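For orientation, the generic erosion-rate model available with FLUENT's discrete phase model is commonly written as follows (standard solver form; the particular functions and constants used in the study are not reproduced here):

\[
R_{\mathrm{erosion}} \;=\; \sum_{p=1}^{N_{\mathrm{particles}}} \frac{\dot m_p\, C(d_p)\, f(\alpha)\, v^{\,b(v)}}{A_{\mathrm{face}}},
\]

where \(\dot m_p\) is the particle mass flow rate, \(C(d_p)\) a particle-diameter function, \(f(\alpha)\) an impact-angle function, \(v\) the impact velocity with exponent \(b(v)\), and \(A_{\mathrm{face}}\) the area of the wall cell face. The velocity exponent is what makes mixture velocity dominate the predicted wear, consistent with the finding above.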
Procedia PDF Downloads 408
415 Perforation Analysis of the Aluminum Alloy Sheets Subjected to High Rate of Loading and Heated Using Thermal Chamber: Experimental and Numerical Approach
Authors: A. Bendarma, T. Jankowiak, A. Rusinek, T. Lodygowski, M. Klósak, S. Bouslikhane
Abstract:
An analysis of the mechanical characteristics and dynamic behavior of aluminum alloy sheets in perforation tests, based on experimental tests coupled with numerical simulation, is presented. The impact problems (penetration and perforation) of metallic plates have been of interest for a long time; experimental, analytical, and numerical studies have been carried out to analyze the perforation process in detail, and, based on these approaches, the ballistic properties of the material have been studied. A laser sensor measuring initial and residual velocities is used during the experiments to obtain the ballistic curve and the ballistic limit. The energy balance is also reported, together with the energy absorbed by the aluminum. A high-speed camera helps to estimate the failure time and to calculate the impact force. A wide range of initial impact velocities, from 40 up to 180 m/s, has been covered during the tests. The mass of the conical-nose-shaped projectile is 28 g, its diameter is 12 mm, and the thickness of the aluminum sheet is 1.0 mm. The ABAQUS/Explicit finite element code has been used to simulate the perforation processes. The ballistic curve obtained numerically was verified experimentally, and the failure patterns are presented using optimal mesh densities that ensure the stability of the results. A good agreement between the numerical and experimental results is observed.
Keywords: aluminum alloy, ballistic behavior, failure criterion, numerical simulation
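The energy balance referred to above is commonly evaluated from the measured velocities as (standard relation, stated here for clarity):

\[
E_{\mathrm{abs}} \;=\; \tfrac{1}{2}\, m_p \left(V_0^2 - V_r^2\right),
\]

where \(m_p\) is the projectile mass (28 g here), \(V_0\) the initial impact velocity, and \(V_r\) the residual velocity; at the ballistic limit \(V_r = 0\), so the plate absorbs the entire kinetic energy of the projectile.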
Procedia PDF Downloads 312
414 Utilizing Minecraft Java Edition for the Application of Fire Disaster Procedures to Establish Fire Disaster Readiness for Grade 12 STEM students of DLSU-IS
Authors: Aravella Flores, Jose Rafael E. Sotelo, Luis Romulus Phillippe R. Javier, Josh Christian V. Nunez
Abstract:
This study analyzed the performance of Grade 12 STEM students of De La Salle University - Integrated School who had completed the Disaster Readiness and Risk Reduction (DRRR) course in handling fire hazards through Minecraft Java Edition. This platform is suitable because fire DRRR is challenging to learn in a practical setting, and it is questionable whether textbook knowledge successfully translates into actual practice. The purpose of this study is to determine whether Minecraft can be a suitable environment for familiarizing oneself with fire DRRR. The objectives are achieved by utilizing Minecraft to simulate fire scenarios, which allows the participants to act freely and practice fire DRRR. The experiment was divided into a grounding phase and a validation phase, in which the researchers observed the performance of the participants in the simulation. Pre-simulation and post-simulation surveys were given to assess the change in participants' perception of their ability to apply fire DRRR procedures and of their vulnerabilities. A paired t-test showed significant differences between the pre-simulation and post-simulation survey scores, suggesting improved judgment of DRRR and lessened vulnerability in the event of a fire hazard. This research poses a model for future research that can gather more participants and dwell on more complex code, moving beyond command blocks and into the code of Minecraft itself.
Keywords: minecraft, DRRR, fire, disaster, simulation
Procedia PDF Downloads 137
413 Combined Civilian and Military Disaster Response: A Critical Analysis of the 2010 Haiti Earthquake Relief Effort
Authors: Matthew Arnaouti, Michael Baird, Gabrielle Cahill, Tamara Worlton, Michelle Joseph
Abstract:
Introduction: Over ten years after the magnitude-7.0 earthquake struck the capital of Haiti, impacting over three million people and leading to the deaths of over two hundred thousand, the multinational humanitarian response remains the largest disaster relief effort to date. This study critically evaluates the multi-sector and multinational disaster response to the earthquake and considers how the lessons learned from this analysis can be applied to future disaster response efforts. We put particular emphasis on assessing the interaction between civilian and military sectors during this humanitarian relief effort, with the hope of highlighting how concrete guidelines are essential to improving future responses. Methods: An extensive scoping review of the relevant literature was conducted, with library scientists performing reproducible, verified systematic searches of multiple databases. Grey literature and hand searches were used to identify additional unclassified military documents for inclusion in the study. More than 100 documents were included for data extraction and analysis. Key domains were identified, including Humanitarian and Military Response, Communication, Coordination, Resources, Needs Assessment, and Pre-Existing Policy. Corresponding information and lessons learned pertaining to these domains were then extracted, detailing the barriers to and facilitators of an effective response. Results: Multiple themes were noted that cut across all identified domains, including the lack of adequate pre-existing policy as well as extensive ambiguity about actors' roles. This ambiguity was continually influenced by the complex role the United States military played in the disaster response. At a deeper level, the effects of neo-colonialism and concern about infringements on Haitian sovereignty played a substantial role at all levels, setting the pre-existing conditions and determining the redevelopment efforts that followed. Furthermore, external factors significantly impacted the response, particularly the loss of life within the political and security sectors. This was compounded by the destruction of important infrastructure systems, particularly electricity supplies and telecommunication networks, as well as air and seaport capabilities. Conclusions: This study stands as one of the first and most comprehensive evaluations to systematically analyze the civilian and military response, including their collaborative efforts. It offers vital information for improving future combined responses and provides a significant opportunity for advancing knowledge in disaster relief efforts, which remains a more pressing issue than ever. The categories and domains formulated serve to highlight interdependent factors that should be considered in future disaster responses, with significant potential to aid the effective performance of humanitarian actors. Further studies will be grounded in these findings, particularly the need for greater inclusion of the Haitian perspective in the literature through additional qualitative research.
Keywords: civilian and military collaboration, combined response, disaster, disaster response, earthquake, Haiti, humanitarian response
Procedia PDF Downloads 127
412 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may have irregular, digit, or character shapes. Objects and their internal objects are quite difficult to extract when the structure of the image contains a bulk of clusters. Estimation results are readily obtained by identifying the sub-regional objects with the SASK algorithm. The main focus is to recognize the number of internal objects existing in a given image, in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and, finally, application of the hull detection system. Detecting the sub-regional hulls can increase machine learning capability in the detection of characters, and the approach can also be extended to hull recognition in irregularly shaped objects, such as intensity-based detection of black holes in space exploration. Layered hulls are those having structured layers inside; they are useful in military services and traffic monitoring to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the decision process (to clear traffic, or to identify the number of opposing persons in a war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
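The SASK algorithm itself is not reproduced here; as a point of reference for the same task, counting regions nested inside other regions, a plain contour-hierarchy baseline can be written with OpenCV as below (our generic illustration, not the authors' method):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

// Baseline illustration (not the SASK algorithm): count regions nested
// inside other regions by walking the contour hierarchy of a binarized
// handwritten image.
int main(int argc, char** argv) {
    if (argc < 2) { std::cerr << "usage: hulls <image>\n"; return 1; }
    cv::Mat img = cv::imread(argv[1], cv::IMREAD_GRAYSCALE);
    if (img.empty()) { std::cerr << "cannot read image\n"; return 1; }

    cv::Mat bin;
    cv::threshold(img, bin, 0, 255, cv::THRESH_BINARY_INV | cv::THRESH_OTSU);

    std::vector<std::vector<cv::Point>> contours;
    std::vector<cv::Vec4i> hierarchy;            // [next, prev, child, parent]
    cv::findContours(bin, contours, hierarchy, cv::RETR_TREE,
                     cv::CHAIN_APPROX_SIMPLE);

    std::size_t internal = 0;
    for (const auto& h : hierarchy)
        if (h[3] >= 0) ++internal;               // has a parent: nested region

    std::cout << "outer objects: " << contours.size() - internal
              << ", internal objects: " << internal << "\n";
    return 0;
}
```

Such a baseline breaks down exactly where the abstract says SASK is needed, namely when shadows, noise, or dense clusters merge regions, which is what the clustering steps above are meant to handle.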
Procedia PDF Downloads 349
411 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations
Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar
Abstract:
Fast simulations are critical in reducing time to market for CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip, and researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce cache misses, we use software optimization techniques such as removal of unused functions, loop interchange, and replacing the post-increment operator with the pre-increment operator for non-primitive data types. The cache misses were reduced by 18.52%, 5.34%, and 3.91% by employing these techniques, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0. The OpenMP programming model and SIMD are used for parallelizing and vectorizing the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
Keywords: cache behaviour, network-on-chip, performance profiling, vectorization
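The loop-interchange and pre-increment changes mentioned above look, in generic C++ (our illustration, not Booksim2.0 source), like this:

```cpp
#include <list>
#include <vector>
#include <cstddef>

// Illustrative only: generic examples of the source-level optimizations
// described above, not code taken from Booksim2.0.

// Loop interchange: with row-major storage, making the row index the outer
// loop walks memory contiguously and reduces cache misses compared with a
// column-outer traversal of the same data.
double sum_matrix(const std::vector<std::vector<double>>& m) {
    double s = 0.0;
    for (std::size_t i = 0; i < m.size(); ++i)         // rows outer
        for (std::size_t j = 0; j < m[i].size(); ++j)  // columns inner
            s += m[i][j];
    return s;
}

// Pre-increment on non-primitive types: ++it mutates the iterator in
// place, whereas it++ must construct and return a temporary copy.
double sum_list(const std::list<double>& flits) {
    double s = 0.0;
    for (std::list<double>::const_iterator it = flits.begin();
         it != flits.end(); ++it)                      // ++it, not it++
        s += *it;
    return s;
}
```

Neither change alters program behavior, which is why both can be applied mechanically across a hotspot once profiling has located it.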
Procedia PDF Downloads 197
410 The Influence of Immunity on the Behavior and Dignity of Judges
Authors: D. Avnieli
Abstract:
Immunity of judges from liability represents a departure from the principle that all are equal under the law and that victims may be granted compensation from their offenders. The purpose of the study is to determine whether judicial immunity coincides with the need to ensure the existence of a highly independent and incorruptible judiciary. Judges are immune from civil and criminal liability for their judicial acts. Judicial immunity is justified by the need to maintain the complete independence and discretion of the judiciary. Scholars and judges believe that absolute immunity is needed to shield judges from pressures, threats, or outside interference. It is commonly accepted that judges should be free to perform their judicial role in accordance with their assessment of the facts and their understanding of the law, without any restrictions, influences, inducements, or interferences. In most countries, immunity applies when judges act in excess of jurisdiction; in some countries, it applies even when they act maliciously or corruptly. The only exception to absolute immunity applicable in all judicial systems is when judges act without jurisdiction over the subject matter. The Israeli Supreme Court recently decided to embrace absolute immunity and strike out a lawsuit of a refugee who was unlawfully incarcerated. The Court ruled that the plaintiff cannot sue the State or the judge for damages. The questions of malice, dignity, and public scrutiny were not discussed. This paper, based on a comparative analysis of many cases, aims to determine whether immunity affects the dignity and behavior of judges. It demonstrates that most judges maintain their dignity and ethical code of behavior, but some do not hesitate to act consciously in excess of jurisdiction, and in rare cases even corruptly. Therefore, in order to maintain an independent and incorruptible judiciary, immunity should not be applied where judges act consciously in excess of jurisdiction or with malicious incentives.
Keywords: incorruptible judiciary, immunity, independent, judicial, judges, jurisdiction
Procedia PDF Downloads 105
409 Rainwater Management: A Case Study of Residential Reconstruction of Cultural Heritage Buildings in Russia
Authors: V. Vsevolozhskaia
Abstract:
Since 1990, energy-efficient development concepts have constituted both a turning point in civil engineering and a challenge for an environmentally friendly future. Energy and water currently play an essential role in the sustainable economic growth of the world in general and Russia in particular: the efficiency of the water supply system is the second most important parameter for energy consumption according to the British assessment method, and the water-energy nexus has been identified as a focus for accelerating sustainable growth and developing effective, innovative solutions. The activities considered in this study were aimed at organizing and executing the renovation of residential buildings located in St. Petersburg, specifically buildings with local or federal historical heritage status under the control of the St. Petersburg Committee for the State Inspection and Protection of Historic and Cultural Monuments (KGIOP) and UNESCO. Even after reconstruction, these buildings still fall into energy efficiency class D. Russian Government Resolution No. 87 on the structure and required content of project documentation contains a section entitled 'Measures to ensure compliance with energy efficiency and equipment requirements for buildings, structures, and constructions with energy metering devices'. It mentions the need to install collectors and meters, which only measure energy, neglecting the main purpose: to make buildings more energy-efficient, potentially even reaching energy efficiency class A. The least-explored aspects of energy-efficient technology in the Russian Federation remain the water balance and the possibility of implementing rain and meltwater collection systems. These modern technologies are applied exclusively to new buildings due to the lack of a government directive requiring project documentation for major renovations and reconstructions to include the collection and reuse of rainwater. Energy-efficient technology for rain- and meltwater collection is thus currently used only for new buildings, even though research has proved that using rainwater is safe and offers a huge step forward in terms of eco-efficiency analysis and water innovation. Where conservation is mandatory, making changes to protected sites is prohibited. In most cases, the protected site is the cultural heritage building itself, including the main walls and roof; however, the installation of a second water supply system and the collection of rainwater would not affect the protected building itself. Water efficiency in St. Petersburg is currently considered only from the point of view of installing devices that regulate the flow through pipeline shutoff valves. The development of technical guidelines for the use of grey- and/or rainwater to meet the needs of residential buildings during reconstruction or renovation is not yet complete. The ideas for water treatment, collection, and distribution systems presented in this study should be taken into consideration during the reconstruction or renovation of residential cultural heritage buildings under the protection of KGIOP and UNESCO. The methodology applied also has the potential to be extended to other cultural heritage sites in northern countries and lands with an average annual rainfall of over 600 mm to cover average toilet-flush needs.
Keywords: cultural heritage, energy efficiency, renovation, rainwater collection, reconstruction, water management, water supply
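The harvestable volume behind the 600 mm figure follows the usual sizing relation (a standard rainwater-harvesting formula; the roof area and runoff coefficient below are illustrative assumptions, not values from the study):

\[
V = A \cdot R \cdot C_r,
\]

where \(A\) is the collection (roof) area, \(R\) the annual rainfall depth, and \(C_r\) a runoff coefficient. For example, a 500 m² roof with \(R = 0.6\) m and \(C_r = 0.8\) would yield \(V = 500 \times 0.6 \times 0.8 = 240\) m³ per year for non-potable uses such as toilet flushing.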
Procedia PDF Downloads 92
408 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects
Authors: Ma Yuzhe, Burra Venkata Durga Kumar
Abstract:
The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is rising to handle more complex processing and operations. However, the development of the operating system, which provides the soul of the computer, stalled for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up; the Disk Operating System was too simple and left little room for innovation, so it was not a good choice either; and macOS is a special operating system for Apple computers that cannot be widely used on personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of these operating systems and is composed of modular components around its kernel, which makes the core architecture relatively powerful. Linux supports all Internet protocols, so it has very good network functionality. Linux supports multiple users, and users have no influence on each other's files. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system: users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is constantly upgraded and improved, and many different versions have been issued, suitable for community use and commercial use. The Linux system has good security because it relies on a file permission system. However, as vulnerabilities and hazards are constantly discovered, the security of the operating system in use also needs more attention. This article focuses on the analysis and discussion of Linux security issues.
Keywords: Linux, operating system, system management, security
Procedia PDF Downloads 108
407 Herbal Medicinal Materials for Health/Functional Foods in Korea
Authors: Chang-Hwan Oh, Young-Jong Lee
Abstract:
In April 2015, the Ministry of Food and Drug Safety announced that only 10 of the 207 products listing Cynanchum Wilfordii Radix among their ingredients were confirmed to actually contain it rather than 'iyeobupiso', the counterfeit version of 'baeksuo'; this raised a fog of confusion among consumers who had purchased health/functional foods supposedly containing the herbal medicinal material known as 'baeksuo' in Korean. 'Baeksuo' is the main ingredient of the product 'EstroG-100' (NaturalEndoTech, South Korea), which also contains Phlomis umbrosa and Angelica gigas. The hot-water extract of these herbal medicinal materials (HMMs) was approved as a product-specific Health/Functional Food (HFF) with a helpful function for women reaching menopause by the Korea Food & Drug Administration (at present, the Ministry of Food & Drug Safety). The origin of 'baeksuo' is the root of Cynanchum wilfordii Hemsley in Korea (whereas in China the root of Cynanchum auriculatum Royle ex Wight is considered the origin of 'baeksuo'). In Korea, about 116 HMMs are listed as food materials in the Korea Food Code among the 187 HMMs in total that can be used for food and medicinal purposes simultaneously. However, there is some chance that HMMs could be misused, whether through a non-permitted plant part or through HMMs not permitted for HFFs, as in the 'baeksuo' case. In this study, some of the HMMs with shared use for food and medicinal purposes are examined to reduce the chance of HMM misuse in HFFs in Korea. For the purpose of this study, the origin, shape, edible parts, efficacy, and side effects of similar HMMs liable to be misused in HFFs are investigated.
Keywords: herbal medicinal materials, healthy/functional foods, misuse, shared use
Procedia PDF Downloads 291
406 Pre and Post IFRS Loss Avoidance in France and the United Kingdom
Authors: T. Miková
Abstract:
This paper analyzes the effect of a single uniform accounting rule on reporting quality by investigating the influence of IFRS on earnings management. It examines whether earnings management is reduced after IFRS adoption through the use of 'loss avoidance thresholds', a method that has been verified in earlier studies. The paper concentrates on two European countries: one that represents the continental code-law tradition with weak protection of investors (France) and one that represents the Anglo-American common-law tradition, which typically implies a strong enforcement system (the United Kingdom). The research investigates a sample of 526 companies (6,822 firm-year observations) during the years 2000-2013. The results are different for the two jurisdictions. This study demonstrates that a single set of accounting standards contributes to better reporting quality and reduces the pervasiveness of earnings management in France. In contrast, there is no evidence that a reduction in earnings management followed the implementation of IFRS in the United Kingdom. Given that IFRS benefit France but not the United Kingdom, other political and economic factors, such as the legal system or capital market strength, must play a significant role in influencing the comparability and transparency of cross-border companies' financial statements. Overall, the results suggest that IFRS moderately contribute to the accounting quality of reported financial statements and bring benefits for stakeholders, though the role played by other economic factors cannot be discounted.
Keywords: accounting standards, earnings management, international financial reporting standards, loss avoidance, reporting quality
Procedia PDF Downloads 198
405 Application of Finite Volume Method for Numerical Simulation of Contaminant Transfer in a Two-Dimensional Reservoir
Authors: Atousa Ataieyan, Salvador A. Gomez-Lopera, Gennaro Sepede
Abstract:
Today, due to the growing urban population and, consequently, the increasing water demand in cities, the amount of contaminants entering water resources is increasing. This can impose harmful effects on the quality of the downstream water. Therefore, predicting the concentration of discharged pollutants at different times and distances in the area of interest is of high importance for carrying out preventive and control measures, as well as for avoiding consumption of contaminated water. In this paper, the concentration distribution of an injected conservative pollutant in a square reservoir containing four symmetric blocks and three sources is simulated using the Finite Volume Method (FVM). For this purpose, after estimating the flow velocity, the classical Advection-Diffusion Equation (ADE) was discretized over the study domain with the Backward Time-Backward Space (BTBS) scheme. The discretized equations for each node were then derived according to the initial condition, the boundary conditions, and the point contaminant sources. Finally, using appropriate time and space steps, a computational code was set up in MATLAB, and the contaminant concentration was obtained at different times and distances. Simulation results show that the BTBS differencing scheme and the FVM are an appropriate numerical approach for solving the partial differential equation of transport in the case of two-dimensional contaminant transfer in an advective-diffusive flow.
Keywords: BTBS differencing scheme, contaminant concentration, finite volume, mass transfer, water pollution
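In one dimension, the BTBS discretization of the ADE as we read it (implicit in time, backward/upwind in space for advection, with the diffusive term in its customary central form) is:

\[
\frac{C_i^{\,n} - C_i^{\,n-1}}{\Delta t} \;+\; u\,\frac{C_i^{\,n} - C_{i-1}^{\,n}}{\Delta x} \;=\; D\,\frac{C_{i+1}^{\,n} - 2C_i^{\,n} + C_{i-1}^{\,n}}{\Delta x^2},
\]

where \(C_i^{\,n}\) is the concentration at node \(i\) and time level \(n\), \(u\) the flow velocity, and \(D\) the diffusion coefficient. Writing all spatial terms at level \(n\) gives an implicit system per time step, which is what lends the scheme its robustness to step-size choice; the two-dimensional form used in the paper adds the analogous \(y\)-direction terms.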
Procedia PDF Downloads 135
404 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues; unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
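Signal-based crash reporting of the kind described can be sketched as follows (a minimal POSIX/glibc illustration of the idea, not the actual DAQ Debugger code; backtrace is glibc-specific):

```cpp
#include <csignal>
#include <execinfo.h>
#include <fcntl.h>
#include <unistd.h>
#include <initializer_list>

// Illustrative sketch: on a fatal signal, dump a stack trace to a report
// file, then restore the default action and re-raise the signal so the
// process still terminates normally for the supervisor to notice.
static void crash_handler(int sig) {
    int fd = open("daq_crash_report.txt", O_CREAT | O_WRONLY | O_APPEND, 0644);
    if (fd >= 0) {
        void* frames[64];
        int n = backtrace(frames, 64);
        backtrace_symbols_fd(frames, n, fd);  // symbolized stack trace
        close(fd);
    }
    signal(sig, SIG_DFL);  // default action, then re-raise
    raise(sig);
}

int main() {
    for (int sig : {SIGSEGV, SIGABRT, SIGFPE, SIGBUS})
        signal(sig, crash_handler);
    // ... normal process work; a crash now leaves daq_crash_report.txt ...
    return 0;
}
```

Because the handler is installed once and does nothing until a signal arrives, it adds no overhead on the hot path, which matches the "no impact on process performance" property claimed above.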
Procedia PDF Downloads 284
403 Large Eddy Simulation with Energy-Conserving Schemes: Understanding Wind Farm Aerodynamics
Authors: Dhruv Mehta, Alexander van Zuijlen, Hester Bijl
Abstract:
Large Eddy Simulation (LES) numerically resolves the large, energy-containing eddies of a turbulent flow while modelling the small, dissipative eddies. On a wind farm, these large scales carry the energy that wind turbines extract and are also responsible for transporting the turbines' wakes, which may interact with downstream turbines and certainly with the atmospheric boundary layer (ABL). In this situation, it is important to conserve the energy that these wakes carry, which could be altered artificially through numerical dissipation introduced by the schemes used for spatial discretisation and temporal integration. Numerical dissipation has been reported to cause premature recovery of turbine wakes, leading to an overprediction of the power produced by wind farms. An energy-conserving scheme is free from numerical dissipation and ensures that the energy of the wakes is increased or decreased only by the action of molecular viscosity or the action of wind turbines (body forces). The aim is to create an LES package with energy-conserving schemes to simulate wind turbine wakes correctly and so gain insight into power production, wake meandering, etc. Such knowledge will be useful in designing more efficient wind farms with minimal wake interaction, which, if unchecked, could lead to major losses in energy production per unit area of the wind farm. For their research, the authors intend to use the Energy-Conserving Navier-Stokes code developed by the Energy Research Centre of the Netherlands.
Keywords: energy-conserving schemes, modelling turbulence, Large Eddy Simulation, atmospheric boundary layer
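The defining property of such schemes can be stated compactly (our summary of the standard discrete kinetic-energy argument, not an equation from the abstract): a skew-symmetric discretization of convection contributes nothing to the kinetic-energy balance, so

\[
\frac{d}{dt}\,\|u_h\|^2 \;=\; -\,2\nu\,\|\nabla_h u_h\|^2 \;+\; 2\,(u_h, f_h),
\]

i.e., the resolved kinetic energy \(\|u_h\|^2\) changes only through viscous dissipation and the work of the body forces \(f_h\) (here, the turbine forcing), exactly the two physical mechanisms listed above; any additional decay in a simulation would be numerical dissipation.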
Procedia PDF Downloads 465
402 Environmental Effect on Corrosion Fatigue Behaviors of Steam Generator Forging in Simulated Pressurized Water Reactor Environment
Authors: Yakui Bai, Chen Sun, Ke Wang
Abstract:
An experimental investigation of the environmental effect on the fatigue behavior of the SA508 Gr.3 Cl.2 steam generator forging for the CAP1400 nuclear power plant has been carried out. In order to simulate actual loading conditions, a range of strain amplitudes was applied in different low cycle fatigue (LCF) tests at a strain rate of 0.01% s⁻¹. The current American Society of Mechanical Engineers (ASME) design fatigue code does not take full account of the interactions of environmental, loading, and material factors. A design fatigue model was constructed by taking environmentally assisted fatigue effects into account, and the corresponding design curves were given for the convenience of engineering applications. The corrosion fatigue experiments were performed in strain-control mode in a 320 °C borated and lithiated water environment to evaluate the effects of the mixed environment on fatigue life. Stress corrosion cracking (SCC) in the large steam generator forging in the primary water of a pressurized water reactor was also observed. In addition, it is found that the corrosion fatigue (CF) life of SA508 Gr.3 Cl.2 decreases with increasing temperature in the water environment; the relationship between the reciprocal of temperature and the logarithm of fatigue life was found to be linear. Through experiments and subsequent analysis, the mechanisms of reduced low cycle fatigue life have been investigated for steam generator forgings.
Keywords: failure behavior, low alloy steel, steam generator forging, stress corrosion cracking
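The reported linearity between reciprocal temperature and the logarithm of fatigue life corresponds to an Arrhenius-type relation (a generic form consistent with the stated observation; the fitted constants are not given here):

\[
\ln N_f \;=\; A \;+\; \frac{B}{T},
\]

where \(N_f\) is the number of cycles to failure, \(T\) the absolute temperature, and \(A\), \(B\) fitting constants; with \(B > 0\), life falls as temperature rises, matching the trend reported above.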
Procedia PDF Downloads 125