Search results for: spiral model
14478 Mastering Digitization: A Quality-Adapted Digital Transformation Model
Authors: Franziska Schaefer, Marlene Kuhn, Heiner Otten
Abstract:
In the very near future, digitization will be the main challenge a company has to master to survive in a highly competitive market. Developing the right transformation strategy by considering all relevant aspects determines the success or failure of a company. The digital focus on the customer in particular plays a key role in creating sustainable competitive advantages and also leads to new tasks within quality management. Therefore, quality management needs to be particularly addressed to support the upcoming digital change. In this paper, we present an analysis of existing digital transformation approaches and derive a transformation strategy from a quality management perspective. We identify and classify different transformation dimensions and assess their relevance to quality management tasks, resulting in a quality-adapted digital transformation model. Furthermore, we introduce applicable and customized quality management methods to support the presented digital transformation tasks. With our model, we provide a digital transformation guideline from a quality perspective for mastering future disruptive changes.
Keywords: digital transformation, digitization, quality management, strategy
Procedia PDF Downloads 476
14477 Health Risk Assessment of Exposing to Benzene in Office Building around a Chemical Industry Based on Numerical Simulation
Authors: Majid Bayatian, Mohammadreza Ashouri
Abstract:
The release of hazardous chemicals is one of the major problems for office buildings in the chemical industry, and environmental risks are therefore inherent to these environments. The adverse health effects of airborne benzene concentrations have been a matter of significant concern, especially in oil refineries. The chronic and acute adverse health effects caused by benzene exposure have attracted wide attention. Acute exposure to benzene through inhalation can cause headaches, dizziness, drowsiness, and irritation of the skin. Chronic exposure has been reported to cause aplastic anemia and leukemia in occupational settings. The association between chronic occupational exposure to benzene and the development of aplastic anemia and leukemia has been documented by several epidemiological studies. Numerous research works have investigated benzene emissions, determined benzene concentrations at different locations of refinery plants, and reported considerable health risks. The high cost of industrial control measures requires justification through lifetime health risk assessment of exposed workers and the public. In the present study, a Computational Fluid Dynamics (CFD) model is proposed to assess the exposure risk of an office building near a refinery due to the refinery's release of benzene. For simulation, GAMBIT, FLUENT, and CFD-Post were used as pre-processor, processor, and post-processor, and the model was validated by comparison with experimental results for benzene concentration and wind speed. The validation results showed good agreement, so the model can be used for health risk assessment. The simulation and risk assessment results showed that benzene can disperse to a nearby office building and that the exposure risk is unacceptable. According to the results of this study, a validated CFD model can be very useful for decision-makers when selecting control measures and can support emergency planning for probable accidents. The model can also be used to assess exposure in various types of accidents as well as to other pollutants, such as toluene, xylene, and ethylbenzene, under different atmospheric conditions.
Keywords: health risk assessment, office building, benzene, numerical simulation, CFD
Procedia PDF Downloads 126
14476 Optimization of a Convolutional Neural Network for the Automated Diagnosis of Melanoma
Authors: Kemka C. Ihemelandu, Chukwuemeka U. Ihemelandu
Abstract:
The incidence of melanoma has been increasing rapidly over the past two decades, making melanoma a current public health crisis. Unfortunately, even as screening efforts continue to expand in an effort to ameliorate the death rate from melanoma, there is a need to improve diagnostic accuracy to decrease misdiagnosis. Artificial intelligence (AI), a new frontier in patient care, has the ability to improve the accuracy of melanoma diagnosis. The convolutional neural network (CNN), a form of deep neural network most commonly applied to analyze visual imagery, has been shown to outperform the human brain in pattern recognition. However, there are noted limitations in the accuracy of CNN models. Our aim in this study was the optimization of convolutional neural network algorithms for the automated diagnosis of melanoma. We hypothesized that optimal selection of the momentum and batch hyperparameters increases model accuracy. Our most successful model showed that a momentum of 0.25 and a batch size of 2 led to superior performance and a faster model training time, with an accuracy of approximately 83% after nine hours of training. We did notice a lack of diversity in the dataset used, with a noted class imbalance favoring lighter vs. darker skin tones. Training set image transformations did not result in superior model performance in our study.
Keywords: melanoma, convolutional neural network, momentum, batch hyperparameter
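A minimal sketch of how the two tuned hyperparameters could be wired into a training setup, assuming a TensorFlow/Keras framework (the abstract does not name one); the architecture, image size, and learning rate below are placeholder assumptions, not the authors' model:

```python
# Hedged sketch: illustrates the momentum (0.25) and batch size (2) reported in the
# abstract. The CNN layers, input shape and learning rate are illustrative assumptions.
import tensorflow as tf

def build_model(input_shape=(224, 224, 3)):
    # Small placeholder CNN for binary melanoma vs. benign classification.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

model = build_model()
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=1e-3, momentum=0.25),  # momentum from the abstract
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
# train_images / train_labels would come from a dermoscopic image dataset (assumed, not provided):
# model.fit(train_images, train_labels, batch_size=2, epochs=50, validation_split=0.2)
```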
Procedia PDF Downloads 99
14475 A Stochastic Model to Predict Earthquake Ground Motion Duration Recorded in Soft Soils Based on Nonlinear Regression
Authors: Issam Aouari, Abdelmalek Abdelhamid
Abstract:
For seismologists, the characterization of seismic demand should include both the amplitude and the duration of strong shaking in the system. The duration of ground shaking is one of the key parameters in the earthquake-resistant design of structures. This paper proposes a nonlinear statistical model to estimate earthquake ground motion duration in soft soils using multiple seismicity indicators. Three definitions of ground motion duration proposed in the literature have been applied. Through a comparative study, we select the most significant definition to use for predicting the duration. A stochastic model is presented for the McCann and Shah method using nonlinear regression analysis based on a data set of moment magnitude, source-to-site distance, and site conditions. The data set is taken from the PEER strong motion databank and contains shallow earthquakes from different regions of the world: America, Turkey, London, China, Italy, Chile, Mexico, etc. Main emphasis is placed on soft site conditions. The predictive relationship has been developed based on 600 records and three input indicators. Results have been compared with other published models. It has been found that the proposed model can predict earthquake ground motion duration in soft soils for different regions and site conditions.
Keywords: duration, earthquake, prediction, regression, soft soil
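A minimal sketch of this kind of nonlinear regression with SciPy, assuming an illustrative functional form in magnitude and distance; the coefficients and data below are synthetic placeholders, not the paper's fitted model or the PEER records:

```python
# Hedged sketch: nonlinear regression of duration on moment magnitude (Mw) and
# source-to-site distance (R). Functional form and synthetic data are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def duration_model(X, a, b, c):
    Mw, R = X
    # Assumed multiplicative form: D = a * exp(b * Mw) * R ** c
    return a * np.exp(b * Mw) * R ** c

# Placeholder records standing in for the ~600 soft-soil records
rng = np.random.default_rng(0)
Mw = rng.uniform(5.0, 7.5, 50)
R = rng.uniform(10.0, 150.0, 50)                                            # km
D_obs = duration_model((Mw, R), 0.2, 0.7, 0.3) * rng.normal(1.0, 0.1, 50)   # synthetic durations, s

popt, _ = curve_fit(duration_model, (Mw, R), D_obs, p0=[0.1, 0.5, 0.2])
print("fitted coefficients (a, b, c):", popt.round(3))
```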
Procedia PDF Downloads 151
14474 Towards a Complete Automation Feature Recognition System for Sheet Metal Manufacturing
Authors: Bahaa Eltahawy, Mikko Ylihärsilä, Reino Virrankoski, Esko Petäjä
Abstract:
Sheet metal processing is automated, but the step from product models to production machine control still requires human intervention. This may cause time-consuming bottlenecks in the production process and increase the risk of human errors. In this paper we present a system which automatically recognizes features from the CAD model of a sheet metal product. Using these features, the system produces a complete model of the particular sheet metal product. The model is then used as an input for the sheet metal processing machine. Currently the system is implemented and capable of recognizing more than 11 of the most common sheet metal structural features, and the procedure is fully automated. This provides remarkable savings in production time and protects against human errors. This paper presents the developed system architecture, the applied algorithms, and the system software implementation and testing.
Keywords: feature recognition, automation, sheet metal manufacturing, CAD, CAM
Procedia PDF Downloads 353
14473 Experimental and Computational Investigations of Baffle Position Effects on the Performance of Oil and Water Separator Tanks
Authors: Haitham A. Hussein, Rozi Abdullah, Md Azlin Md Said
Abstract:
Gravity separator tanks are used to separate oil from water in treatment units. Achieving the best flow uniformity in a separator tank will improve the maximum removal efficiency of oil globules from water. In this study, the effect of different baffle structure positions inside a tank on hydraulic performance was investigated. Experimental data and 2D computational fluid dynamics were used for the analysis. In the numerical model, two-phase flow (drift flux model) was used to validate the one-phase flow. For the laboratory measurements, the velocity fields were measured using an acoustic Doppler velocimeter. The measurements were compared with the results of the computational model. The results of the experimental and computational simulations indicate that the best location of a baffle structure is achieved when the standard deviation of the velocity profile and the volume of the circulation zone inside the tank are minimized.
Keywords: gravity separator tanks, CFD, baffle position, two-phase flow, ADV, oil droplet
Procedia PDF Downloads 326
14472 Seismic Behavior of Pile-Supported Bridges Considering Soil-Structure Interaction and Structural Non-Linearity
Authors: Muhammad Tariq A. Chaudhary
Abstract:
Soil-structure interaction (SSI) in bridges under seismic excitation is a complex phenomenon which involves coupling between the non-linear behavior of bridge pier columns and SSI in the soil-foundation part. It is common practice in the study of SSI to model the bridge piers as linear elastic while treating the soil and foundation with a non-linear or an equivalent linear modeling approach. Consequently, the contribution of soil and foundation to the SSI phenomenon is disproportionately highlighted. The present study considered the non-linear behavior of bridge piers in an FEM model of a 4-span, pile-supported bridge that was designed for five different soil conditions in a moderate seismic zone. The FEM model of the bridge system was subjected to a suite of 21 actual ground motions representative of three levels of earthquake hazard (i.e., Design Basis Earthquake, Functional Evaluation Earthquake, and Maximum Considered Earthquake). Results of the FEM analysis were used to delineate the influence of pier column non-linearity and SSI on critical design parameters of the bridge system. It was found that pier column non-linearity influenced the bridge lateral displacement and base shear more than SSI for the majority of the analysis cases for the class of bridge investigated in the study.
Keywords: bridge, FEM model, reinforced concrete pier, pile foundation, seismic loading, soil-structure interaction
Procedia PDF Downloads 231
14471 Using Confirmatory Factor Analysis to Test the Dimensional Structure of Tourism Service Quality
Authors: Ibrahim A. Elshaer, Alaa M. Shaker
Abstract:
Several previous empirical studies have operationalized service quality as either a multidimensional or a unidimensional construct. While a few earlier studies investigated some aspects of the assumed dimensional structure of service quality, no study has been found that tested the construct's dimensionality using confirmatory factor analysis (CFA). To gain better insight into the dimensional structure of the service quality construct, this paper tests its dimensionality using three CFA models (a higher-order factor model, an oblique factor model, and a one-factor model) on a set of data collected from 390 British tourists who visited Egypt. The results of the three tested models indicate that the service quality construct is multidimensional. This result helps resolve the problems that might arise from the lack of clarity concerning the dimensional structure of service quality, as without testing the dimensional structure of a measure, researchers cannot assume that a significant correlation is the result of factors measuring the same construct.
Keywords: service quality, dimensionality, confirmatory factor analysis, Egypt
Procedia PDF Downloads 589
14470 Effective Validation Model and Use of Mobile-Health Apps for Elderly People
Authors: Leonardo Ramirez Lopez, Edward Guillen Pinto, Carlos Ramos Linares
Abstract:
The controversy brought about by the increasing use of mHealth apps and their effectiveness for disease prevention and diagnosis calls for immediate control. Although a critical topic in research areas such as medicine, engineering, and economics, among others, this issue lacks reliable implementation models. However, projects such as the Open Web Application Security Project (OWASP) and various studies have helped to create useful and reliable apps. This research is conducted under a quality model to optimize two mHealth apps for older adults. The analysis of results on the use of two physical activity monitoring apps, AcTiv (physical activity) and SMCa (energy expenditure), is positive and ideal. Through a theoretical and practical analysis, precision calculations and personal information control of older adults for disease prevention and diagnosis were performed. Finally, the apps are validated by a physician and, as a result, may be used as health monitoring tools in physical performance centers or any other physical activity setting. The results obtained provide an effective validation model for this type of mobile app, which, in turn, may be applied by other software developers who, along with medical staff, would offer digital healthcare tools for elderly people.
Keywords: model, validation, effective, healthcare, elderly people, mobile app
Procedia PDF Downloads 217
14469 Authoring of Augmented Reality Manuals for Not Physically Available Products
Authors: Vito M. Manghisi, Michele Gattullo, Alessandro Evangelista, Enricoandrea Laviola
Abstract:
In this work, we compared two solutions for displaying a demo version of an Augmented Reality (AR) manual when the real product is not available, opting to replace it with its computer-aided design (CAD) model. AR has been proven effective in maintenance and assembly operations by many studies in the literature. However, most of them present solutions for existing products, usually converting old printed manuals into AR manuals. In this case, authoring consists of defining how to convey the existing instructions through AR. It is not a simple choice, and demo versions are created to test the quality of the design. However, this becomes impossible when the product is not physically available, as for new products. A solution could be creating an entirely virtual environment with the product and the instructions. However, in this way, user interaction is completely different from that in the real application, so it would be hard to test the usability of the AR manual. This work aims to propose and compare two different solutions for displaying a demo version of an AR manual to support authoring when the product is not physically available. As a case study, we used an innovative semi-hermetic compressor that has not yet been produced. The applications were developed for a handheld device using Unity 3D. The main issue was how to show the compressor and attach instructions to it. In one approach, we used Vuforia natural feature tracking to attach a CAD model of the compressor to a 2D image that is a 1:1 scale drawing of the top view of the CAD model. In this way, during the AR manual demonstration, the 3D model of the compressor is displayed on the user's device in place of the real compressor, and all the virtual instructions are attached to it. In the other approach, we first created a support application that shows the CAD model of the compressor on a marker. Then, we recorded a video of this application while moving around the marker, obtaining a video that shows the CAD model from every point of view. For the AR manual, we used the Vuforia model target (360° option) to track the CAD model of the compressor as if it were the real compressor. Then, during the demonstration, the video is shown on a fixed large screen, and in the AR manual the instructions are displayed attached to it. The main drawback of the first solution is that everyone working on the authoring of the AR manual must keep the printed image, but it shows the product at real scale, and interaction during the demonstration is very simple. The second solution does not need a printed marker during the demonstration, but it does need a screen. Moreover, the compressor model is resized, and interaction is awkward since the user has to play the video on the screen to rotate the compressor. The two solutions were evaluated together with the company, and the first one was preferred due to its more natural interaction.
Keywords: augmented reality, human computer interaction, operating instructions, maintenance, assembly
Procedia PDF Downloads 126
14468 Analysing Industry Clustering to Develop Competitive Advantage for Wualai Silver Handicraft
Authors: Khanita Tumphasuwan
Abstract:
The Wualai community of Northern Thailand represents important intellectual and social capital, and its silver handicraft products are desirable tourist souvenirs within Chiang Mai Province. This community has been in danger of losing this social and intellectual capital due to the application of an improper tool, the Scottish Enterprise model of clustering. This research aims to analyze and increase the community's competitive advantages to prevent the loss of social and intellectual capital. To improve the Wualai's competitive advantage, the analysis is undertaken using a Porterian cluster approach, including the diamond model, the five forces model, and cluster mapping. Research results suggest that utilizing the community's Buddhist beliefs can foster collaboration between community members and is the only way to improve cluster effectiveness, increase competitive advantage, and in turn conserve the Wualai community.
Keywords: industry clustering, silver handicraft, competitive advantage, intellectual capital, social capital
Procedia PDF Downloads 566
14467 Investigation and Comprehensive Benefit Analysis of 11 Typical Polar-Based Agroforestry Models Based on Analytic Hierarchy Process in Anhui Province, Eastern China
Authors: Zhihua Cao, Hongfei Zhao, Zhongneng Wu
Abstract:
The development of poplar-based agroforestry is necessary due to the influence of the timber market environment in China; it can promote the coordinated development of forestry and agriculture and yield remarkable ecological, economic, and social benefits. The main agroforestry models in the main poplar planting areas, the Huaibei plain and the plain along the Yangtze River, were investigated. Eleven typical poplar management models were selected: pure poplar forest, poplar-rape-soybean, poplar-wheat-soybean, poplar-rape-cotton, poplar-wheat, poplar-chicken, poplar-duck, poplar-sheep, poplar-Agaricus blazei, poplar-oil peony, and poplar-fish, represented by M0-M10, respectively. Twelve indexes related to economic, ecological, and social benefits (annual average cost, net income, ratio of output to investment, payback period of investment, land utilization ratio, utilization ratio of light energy, improvement and system stability of the ecological and production environment, product richness, labor capacity, cultural quality of the labor force, and sustainability) were screened to carry out a comprehensive evaluation and analysis of the 11 typical agroforestry models based on the analytic hierarchy process (AHP). The results showed that the economic benefit of each agroforestry model was in the order M8 > M6 > M9 > M7 > M5 > M10 > M4 > M1 > M2 > M3 > M0. The economic benefit of the poplar-A. blazei model was the highest (332,800 RMB/hm²), followed by the poplar-duck and poplar-oil peony models (109,820 RMB/hm² and 57,226 RMB/hm², respectively). The order of comprehensive benefit was M8 > M4 > M9 > M6 > M1 > M2 > M3 > M7 > M5 > M10 > M0. The economic benefit and comprehensive benefit of each agroforestry model were higher than those of the pure poplar forest. The comprehensive benefit of the poplar-A. blazei model was the highest, and that of the poplar-wheat model ranked second, although its economic benefit was not high. Next were the poplar-oil peony and poplar-duck models. It is suggested that the poplar-wheat model be adopted in the plain along the Yangtze River, and that the whole-cycle mode of poplar-grain, poplar-A. blazei, or poplar-oil peony be adopted in the Huaibei plain, northern Anhui. Furthermore, wheat, rape, and soybean are the main crops before the stand is closed; an agroforestry model with edible fungus or Chinese herbal medicine can be adopted once the stand has closed in order to maximize the comprehensive benefit. The purpose of this paper is to provide a reference for forest farmers in the selection of poplar agroforestry models in the future and to provide basic data for the sustainable and efficient study of poplar agroforestry in Anhui province, eastern China.
Keywords: agroforestry, analytic hierarchy process (AHP), comprehensive benefit, model, poplar
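A minimal sketch of the AHP weighting step, assuming a hypothetical 3x3 pairwise comparison matrix over benefit categories; the judgements below are illustrative only, not the matrix used in the study:

```python
# Hedged sketch: AHP priority weights from a pairwise comparison matrix via the
# principal eigenvector, with a consistency check. The 3x3 matrix (economic vs.
# ecological vs. social benefit) is a made-up illustration.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # economic judged 3x more important than ecological, 5x than social
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                      # priority weights, sum to 1

lam_max = eigvals.real[k]
n = A.shape[0]
CI = (lam_max - n) / (n - 1)         # consistency index
RI = 0.58                            # Saaty's random index for n = 3
print("weights:", w.round(3), "CR:", round(CI / RI, 3))  # CR < 0.1 is acceptable
```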
Procedia PDF Downloads 163
14466 Talent-Priority: Exploring the Human Resource Reengineering Model in Digital Transformation of a Benchmark Company
Authors: Hsiu Hua Hu
Abstract:
Digital transformation has widely affected various industries. It provides technological innovation, process redesign, new business model construction, and talent value creation. This transformation not only allows organizations to obtain and deploy specific technologies and methods suitable for organizational reengineering but is also an important way to solve management problems in human resource (HR) reengineering, business efficiency, and process redesign. In this study, we present the results of a qualitative study that offers insight into a series of key features of reengineering related to digital transformation, and into how to create talent value when companies successfully perform digital transformation and human resource reengineering led by business digitalization strategies, including talent planning, talent acquisition, talent adjustment, and talent development. Drawing from the qualitative investigation findings, we built an inductive model of HR reengineering, which aims to provide research and practical references for future digital transformation and management inquiry.
Keywords: talent value creation, digital transformation, HR reengineering, qualitative study
Procedia PDF Downloads 154
14465 Method for Tuning Level Control Loops Based on Internal Model Control and Closed Loop Step Test Data
Authors: Arnaud Nougues
Abstract:
This paper describes a two-stage methodology derived from internal model control (IMC) for tuning a proportional-integral-derivative (PID) controller for levels or other integrating processes in an industrial environment. The focus is on ease of use and implementation speed, which are critical for an industrial application. Tuning can be done with minimum effort and without the need for time-consuming open-loop step tests on the plant. The first stage of the method applies to levels only: the vessel residence time is calculated from equipment dimensions and used to derive a set of preliminary proportional-integral (PI) settings with IMC. The second stage, re-tuning in closed loop, applies to levels as well as other integrating processes: a tuning correction mechanism has been developed based on a series of closed-loop simulations with model errors. The tuning correction is done from a simple closed-loop step test and the application of a generic correlation between the observed overshoot and the integral time correction. A spin-off of the method is that an estimate of the vessel residence time (for levels) or the open-loop process gain (for other integrating processes) is obtained from the closed-loop data.
Keywords: closed-loop model identification, IMC-PID tuning method, integrating process control, on-line PID tuning adaptation
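A minimal sketch of the first stage only, assuming a SIMC-style IMC rule for an integrating process stands in for the paper's own (unpublished) correlations and closed-loop correction step; the vessel numbers and scaling are made up:

```python
# Hedged sketch: preliminary PI settings for a level (integrating process) derived from
# the vessel residence time. The SIMC-type rule below is a commonly used stand-in; it is
# not the specific correlation developed in the paper.

def level_pi_from_residence_time(volume_m3, flow_m3_per_min, tau_c_min=None):
    """Return (Kc, Ti) for a PI level controller, assuming normalized (% / %) signals."""
    tau_res = volume_m3 / flow_m3_per_min      # vessel residence time [min]
    K = 1.0 / tau_res                          # integrating-process gain under % scaling (assumption)
    if tau_c_min is None:
        tau_c_min = tau_res                    # moderate closed-loop time constant (assumption)
    Kc = 1.0 / (K * tau_c_min)                 # SIMC rule for an integrator, no dead time assumed
    Ti = 4.0 * tau_c_min                       # integral time [min]
    return Kc, Ti

Kc, Ti = level_pi_from_residence_time(volume_m3=12.0, flow_m3_per_min=2.0)
print(f"preliminary PI settings: Kc = {Kc:.2f}, Ti = {Ti:.1f} min")
```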
Procedia PDF Downloads 218
14464 Evaluating the Validity of CFD Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements
Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck
Abstract:
This research presents the validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from the previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The estimated results of the numerical model showed maximum average percent errors of approximately 53% and 37% for wind incident from the North and Northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the geometric mean bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow
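A minimal sketch of the three statistical measures, as they are commonly defined in the dispersion-model evaluation literature; the observed/predicted arrays below are placeholders, not the MUST data:

```python
# Hedged sketch: fractional bias, geometric mean bias and normalized mean square error
# for observed (co) vs. predicted (cp) concentrations.
import numpy as np

def fb(co, cp):
    return 2.0 * (co.mean() - cp.mean()) / (co.mean() + cp.mean())

def mg(co, cp):
    # Geometric mean bias; requires strictly positive concentrations.
    return np.exp(np.log(co).mean() - np.log(cp).mean())

def nmse(co, cp):
    return np.mean((co - cp) ** 2) / (co.mean() * cp.mean())

co = np.array([1.2, 0.8, 2.5, 3.1, 0.6])   # placeholder observed concentrations
cp = np.array([1.0, 0.9, 2.2, 3.5, 0.5])   # placeholder predicted concentrations
print(f"FB = {fb(co, cp):.3f}, MG = {mg(co, cp):.3f}, NMSE = {nmse(co, cp):.3f}")
```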
Procedia PDF Downloads 133
14463 Identity Verification Based on Multimodal Machine Learning on Red Green Blue (RGB) Red Green Blue-Depth (RGB-D) Voice Data
Authors: LuoJiaoyang, Yu Hongyang
Abstract:
In this paper, we experimented with a new approach to multimodal identification using RGB, RGB-D, and voice data. The multimodal combination of RGB and voice data has been applied in tasks such as emotion recognition and has shown good results and stability, and the same holds for identity recognition tasks. We believe that data from different modalities can enhance the performance of the model through mutual reinforcement. We extend the dual-modality setup to three modalities and try to improve the effectiveness of the network by increasing the number of modalities. We also implemented single-modality identification systems separately, tested the data of these different modalities under clean and noisy conditions, and compared their performance with the multimodal model. In the process of designing the multimodal model, we tried a variety of different fusion strategies and finally chose the fusion method with the best performance. The experimental results show that the performance of the multimodal system is better than that of any single modality, especially in dealing with noise, and the multimodal system can achieve an average improvement of 5%.
Keywords: multimodal, three modalities, RGB-D, identity verification
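A minimal sketch of one possible feature-level fusion of the three modalities in PyTorch; the stub encoders, dimensions, and concatenation-based fusion are assumptions for illustration, not the fusion strategy the authors ultimately selected:

```python
# Hedged sketch: concatenation-based fusion of RGB, RGB-D and voice embeddings for
# identity classification. Linear layers stand in for real modality encoders.
import torch
import torch.nn as nn

class TriModalVerifier(nn.Module):
    def __init__(self, rgb_dim=512, depth_dim=256, voice_dim=192, n_identities=100):
        super().__init__()
        self.rgb_head = nn.Linear(rgb_dim, 128)
        self.depth_head = nn.Linear(depth_dim, 128)
        self.voice_head = nn.Linear(voice_dim, 128)
        self.classifier = nn.Sequential(nn.ReLU(), nn.Linear(3 * 128, n_identities))

    def forward(self, rgb_feat, depth_feat, voice_feat):
        fused = torch.cat([
            self.rgb_head(rgb_feat),
            self.depth_head(depth_feat),
            self.voice_head(voice_feat),
        ], dim=1)
        return self.classifier(fused)

model = TriModalVerifier()
logits = model(torch.randn(4, 512), torch.randn(4, 256), torch.randn(4, 192))
print(logits.shape)  # torch.Size([4, 100])
```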
Procedia PDF Downloads 68
14462 Breast Cancer Incidence Estimation in Castilla-La Mancha (CLM) from Mortality and Survival Data
Authors: C. Romero, R. Ortega, P. Sánchez-Camacho, P. Aguilar, V. Segur, J. Ruiz, G. Gutiérrez
Abstract:
Introduction: Breast cancer is a leading cause of death in CLM (2.8% of all deaths in women and 13.8% of deaths from tumors in women). It has the highest tumor incidence in the CLM region, accounting for 26.1% of all tumors excluding nonmelanoma skin cancer (Cancer Incidence in Five Continents, Volume X, IARC). Cancer registries are a good information source for estimating cancer incidence; however, the data are usually available with a lag, which makes their use difficult for health managers. By contrast, mortality and survival statistics have less delay. In order to serve resource planning and respond to this problem, a method is presented to estimate incidence from mortality and survival data. Objectives: To estimate the incidence of breast cancer by age group in CLM in the period 1991-2013, and to compare the data obtained from the model with current incidence data. Sources: Annual number of women by single year of age (National Statistics Institute); annual number of deaths from all causes and from breast cancer (Mortality Registry CLM); breast cancer relative survival probabilities (EUROCARE, Spanish registries data). Methods: A Weibull parametric survival model is obtained from the EUROCARE data. From the survival model, the population data, and the mortality data, the Mortality and Incidence Analysis MODel (MIAMOD) regression model is obtained to estimate the incidence of cancer by age (1991-2013). Results: The resulting model is I(x,t) = Logit[const + age1·x + age2·x² + coh1·(t − x) + coh2·(t − x)²], where I(x,t) is the incidence at age x in period (year) t, and the parameter estimates are: const (constant term of the model) = -7.03; age1 = 3.31; age2 = -1.10; coh1 = 0.61; and coh2 = -0.12. It is estimated that 662 cases of breast cancer were diagnosed in CLM in 1991 (81.51 per 100,000 women). An estimated 1,152 cases (112.41 per 100,000 women) were diagnosed in 2013, representing an increase of 40.7% in the crude incidence rate (1.9% per year). The annual average increases in incidence by age were 2.07% in women aged 25-44 years, 1.01% (45-54 years), 1.11% (55-64 years), and 1.24% (65-74 years). The cancer registries in Spain that send data to the IARC declared an average annual incidence rate of 98.6 cases per 100,000 women for 2003-2007; our model obtains an incidence of 100.7 cases per 100,000 women. Conclusions: A sharp and steady increase in the incidence of breast cancer in the period 1991-2013 is observed. The increase is seen in all age groups considered, although it appears more pronounced in young women (25-44 years). With this method, a good estimate of the incidence can be obtained.
Keywords: breast cancer, incidence, cancer registries, Castilla-La Mancha
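A minimal sketch that evaluates the fitted incidence expression with the parameter estimates quoted above; since the abstract does not specify how age and cohort are scaled before entering the model (MIAMOD normally uses transformed covariates), the inputs below are illustrative indices only, not real ages or years:

```python
# Hedged sketch: inverse-logit evaluation of the reported age-cohort linear predictor.
# The per-100,000 scaling and the covariate values are assumptions for illustration.
import math

CONST, AGE1, AGE2, COH1, COH2 = -7.03, 3.31, -1.10, 0.61, -0.12

def incidence_rate(x, t, per=100_000):
    """Inverse-logit of the linear predictor, expressed per 100,000 women (assumed scaling)."""
    eta = CONST + AGE1 * x + AGE2 * x**2 + COH1 * (t - x) + COH2 * (t - x) ** 2
    p = 1.0 / (1.0 + math.exp(-eta))
    return p * per

# Example: a small grid of (scaled) age indices at a fixed cohort index
for x in (0.4, 0.6, 0.8):
    print(x, round(incidence_rate(x, t=1.0), 1))
```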
Procedia PDF Downloads 310
14461 A Deterministic Large Deviation Model Based on Complex N-Body Systems
Authors: David C. Ni
Abstract:
In previous efforts, we constructed N-body systems by an extended Blaschke product (EBP), which represents a non-temporal and nonlinear extension of the Lorentz transformation. In this construction, we rely on only two parameters, the nonlinear degree and the relative momentum, to characterize the systems. We further explored root computation via iteration with an algorithm extended from the Jenkins-Traub method. The solution sets demonstrate a form of σ + i[-t, t], where σ and t are real numbers, and the interval [-t, t] shows various canonical distributions. In this paper, we correlate the convergent sets in the original domain with the solution sets, which demonstrate large-deviation distributions in the codomain. We proceed to compare our approach with established formulas and principles, such as the Donsker-Varadhan and Wentzell-Freidlin theories. The deterministic model based on this construction allows us to explore applications in the areas of finance and statistical mechanics.
Keywords: nonlinear Lorentz transformation, Blaschke equation, iteration solutions, root computation, large deviation distribution, deterministic model
Procedia PDF Downloads 392
14460 Co-Gasification of Petroleum Waste and Waste Tires: A Numerical and CFD Study
Authors: Thomas Arink, Isam Janajreh
Abstract:
The petroleum industry generates significant amounts of waste in the form of drill cuttings, contaminated soil, and oily sludge. Drill cuttings are a product of off-shore drilling rigs and contain wet soil and total petroleum hydrocarbons (TPH). Contaminated soil comes from different on-shore sites and also contains TPH. The oily sludge is mainly residue or tank-bottom sludge from storage tanks. The two main treatment methods currently used are incineration and thermal desorption (TD). Thermal desorption is a method where the waste material is heated to 450ºC in an anaerobic environment to release volatiles; the condensed volatiles can be used as a liquid fuel. For the thermal desorption unit, dry contaminated soil is mixed with moist drill cuttings to generate a suitable mixture. Thermogravimetric analysis (TGA) of the TD feedstock showed that less than 50% of the TPH is released; the discharged material is stored in landfill. This study proposes co-gasification of petroleum waste with waste tires as an alternative to thermal desorption. Co-gasification with a high-calorific material is necessary since the petroleum waste consists of more than 60 wt% ash (soil/sand), causing its calorific value to be too low for gasification. Since the gasification process occurs at 900ºC and higher, close to 100% of the TPH can be released, according to the TGA. This work consists of three parts: 1) a mathematical gasification model, 2) a reactive-flow CFD model, and 3) experimental work on a drop tube reactor. Extensive material characterization was done by means of proximate analysis (TGA), ultimate analysis (CHNOS flash analysis), and calorific value measurements (bomb calorimeter) to provide the input parameters of the mathematical and CFD models. The mathematical model is a zero-dimensional model based on Gibbs energy minimization together with Lagrange multipliers; it is used to find the product species composition (molar fractions of CO, H2, CH4, etc.) for different tire/petroleum feedstock mixtures and equivalence ratios. The results of the mathematical model act as a reference for the CFD model of the drop tube reactor. With the CFD model, the efficiency and product species composition can be predicted for different mixtures and particle sizes. Finally, both models are verified by experiments on a drop tube reactor (1540 mm long, 66 mm inner diameter, 1400 K maximum temperature).
Keywords: computational fluid dynamics (CFD), drop tube reactor, gasification, Gibbs energy minimization, petroleum waste, waste tires
Procedia PDF Downloads 519
14459 Nonlinear Finite Element Modeling of Unbonded Steel Reinforced Concrete Beams
Authors: Fares Jnaid, Riyad Aboutaha
Abstract:
In this paper, a nonlinear finite element analysis (FEA) was carried out using ANSYS software to build a model capable of predicting the behavior of reinforced concrete (RC) beams with unbonded reinforcement. The FEA model was compared to existing experimental data from other researchers. The existing experimental data consisted of 16 beams that varied from structurally sound beams to beams with unbonded reinforcement with different unbonded lengths and reinforcement ratios. The model was able to predict the ultimate flexural strength, load-deflection curve, and crack pattern of concrete beams with unbonded reinforcement. It was concluded that when the unbonded length is less than 45% of the span, there is no decrease in the ultimate flexural strength due to the loss of bond between the steel reinforcement and the surrounding concrete, regardless of the reinforcement ratio. Moreover, when the reinforcement ratio is relatively low, there is no decrease in ultimate flexural strength regardless of the unbonded length.
Keywords: FEA, ANSYS, unbond, strain
Procedia PDF Downloads 251
14458 Application of the Total Least Squares Estimation Method for an Aircraft Aerodynamic Model Identification
Authors: Zaouche Mohamed, Amini Mohamed, Foughali Khaled, Aitkaid Souhila, Bouchiha Nihad Sarah
Abstract:
The aerodynamic coefficients are important in the evaluation of an aircraft's performance and stability-control characteristics. These coefficients can also be used in automatic flight control systems and in the mathematical model of a flight simulator. The study of the aerodynamic aspects of flying systems is a reserved domain, inaccessible to most developers. Performing tests in a wind tunnel to extract aerodynamic forces and moments requires specific and expensive means. Moreover, the glaring lack of published documentation in this field of study makes the determination of aerodynamic coefficients complicated. This work is devoted to the identification of an aerodynamic model using an aircraft in a virtual simulated environment. We deal with the identification of the system, present an environment framework based on the Software In the Loop (SIL) methodology, and use Microsoft Flight Simulator (FS-2004) as the environment for plane simulation. We propose the Total Least Squares Estimation technique (TLSE) to identify the aerodynamic parameters, which are unknown, variable, classified, and used in the expression of the piloting law. In this paper, we define each aerodynamic coefficient as the mean of its numerical values. All other variations are considered as modeling uncertainties that will be compensated by the robustness of the piloting control.
Keywords: aircraft aerodynamic model, total least squares estimation, piloting the aircraft, robust control, Microsoft Flight Simulator, MQ-1 Predator
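A minimal sketch of the classical SVD-based total least squares solution that underlies TLSE-type parameter identification; the regressors and coefficients below are synthetic placeholders, not flight-simulator records:

```python
# Hedged sketch: total least squares for A x ≈ b, accounting for errors in both A and b.
import numpy as np

def tls(A, b):
    """Solve A x ≈ b in the total least squares sense via SVD of the augmented matrix."""
    n = A.shape[1]
    C = np.hstack([A, b.reshape(-1, 1)])
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                       # right singular vector of the smallest singular value
    return -v[:n] / v[n]

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 3))        # e.g. angle of attack, elevator deflection, pitch rate (placeholders)
x_true = np.array([2.0, -0.5, 1.2])  # made-up "aerodynamic coefficients"
b = A @ x_true + 0.01 * rng.normal(size=200)
A_noisy = A + 0.01 * rng.normal(size=A.shape)

print("TLS estimate:", tls(A_noisy, b).round(3))
```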
Procedia PDF Downloads 285
14457 3D Printing Perceptual Models of Preference Using a Fuzzy Extreme Learning Machine Approach
Authors: Xinyi Le
Abstract:
In this paper, 3D printing orientations were determined through our perceptual model. Some FDM (Fused Deposition Modeling) 3D printers, which are widely used in universities and industries, often require support structures during additive manufacturing. After removing the residual material, some surface artifacts remain at the contact points. These artifacts damage the function and visual effect of the model. To prevent the impact of these artifacts, we present a fuzzy extreme learning machine approach to find printing directions that avoid placing supports in perceptually significant regions. The proposed approach is able to solve the evaluation problem by combining both subjective knowledge and objective information. Our method combines the advantages of fuzzy theory, auto-encoders, and the extreme learning machine. Fuzzy set theory is applied to deal with subjective preference information, and an auto-encoder step is used to extract good features without supervised labels before the extreme learning machine. An extreme learning machine method is then successfully developed for training and learning perceptual models. The performance of this perceptual model is demonstrated on both natural and man-made objects. It is a good human-computer interaction practice which draws on supporting knowledge from both the machine side and the human side.
Keywords: 3D printing, perceptual model, fuzzy evaluation, data-driven approach
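A minimal sketch of a plain extreme learning machine (random hidden layer, closed-form output weights via the pseudoinverse); the fuzzy preference encoding and auto-encoder step described above are omitted, and the inputs and targets are synthetic placeholders:

```python
# Hedged sketch: basic ELM regressor. Hidden weights are random and fixed; only the
# output weights are solved, in closed form, with the Moore-Penrose pseudoinverse.
import numpy as np

class ELM:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))  # random input weights
        self.b = self.rng.normal(size=self.n_hidden)                # random biases
        H = np.tanh(X @ self.W + self.b)                            # hidden-layer outputs
        self.beta = np.linalg.pinv(H) @ y                           # output weights, no iteration
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))                      # e.g. features of candidate print orientations (made up)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)   # placeholder "perceptual preference" score
model = ELM().fit(X[:250], y[:250])
print("test MSE:", round(float(np.mean((model.predict(X[250:]) - y[250:]) ** 2)), 4))
```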
Procedia PDF Downloads 438
14456 Double Layer Security Model for Identification Friend or Foe
Authors: Buse T. Aydın, Enver Ozdemir
Abstract:
In this study, a double-layer authentication scheme between an aircraft and the Air Traffic Control (ATC) tower is designed to prevent any unauthorized aircraft from introducing itself as a friend. The method is a combination of classical cryptographic methods and new-generation physical-layer techniques. The first layer employs the embedded key of the aircraft, which is assumed to be installed during the construction of the utility. The other layer is a physical attribute (flight path, distance, etc.) between the aircraft and the ATC tower. We create a mathematical model so that the information of both layers is employed, and an aircraft is authenticated as friend or foe according to the accuracy of the results of the model. The results computed by the aircraft are compared with the results computed by the ATC tower, and if the values found by the aircraft and the ATC tower match within a certain error margin, we mark the aircraft as a friend. With this method, even if the embedded key is captured by an enemy aircraft, the enemy can easily be detected without the information of the second layer. Overall, in this work, we present a more reliable system by adding a physical layer to the authentication process.
Keywords: ADS-B, communication with physical layer security, cryptography, identification friend or foe
Procedia PDF Downloads 159
14455 Basics of Gamma Ray Burst and Its Afterglow
Authors: Swapnil Kumar Singh
Abstract:
Gamma-ray bursts (GRBs), short and intense pulses of low-energy γ rays, have fascinated astronomers and astrophysicists since their unexpected discovery in the late sixties. GRBs are accompanied by long-lasting afterglows, and they are associated with core-collapse supernovae. The detection of delayed emission at X-ray, optical, and radio wavelengths, or "afterglow," following a γ-ray burst can be described as the emission of a relativistic shell decelerating upon collision with the interstellar medium. While it is fair to say that there is strong diversity amongst the afterglow population, probably reflecting diversity in the energy, luminosity, shock efficiency, baryon loading, progenitor properties, circumstellar medium, and more, the afterglows of GRBs do appear more similar than the bursts themselves, and it is possible to identify common features within afterglows that lead to some canonical expectations. After an initial flash of gamma rays, a longer-lived "afterglow" is usually emitted at longer wavelengths (X-ray, ultraviolet, optical, infrared, microwave, and radio). It is a slowly fading emission at longer wavelengths created by collisions between the burst ejecta and interstellar gas. At X-ray wavelengths, the GRB afterglow fades quickly at first and then transitions to a less steep drop-off (it evolves further after that, but we ignore those phases here). During these early phases, the X-ray afterglow has a spectrum that looks like a power law: flux F ∝ E^β, where E is the energy and β is a number called the spectral index. This kind of spectrum is characteristic of synchrotron emission, which is produced when charged particles spiral around magnetic field lines at close to the speed of light. In addition to the outgoing forward shock that ploughs into the interstellar medium, there is also a so-called reverse shock, which propagates backward through the ejecta. In many ways, "reverse" shock can be misleading: this shock is still moving outward from the rest frame of the star at relativistic velocity, but it is ploughing backward through the ejecta in their frame and is slowing the expansion. This reverse shock can be dynamically important, as it can carry energy comparable to the forward shock. The early phases of the GRB afterglow still provide a good description even if the GRB is highly collimated, since the individual emitting regions of the outflow are not in causal contact at large angles and so behave as though they are expanding isotropically. The majority of afterglows, at the times typically observed, fall in the slow cooling regime, and the cooling break lies between the optical and the X-ray. Numerous observations support this broad picture, for example the spectral energy distribution of the afterglow of a very bright GRB, in which the bluer light (optical and X-ray) appears to follow a typical synchrotron forward-shock expectation (note that the apparent features in the X-ray and optical spectrum are due to the presence of dust within the host galaxy). More research in GRBs and particle physics is needed in order to unfold the mysteries of the afterglow.
Keywords: GRB, synchrotron, X-ray, isotropic energy
Procedia PDF Downloads 87
14454 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients
Authors: Soha A. Bahanshal, Byung G. Kim
Abstract:
Identification of patients at high risk for hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a prediction model to predict hospital readmission for diabetic patients within 30 days of discharge. The core of the prediction model is a modified k-Nearest Neighbor algorithm called the Hybrid Fuzzy Weighted k-Nearest Neighbor algorithm. The prediction is performed on a patient dataset which consists of more than 70,000 patients with 50 attributes. We applied data preprocessing using different techniques in order to handle data imbalance and to fuzzify the data to suit the prediction algorithm. The model has so far achieved a classification accuracy of 80%, compared to other models that use only the k-Nearest Neighbor.
Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission
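A minimal sketch of a generic fuzzy, distance-weighted k-NN in the style of Keller et al., which is the usual basis for fuzzy weighted k-NN variants; the authors' specific hybridization and preprocessing are not reproduced, and the data below are random placeholders, not the 70,000-patient dataset:

```python
# Hedged sketch: class memberships from inverse-distance fuzzy weights over the k neighbors.
import numpy as np

def fuzzy_weighted_knn_predict(X_train, y_train, x, k=5, m=2.0):
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))   # fuzzy distance weights
    classes = np.unique(y_train)
    memberships = np.array([w[y_train[idx] == c].sum() for c in classes])
    memberships = memberships / memberships.sum()
    return classes[np.argmax(memberships)], memberships

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 10))           # stand-in for encoded patient attributes
y_train = rng.integers(0, 2, size=500)         # 1 = readmitted within 30 days (placeholder labels)
label, u = fuzzy_weighted_knn_predict(X_train, y_train, rng.normal(size=10))
print("predicted class:", label, "memberships:", u.round(3))
```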
Procedia PDF Downloads 184
14453 Integrated Genetic-A* Graph Search Algorithm Decision Model for Evaluating Cost and Quality of School Renovation Strategies
Authors: Yu-Ching Cheng, Yi-Kai Juan, Daniel Castro
Abstract:
Energy consumption of buildings has been an increasing concern for researchers and practitioners in the last decade. Sustainable building renovation can reduce energy consumption and carbon dioxide emissions; meanwhile, it can also extend existing buildings' useful life and facilitate environmental sustainability while providing social and economic benefits to society. School buildings are different from other designed spaces as they are more crowded and host the largest portion of daily activities and occupants. Strategies that focus on reducing energy use but also improve the students' learning environment become a significant subject in sustainable school building development. A decision model is developed in this study to solve complicated and large-scale combinatorial, discrete, and deterministic problems such as school renovation projects. The task of this model is to automatically search for the most cost-effective (lower cost and higher quality) renovation strategies. In this study, the search for optimal school building renovation solutions is by nature a large-scale zero-one programming deterministic problem. A* is suitable for solving deterministic problems due to its stable and effective search process, and genetic algorithms (GA) provide opportunities to acquire globally optimal solutions in a short time via their probability-based search process. These two algorithms are combined in this study to consider trade-offs between renovation cost and improved quality; the resulting decision model is able to evaluate current school environmental conditions and suggest an optimal scheme of sustainable school building renovation strategies. Through adoption of this decision model, school managers can overcome existing limitations and transform school buildings into spaces more beneficial to students and friendly to the environment.
Keywords: decision model, school buildings, sustainable renovation, genetic algorithm, A* search algorithm
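A minimal sketch of a bare-bones genetic algorithm for a zero-one renovation-selection problem of this kind (maximize a quality score under a budget); the A* refinement stage and the real cost/quality data are not reproduced, and all numbers are made up:

```python
# Hedged sketch: GA over binary selection vectors; infeasible (over-budget) individuals
# receive zero fitness. Problem data are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_measures = 20
cost = rng.uniform(1, 10, n_measures)          # cost of each candidate renovation measure
quality = rng.uniform(1, 10, n_measures)       # quality improvement of each measure
budget = 40.0

def fitness(pop):
    return np.where(pop @ cost <= budget, pop @ quality, 0.0)

pop = (rng.random((60, n_measures)) < 0.3).astype(int)   # sparse initial population
for _ in range(200):
    f = fitness(pop)
    probs = (f + 1e-9) / (f + 1e-9).sum()
    parents = pop[rng.choice(len(pop), size=len(pop), p=probs)]
    cut = rng.integers(1, n_measures, size=len(pop))      # one-point crossover
    children = np.array([np.r_[parents[i, :cut[i]], parents[(i + 1) % len(pop), cut[i]:]]
                         for i in range(len(pop))])
    mutation = rng.random(children.shape) < 0.02          # bit-flip mutation
    pop = np.where(mutation, 1 - children, children)

best = pop[np.argmax(fitness(pop))]
print("best quality:", round(float(best @ quality), 2), "cost:", round(float(best @ cost), 2))
```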
Procedia PDF Downloads 117
14452 Assessment of Mountain Hydrological Processes in the Gumera Catchment, Ethiopia
Authors: Tewele Gebretsadkan Haile
Abstract:
Mountain terrains are essential to regional water resources, regulating the hydrological processes that feed downstream water supplies. Nevertheless, limited observed earth data in complex topography poses challenges for water resources regulation, which is why satellite products are used in this study. This study evaluates hydrological processes in the mountain catchment of Gumera, Ethiopia, using the HBV-light model with satellite precipitation products (CHIRPS) for the period 1996 to 2010 and an area coverage of 1289 km². The catchment is dominated by cultivation, and elevation ranges from 1788 to 3606 m above sea level. Three meteorological stations were used for downscaling of the satellite data, and one stream flow gauge was used for calibration and validation. The total annual water balance showed 1410 mm of precipitation and 828 mm of simulated surface runoff, compared to 1042 mm of observed stream flow, with an estimated actual evapotranspiration of 586 mm and a potential evapotranspiration of 1495 mm. The temperature ranges from 9°C in winter to 21°C. The catchment contributes 74% of the total runoff as quick runoff and 26% as lower groundwater storage, which sustains stream flow during low-flow periods. The model uncertainty was measured using different metrics: the coefficient of determination, model efficiency, efficiency for log(Q), and flow-weighted efficiency were 0.76, 0.74, 0.66, and 0.70, respectively. The results highlight that the HBV model captures the mountain hydrology, attribute the quick runoff to the traditional agricultural system and the slope of the topography, and recommend adaptation measures for water resource management.
Keywords: mountain hydrology, CHIRPS, Gumera, HBV model
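A minimal sketch of the efficiency measures named above, as they are conventionally defined for hydrological model evaluation; the flow series are placeholders, not the Gumera records:

```python
# Hedged sketch: Nash-Sutcliffe efficiency, log-flow efficiency and R^2 for observed vs.
# simulated discharge.
import numpy as np

def nse(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def log_nse(obs, sim, eps=0.01):
    return nse(np.log(obs + eps), np.log(sim + eps))

def r_squared(obs, sim):
    return np.corrcoef(obs, sim)[0, 1] ** 2

obs = np.array([12.0, 30.0, 55.0, 41.0, 18.0, 9.0, 6.0])   # observed flow, m3/s (made up)
sim = np.array([10.0, 27.0, 60.0, 38.0, 20.0, 8.0, 7.0])   # simulated flow
print(f"R2 = {r_squared(obs, sim):.2f}, NSE = {nse(obs, sim):.2f}, logNSE = {log_nse(obs, sim):.2f}")
```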
Procedia PDF Downloads 9
14451 Quantification of the Gumera Catchment's Mountain Hydrological Processes in Ethiopia
Authors: Tewele Gebretsadkan Haile
Abstract:
Mountain terrains are essential to regional water resources, regulating the hydrological processes that feed downstream water supplies. Nevertheless, limited observed earth data in complex topography poses challenges for water resources regulation, which is why satellite products are used in this study. This study evaluates hydrological processes in the mountain catchment of Gumera, Ethiopia, using the HBV-light model with satellite precipitation products (CHIRPS) for the period 1996 to 2010 and an area coverage of 1289 km². The catchment is dominated by cultivation, and elevation ranges from 1788 to 3606 m above sea level. Three meteorological stations were used for downscaling of the satellite data, and one stream flow gauge was used for calibration and validation. The total annual water balance showed 1410 mm of precipitation and 828 mm of simulated surface runoff, compared to 1042 mm of observed stream flow, with an estimated actual evapotranspiration of 586 mm and a potential evapotranspiration of 1495 mm. The temperature ranges from 9°C in winter to 21°C. The catchment contributes 74% of the total runoff as quick runoff and 26% as lower groundwater storage, which sustains stream flow during low-flow periods. The model uncertainty was measured using different metrics: the coefficient of determination, model efficiency, efficiency for log(Q), and flow-weighted efficiency were 0.76, 0.74, 0.66, and 0.70, respectively. The results highlight that the HBV model captures the mountain hydrology, attribute the quick runoff to the traditional agricultural system and the slope of the topography, and recommend adaptation measures for water resource management.
Keywords: mountain hydrology, CHIRPS, HBV model, Gumera
Procedia PDF Downloads 6
14450 Experimental Study to Determine the Effect of Wire Mesh Pore Size on Natural Draft Chimney Performance
Authors: Md. Mizanur Rahman, Chu Chi Ming, Mohd Suffian Bin Misaran
Abstract:
A chimney is an important part of industrial plants, removing waste heat from the process side to the atmosphere. The increasing demand for energy prompts a rethink of chimney efficiency as well as the search for a valid option to replace forced draft chimney systems in industry. In this study, the air flow rate, exit air temperature, and pressure losses of a natural draft chimney model are studied after modification with wire mesh screens, and the results are compared with those of the chimney model without a wire mesh screen. The heat load varies from 0.1 kW to 1 kW, and three different wire mesh screens with pore sizes of 0.15 mm², 0.40 mm², and 4.0 mm², respectively, are used. The experimental results show that the natural draft chimney model with wire mesh screens significantly reduced the flow losses compared to the system without a wire mesh screen. The natural draft chimney model with the 0.40 mm² pore size wire mesh screen minimized the draft losses better than the others: it enhanced the exit velocity by about 54% and the exit air temperature by about 41%, and decreased the pressure loss by about 20%. Therefore, it can be concluded that wire mesh screens significantly minimize the draft losses in a natural draft chimney and that the 0.40 mm² pore size screen is a suitable option.
Keywords: natural draft chimney, wire mesh screen, natural draft flow, mechanical engineering
Procedia PDF Downloads 317
14449 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty
Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos
Abstract:
Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adapt to human land use as it unfolds. A cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and the development threat index from the NASA Socioeconomic Data and Applications Center; this simulation is used to model the uncertainty in the problem. This research leverages state-of-the-art techniques from the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers like Gurobi or CPLEX. Numerical results based on real data on the jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN, which is used in practice for biodiversity conservation. Our method may better guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.
Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning
Procedia PDF Downloads 208