Search results for: Weibull distribution model
13866 Application of Griddization Management to Construction Hazard Management
Authors: Lingzhi Li, Jiankun Zhang, Tiantian Gu
Abstract:
Hazard management that can prevent fatal accidents and property losses is a fundamental process during the buildings’ construction stage. However, due to a lack of safety supervision resources and operational pressures, the conduct of hazard management in China is poor and ineffective. In order to improve the quality of construction safety management, it is critical to explore the use of information technologies to ensure that the process of hazard management is efficient and effective. After exploring the existing problems of construction hazard management in China, this paper develops a griddization management model for construction hazard management. First, following the knowledge grid infrastructure, a griddization computing infrastructure for construction hazard management is designed which includes five layers: resource entity layer, information management layer, task management layer, knowledge transformation layer and application layer. This infrastructure serves as the technical support for realizing grid management. Second, this study divides construction hazards into grids at the city, district and construction-site levels according to grid principles. Last, a griddization management process including hazard identification, assessment and control is developed. Meanwhile, all stakeholders of construction safety management, such as owners, contractors, supervision organizations and government departments, should take the corresponding responsibilities in this process. Finally, a case study based on actual construction hazard identification, assessment and control is used to validate the effectiveness and efficiency of the proposed griddization management model. The advantage of the designed model is that it realizes information sharing and cooperative management between the various safety management departments.
Keywords: construction hazard, griddization computing, grid management, process
Procedia PDF Downloads 275
13865 A Probability Analysis of Construction Project Schedule Using Risk Management Tool
Authors: A. L. Agarwal, D. A. Mahajan
Abstract:
The construction industry tumbled along with other industries and sectors during the recent economic crash. The construction business has not recovered since and is still passing through a slowdown phase, with the result that many real estate as well as infrastructure projects have not been completed on schedule and within budget. There are many theories, tools and techniques with software packages available in the market to analyze construction schedules. This study focuses on the construction project schedule and the uncertainties associated with construction activities. An infrastructure construction project was considered for the analysis of the uncertainty in project activities affecting project duration, and the analysis was done using the @RISK software. Simulation results arising from three probability distribution functions are compiled to help construction project managers plan a more realistic schedule of the various construction activities as well as of project completion, document it in the contract, and avoid compensations or claims arising out of missing the planned schedule.
Keywords: construction project, distributions, project schedule, uncertainty
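The abstract does not list the three probability distribution functions used in @RISK; purely as an illustration, the sketch below runs a plain Monte Carlo simulation of a serial chain of activities whose durations follow triangular, normal and uniform distributions (all activity names and parameter values are hypothetical) and reports percentiles of the project completion time.

```python
# Minimal Monte Carlo sketch of a schedule risk analysis (not the @RISK model itself).
# Activity names, distributions and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Serial chain of activities: duration samples in days.
earthwork  = rng.triangular(left=20, mode=25, right=40, size=n_sim)
foundation = rng.normal(loc=35, scale=4, size=n_sim)
structure  = rng.uniform(low=60, high=90, size=n_sim)

total = earthwork + foundation + structure          # project duration per run

for p in (50, 80, 95):
    print(f"P{p} completion time: {np.percentile(total, p):.1f} days")
# The P80/P95 values give a contingency margin to document in the contract.
```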
Procedia PDF Downloads 350
13864 A Comparative Study of Various Control Methods for Rendezvous of a Satellite Couple
Authors: Hasan Basaran, Emre Unal
Abstract:
Formation flying of satellites is a mission that involves relative position keeping of different satellites in a constellation. In this study, different control algorithms are compared with one another in terms of ΔV (velocity increment) and tracking error. Various control methods, covering continuous and impulsive approaches, are implemented and tested for satellites flying in low Earth orbit. Feedback linearization, sliding mode control, and model predictive control are designed and compared with an impulsive feedback law, which is based on mean orbital elements. The feedback linearization and sliding mode control approaches have identical mathematical models that include second-order Earth oblateness effects. The model predictive control, on the other hand, does not include any perturbations and assumes a circular chief orbit. The comparison is carried out for four different initial errors and evaluated in terms of velocity increment, root-mean-square error, maximum steady-state error, and settling time. It was observed that the impulsive law consumed the least ΔV, while producing the highest maximum steady-state error. The continuous control laws, however, consumed higher velocity increments and produced lower tracking errors. Finally, an inversely proportional relationship between tracking error and velocity increment was established.
Keywords: chief-deputy satellites, feedback linearization, follower-leader satellites, formation flight, fuel consumption, model predictive control, rendezvous, sliding mode
Procedia PDF Downloads 105
13863 The Importance of including All Data in a Linear Model for the Analysis of RNAseq Data
Authors: Roxane A. Legaie, Kjiana E. Schwab, Caroline E. Gargett
Abstract:
Studies looking at changes in gene expression from RNAseq data often make use of linear models. It is also common practice to focus on a subset of the data for a comparison of interest, leaving aside the samples not involved in that particular comparison. This work shows the importance of including all observations in the modeling process to better estimate variance parameters, even when the samples included are not directly used in the comparison under test. The human endometrium is a dynamic tissue, which undergoes cycles of growth and regression with each menstrual cycle. The mesenchymal stem cells (MSCs) present in the endometrium are likely responsible for this remarkable regenerative capacity. However, recent studies suggest that MSCs also play a role in the pathogenesis of endometriosis, one of the most common medical conditions affecting the lower abdomen in women, in which endometrial tissue grows outside the womb. In this study we compared gene expression profiles from RNAseq between MSCs and non-stem cell counterparts (‘non-MSC’) obtained from women with (‘E’) or without (‘noE’) endometriosis. Raw read counts were used for differential expression analysis using a linear model with the limma-voom R package, including either all samples in the study or only the samples belonging to the subset of interest (e.g. for the comparison ‘E vs noE in MSC cells’, including only MSC samples from E and noE patients but not the non-MSC ones). Using the full dataset we identified about 100 differentially expressed (DE) genes between E and noE samples in MSC samples (adj. p-val < 0.05 and |logFC| > 1), while only 9 DE genes were identified when using only the subset of the data (MSC samples only). Important genes known to be involved in endometriosis, such as KLF9 and RND3, were missed in the latter case. When looking at the MSC vs non-MSC comparison, the linear model including all samples identified 260 genes for noE samples (including the stem cell marker SUSD2), while the subset analysis did not identify any DE genes. When looking at E samples, 12 genes were identified with the first approach and only 1 with the subset approach. Although the stem cell marker RGS5 was found in both cases, the subset test missed important genes involved in stem cell differentiation such as NOTCH3 and other potentially related genes to be used for further investigation and pathway analysis.
Keywords: differential expression, endometriosis, linear model, RNAseq
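The following sketch is not the limma-voom pipeline; it only illustrates, with ordinary least squares on simulated log-expression for a single gene, why fitting all four groups in one linear model leaves more residual degrees of freedom (and hence a more stable variance estimate) for the ‘E vs noE in MSC’ contrast than fitting the MSC subset alone. Group sizes, labels and effect sizes are hypothetical.

```python
# Illustration only: pooled variance estimation in a linear model vs. a subset-only fit.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
groups = ["MSC_E", "MSC_noE", "nonMSC_E", "nonMSC_noE"]
means  = [6.0, 5.0, 4.0, 4.2]                      # hypothetical log-expression means
df = pd.DataFrame({
    "group": np.repeat(groups, 5),
    "expr":  np.concatenate([rng.normal(m, 1.0, 5) for m in means]),
})

full   = smf.ols("expr ~ C(group)", data=df).fit()                        # all 20 samples
subset = smf.ols("expr ~ C(group)", data=df[df.group.str.startswith("MSC")]).fit()

# 'E vs noE in MSC' contrast (treatment coding, reference level MSC_E).
print(full.t_test("C(group)[T.MSC_noE] = 0"))
print("residual df, full model:  ", full.df_resid)    # 16
print("residual df, subset model:", subset.df_resid)  # 8
```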
Procedia PDF Downloads 432
13862 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants
Authors: Balgaisha Mukanova, Natalya Glazyrina, Sergey Glazyrin
Abstract:
This article considers the problem of optimizing the water treatment process for thermal power plants. The problem is of a multiparametric nature. To optimize the process, namely, to reduce the amount of wastewater, a new technology was developed to reuse such water. A mathematical model of the wastewater reuse technology was developed, and the optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange for the non-equilibrium case, and an equation for the ion exchange isotherm. The material balance equation includes a nonlinear term that depends on the kinetics of ion exchange. The direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically, using an implicit point-to-point difference scheme for its approximation. The inverse problem was formulated as the determination of the parameters of the mathematical model of a water treatment plant operating under non-equilibrium conditions, and was solved. Based on the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multiparametric optimization of the water treatment process for thermal power plants allowed the amount of wastewater to be decreased by 15%.
Keywords: direct problem, multiparametric optimization, optimization parameters, water treatment
Procedia PDF Downloads 387
13861 Preparation and Characterization of Nanometric Ni-Zn Ferrite via Different Methods
Authors: Ebtesam. E. Ateia, L. M. Salah, A. H. El-Bassuony
Abstract:
The aim of the present study was to explore the possibility of developing a nanosized material with enhanced structural properties suitable for many applications. Nanostructured ferrite of composition Ni0.5Zn0.5Cr0.1Fe1.9O4 was prepared by sol-gel, co-precipitation, citrate-gel, flash and oxalate precursor methods. Structural and microstructural analyses of the investigated samples were carried out. It was observed that the lattice parameter of the cubic spinel was constant, and that the positions of both the tetrahedral and octahedral bands were fixed. The values of the lattice parameter had a significant role in determining the stoichiometric cation distribution of the composition. The average crystallite sizes of the investigated samples ranged from 16.4 to 69 nm. The discussion is based on a comparison of the average crystallite sizes of the investigated samples, indicating that the co-precipitation method was the most effective one in producing samples with small crystallite size.
Keywords: chemical preparation, ferrite, grain size, nanocomposites, sol-gel
Procedia PDF Downloads 341
13860 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics
Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo
Abstract:
Communication signal modulation recognition technology is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods for communication signals are mainly divided into two major categories: maximum likelihood hypothesis testing methods based on decision theory, and statistical pattern recognition methods based on feature extraction. The statistical pattern recognition method, which includes feature extraction and classifier design, is currently the most commonly used. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on an improved Holder cloud feature, and an extreme learning machine (ELM) is used to classify the extracted features, addressing the real-time requirements of modern warfare. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, improving the performance of the algorithm. The algorithm addresses the problem that a simple feature extraction algorithm based on the Holder coefficient is difficult to use for recognition at low SNR, and it also achieves better recognition accuracy. Simulation results show that the approach in this paper still yields good classification results at low SNR; even when the SNR is -15 dB, the recognition accuracy reaches 76%.
Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model
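The paper’s improved Holder cloud feature is not reproduced here. As a rough sketch only, the code below computes classical Holder-coefficient features of a noisy signal against two reference waveforms (a formulation based on Hölder’s inequality that is common in this literature and may differ in detail from the authors’ version) and feeds them to a basic extreme learning machine. All signals, reference waveforms, p/q values and class labels are synthetic assumptions.

```python
# Sketch: Holder-coefficient features + a basic extreme learning machine (ELM).
# Reference signals, p/q values and the two toy modulation classes are assumptions.
import numpy as np

rng = np.random.default_rng(7)

def holder_coefficient(f, g, p=2.0):
    """Hölder-inequality ratio of |f| against a reference |g| (classical form)."""
    q = p / (p - 1.0)
    f, g = np.abs(f), np.abs(g)
    return np.sum(f * g) / (np.sum(f**p) ** (1 / p) * np.sum(g**q) ** (1 / q) + 1e-12)

def features(signal):
    n = signal.size
    rect = np.ones(n)                                 # rectangular reference
    tri = 1.0 - np.abs(np.linspace(-1, 1, n))         # triangular reference
    spec = np.abs(np.fft.fft(signal))                 # features taken from the spectrum
    return np.array([holder_coefficient(spec, rect), holder_coefficient(spec, tri)])

def make_signal(kind, snr_db=-5):
    t = np.arange(512) / 512
    carrier = np.cos(2 * np.pi * 40 * t)
    base = carrier if kind == 0 else np.sign(np.cos(2 * np.pi * 5 * t)) * carrier
    noise = rng.normal(0, 1, t.size)
    noise *= np.sqrt(np.mean(base**2) / (10 ** (snr_db / 10))) / (np.std(noise) + 1e-12)
    return base + noise

X = np.array([features(make_signal(k % 2)) for k in range(400)])
y = np.array([k % 2 for k in range(400)])

# Basic ELM: random hidden layer, output weights from a least-squares (pseudo-inverse) fit.
W = rng.normal(size=(X.shape[1], 50))
b = rng.normal(size=50)
H = np.tanh(X @ W + b)
beta = np.linalg.pinv(H) @ np.eye(2)[y]
pred = np.argmax(H @ beta, axis=1)
print("training accuracy:", np.mean(pred == y))
```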
Procedia PDF Downloads 156
13859 Signature Verification System for a Banking Business Process Management
Authors: A. Rahaf, S. Liyakathunsia
Abstract:
In today’s world, banks face unprecedented operational pressure that tests the efficiency, effectiveness, and agility of their business processes. In a typical banking process, a person’s authorization is usually based on his or her signature on almost all transactions. Signature verification is considered one of the most significant pieces of information needed for any bank document processing, and banks usually use signature verification to authenticate the identity of individuals. In this paper, a business process model is proposed in order to increase the quality of the verification process and to reduce the time and resources needed. In order to understand the current process, a survey was conducted and distributed among bank employees. After analyzing the survey, a process model was created using the Bizagi modeler, which helps simulate the process after assigning its time and cost. The outcomes show that automation of the signature verification process is highly recommended for a banking business process.
Keywords: business process management, process modeling, quality, signature verification
Procedia PDF Downloads 428
13858 Manufacturing Process of S-Glass Fiber Reinforced PEKK Prepregs
Authors: Nassier A. Nassir, Robert Birch, Zhongwei Guan
Abstract:
The aim of this study is to investigate the fundamental science/technology related to novel S-glass fiber reinforced polyether-ketone-ketone (GF/PEKK) composites and to gain insight into bonding strength and failure mechanisms. Different manufacturing techniques for making this high-temperature pre-impregnated composite (prepreg) were investigated, i.e. mechanical deposition, electrostatic powder deposition, and dry powder prepregging. Generally, the results of this investigation showed that it was difficult to control the distribution of the resin powder evenly on both sides of the fibers within a specific percentage. The most successful approach was dry powder prepregging, in which the fibers were coated evenly with an adhesive that served as a temporary binder to hold the resin powder in place on the glass fiber fabric.
Keywords: dry powder technique, PEKK, S-glass, thermoplastic prepreg
Procedia PDF Downloads 204
13857 Three-Dimensional Finite Element Analysis of Geogrid-Reinforced Piled Embankments on Soft Clay
Authors: Mahmoud Y. Shokry, Rami M. El-Sherbiny
Abstract:
This paper aims to highlight the role of some parameters that may have a noticeable impact on the numerical analysis/design of embankments. It presents the results of a three-dimensional (3-D) finite element analysis of a monitored earth embankment that was constructed on a soft clay formation stabilized by cast-in-situ piles, using the software PLAXIS 3D. A comparison between the predicted and the monitored responses is presented to assess the adequacy of the adopted numerical model, which was then used in the targeted parametric study. Moreover, a comparison was performed between the results of the 3-D analyses and analytical solutions. The paper concludes that using mono pile caps decreased both total and differential settlements and increased the efficiency of the piled embankment system. The study of geogrids revealed that they can contribute to decreasing the settlement and maximizing the part of the embankment load transferred to the piles. Moreover, it was found that increasing the stiffness of the geogrids provides higher tensile forces and hence has a more effective influence on the embankment load carried by the piles than using multiple layers of low-stiffness geogrid. The efficiency of the piled embankment system was also found to be greater for higher embankments than for low embankments. The comparison between the numerical 3-D model and the theoretical design methods revealed that many analytical solutions are conservative and less accurate than 3-D finite element numerical models.
Keywords: efficiency, embankment, geogrids, soft clay
Procedia PDF Downloads 323
13856 Mobile Augmented Reality for Collaboration in Operation
Authors: Chong-Yang Qiao
Abstract:
Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators with interactive visualization of data and procedures, making equipment and systems easier to understand. Operators communicate and coordinate with each other remotely for continuous tasks and for information and data exchange between the control room and the work site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience in cooperative work by applying augmented reality in the traditional industrial field. The purpose of this exploratory study is to find a cognitive model for multiple-task performance with MAR. In particular, the focus is on the comparison between different tasks and environment factors which influence information processing. Three experiments used interface and interaction designs with start-up, maintenance and stop content embedded in the mobile application. With time demands and human errors as evaluation criteria, and through analysis of the mental processes and behavioral actions during the multiple tasks, heuristic evaluation was used to assess operator performance under different situational factors and to record the information processing in recognition, interpretation, judgment and reasoning. The research identifies the functional properties of MAR and constrains the development of the cognitive model. Conclusions can be drawn that suggest MAR is easy to use and useful for operators in remote collaborative work.
Keywords: mobile augmented reality, remote collaboration, user experience, cognition model
Procedia PDF Downloads 197
13855 Rapid Soil Classification Using Computer Vision with Electrical Resistivity and Soil Strength
Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, P. L. Goh, Grace H. B. Foo, M. L. Leong
Abstract:
This paper presents the evaluation of various soil testing methods such as the four-probe soil electrical resistivity method and cone penetration test (CPT) that can complement a newly developed novel rapid soil classification scheme using computer vision, to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from the local construction industry are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual survey, such that proper treatment and usage can be exercised. However, this process is time-consuming and labor-intensive. Thus, a rapid classification method is needed at the SGs. Four-probe soil electrical resistivity and CPT were evaluated for their feasibility as suitable additions to the computer vision system to further develop this innovative non-destructive and instantaneous classification method. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). It was found from the previous study that the ANN model coupled with ρ can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% based on selected representative soil images. To further improve the technique, the following three items were targeted to be added onto the computer vision scheme: the apparent electrical resistivity of soil (ρ) measured using a set of four probes arranged in Wenner’s array, the soil strength measured using a modified mini cone penetrometer, and w measured using a set of time-domain reflectometry (TDR) probes. Laboratory proof-of-concept was conducted through a series of seven tests with three types of soils – “Good Earth”, “Soft Clay,” and a mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w and CPT measurements can be collectively analyzed to classify soils into “Good Earth” or “Soft Clay” and are feasible as complementing methods to the computer vision system.
Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification
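A minimal sketch of the image-analysis leg described above: GLCM textural parameters computed from grey-level patches (via scikit-image) and fed to a small neural network classifier. The synthetic patches, GLCM settings and network size are assumptions, not the authors’ configuration, and the resistivity, CPT and TDR channels are omitted.

```python
# Sketch: GLCM texture features + ANN for two soil classes ("Good Earth" vs "Soft Clay").
# Patch generation, GLCM parameters and the MLP architecture are illustrative choices.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def fake_patch(coarse):
    """Synthetic 64x64 grey-level patch; coarser texture stands in for 'Good Earth'."""
    noise = rng.normal(0, 1, (64, 64))
    if coarse:                                   # crude smoothing -> coarser texture
        noise = (noise + np.roll(noise, 1, 0) + np.roll(noise, 1, 1)) / 3.0
    img = (noise - noise.min()) / (np.ptp(noise) + 1e-12)
    return (img * 255).astype(np.uint8)

def glcm_features(img):
    glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

X = np.array([glcm_features(fake_patch(coarse=(k % 2 == 0))) for k in range(200)])
y = np.array([k % 2 for k in range(200)])        # 0 = "Good Earth", 1 = "Soft Clay"

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(Xtr, ytr)
print("hold-out accuracy:", clf.score(Xte, yte))
```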
Procedia PDF Downloads 239
13854 Global Direct Search Optimization of a Tuned Liquid Column Damper Subject to Stochastic Load
Authors: Mansour H. Alkmim, Adriano T. Fabro, Marcus V. G. De Morais
Abstract:
In this paper, a global direct search optimization algorithm for a tuned liquid column damper (TLCD), a class of passive structural control device used to reduce vibration, is presented. The objective is to find optimized parameters for the TLCD under stochastic loads from different wind power spectral densities. A verification is made considering the analytical solution of an undamped primary system under white noise excitation. Finally, a numerical example considering a simplified wind turbine model is given to illustrate the efficacy of the TLCD. Results from the random vibration analysis are shown for four types of random wind excitation models, where the response PSDs obtained show good vibration attenuation.
Keywords: generalized pattern search, parameter optimization, random vibration analysis, vibration suppression
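As an illustration of the direct-search idea only (not the authors’ implementation or objective), the sketch below runs a generalized pattern search over two TLCD parameters, minimizing a placeholder function that stands in for the RMS structural response under a given wind spectrum.

```python
# Minimal generalized pattern search (GPS) sketch. The objective is a placeholder for
# the RMS response of the primary structure as a function of TLCD tuning ratio and
# head-loss coefficient; replace it with the actual random-vibration response.
import numpy as np

def objective(x):
    nu, eta = x                                  # hypothetical: tuning ratio, head loss
    return (nu - 0.98) ** 2 + 0.05 * (eta - 4.0) ** 2 + 0.01 * np.sin(8 * nu) ** 2

def pattern_search(f, x0, step=0.5, tol=1e-5, max_iter=1000):
    x, fx = np.asarray(x0, float), f(x0)
    directions = np.vstack([np.eye(len(x0)), -np.eye(len(x0))])   # coordinate polling
    for _ in range(max_iter):
        improved = False
        for d in directions:
            trial = x + step * d
            ft = f(trial)
            if ft < fx:                          # successful poll: accept the trial point
                x, fx, improved = trial, ft, True
                break
        if not improved:
            step *= 0.5                          # unsuccessful poll: refine the mesh
            if step < tol:
                break
    return x, fx

x_opt, f_opt = pattern_search(objective, x0=[0.5, 1.0])
print("optimized TLCD parameters:", x_opt, "objective:", f_opt)
```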
Procedia PDF Downloads 275
13853 Designating and Evaluating a Healthy Eating Model at the Workplace: A Practical Strategy for Preventing Non-Communicable Diseases in Aging
Authors: Mahnaz Khalafehnilsaz, Rozina Rahnama
Abstract:
Introduction: The aging process has been linked to a wide range of non-communicable diseases that cause a loss of health-related quality of life. This process can be worsened if adults do not follow an active and healthy lifestyle, especially in the workplace. This setting may not only create a sedentary lifestyle but also lead to obesity and overweight in the long term and create unhealthy and inactive aging. In addition, eating habits are known to be associated with active aging. Therefore, it is very valuable to know the eating patterns of people at work in order to detect and prevent diseases in the coming years. This study aimed to design and test a model to improve eating habits among employees at an industrial complex as a practical strategy. Material and method: The present research was a mixed-method study with a sequential exploratory design, carried out in two phases, qualitative and quantitative, in 2018. In the first step, participants were selected by purposive sampling (n=34) to ensure representation of different job roles, hours worked, gender, grade, and age groups, and semi-structured interviews were used. All interviews were conducted in the workplace and were audio recorded, transcribed verbatim, and analyzed using the Strauss and Corbin approach. The interview question was, “what were their experiences of eating at work, and how could these nutritional habits affect their health in old age.” Finally, a total of 1500 basic codes were generated at the open coding step; they were merged to create 17 classes and six concepts, and a conceptual model was designed. The second phase of the study was conducted in the form of a cross-sectional study. After verification of the research tool, the developed questionnaire was administered to a group of employees; in order to test the conceptual model of the study, a total of 500 subjects were included in the psychometric testing. Findings: Six main concepts were identified: 1. undesirable control of stress, 2. lack of eating knowledge, 3. effect of the social network, 4. lack of motivation for healthy habits, 5. environmental-organizational intensifiers, and 6. unhealthy eating behaviors. The core concept was “motivation loss to perform preventive behavior.” The main constructs of the motivation-based model for the promotion of eating habits are modification and promotion of eating habits, increase of knowledge and competency, conveying a culture of healthy nutrition behavior, the effect of behavioral models especially in older age, and desirable control of stress. Conclusion: A key factor in unhealthy eating behavior at the workplace is a lack of motivation, which can be an obstacle to conducting preventive behaviors at work and can affect the healthy aging process in the long term. The motivation-based model could be considered an effective conceptual framework and instrument for designing interventions that promote healthy and active aging.
Keywords: aging, eating habits, older age, workplace
Procedia PDF Downloads 101
13852 Predictions for the Anisotropy in Thermal Conductivity in Polymers Subjected to Model Flows by Combination of the eXtended Pom-Pom Model and the Stress-Thermal Rule
Authors: David Nieto Simavilla, Wilco M. H. Verbeeten
Abstract:
The viscoelastic behavior of polymeric flows under isothermal conditions has been extensively researched. However, most of the processing of polymeric materials occurs under non-isothermal conditions, and understanding the linkage between the thermo-physical properties and the process state variables remains a challenge. Furthermore, the cost and energy required to manufacture, recycle and dispose of polymers is strongly affected by the thermo-physical properties and their dependence on state variables such as temperature and stress. Experiments show that thermal conductivity in flowing polymers is anisotropic (i.e. direction dependent). This phenomenon has been previously omitted in the study and simulation of industrially relevant flows. Our work combines experimental evidence of a universal relationship between the thermal conductivity and stress tensors (i.e. the stress-thermal rule) with differential constitutive equations for the viscoelastic behavior of polymers to provide predictions for the anisotropy in thermal conductivity in uniaxial, planar, equibiaxial and shear flow in commercial polymers. A particular focus is placed on the eXtended Pom-Pom model, which is able to capture the non-linear behavior in both shear and elongational flows. The predictions provided by this approach are amenable to implementation in finite element packages, since viscoelastic and thermal behavior can be described by a single equation. Our results include predictions of flow-induced anisotropy in thermal conductivity for low- and high-density polyethylene as well as confirmation of our method through comparison with a number of thermoplastic systems for which measurements of anisotropy in thermal conductivity are available. Remarkably, this approach allows for universal predictions of anisotropy in thermal conductivity that can be used in simulations of complex flows in which only the most fundamental rheological behavior of the material has been previously characterized (i.e. there is no need for additional adjusting parameters other than those in the constitutive model). Accounting for polymer anisotropy in thermal conductivity in industrially relevant flows benefits the optimization of manufacturing processes as well as the mechanical and thermal performance of finalized plastic products during use.
Keywords: anisotropy, differential constitutive models, flow simulations in polymers, thermal conductivity
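For reference, one common statement of the stress-thermal rule mentioned above relates the deviatoric (anisotropic) part of the thermal conductivity tensor linearly to the deviatoric stress. Normalization conventions vary between authors, so the form below is a hedged sketch rather than the exact expression used in the paper.

```latex
% One common form of the stress-thermal rule (conventions vary between authors):
% the anisotropic part of the conductivity tensor is proportional to the deviatoric
% stress, with C_t the stress-thermal coefficient.
\[
  k_{ij} - \tfrac{1}{3}\,k_{mm}\,\delta_{ij}
  \;=\; C_t \left( \sigma_{ij} - \tfrac{1}{3}\,\sigma_{mm}\,\delta_{ij} \right)
\]
% e.g. in simple shear flow this implies
\[
  k_{xx} - k_{yy} \;=\; C_t\, N_1 , \qquad N_1 = \sigma_{xx} - \sigma_{yy}.
\]
```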
Procedia PDF Downloads 182
13851 Speed Control of Brushless DC Motor Using PI Controller in MATLAB Simulink
Authors: Do Chi Thanh, Dang Ngoc Huy
Abstract:
Nowadays, the growing use of variable speed drive systems in small-scale and large-scale applications such as the electric vehicle industry, household appliances, medical equipment, and other industrial fields has led to the development of BLDC (brushless DC) motors. A BLDC drive has many advantages, such as higher efficiency, better speed-torque characteristics, high power density, and low maintenance cost compared to other conventional motors. Most BLDC motors use a proportional-integral (PI) controller and a pulse width modulation (PWM) scheme for speed control. This article describes a simulation model of BLDC motor drive control built with the MATLAB - SIMULINK simulation software. The simulation model includes a BLDC motor dynamics block, a Hall sensor signal generation block, an inverter block, and a PI controller.
Keywords: brushless DC motor, BLDC, six-step inverter, PI speed
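The Simulink model itself is not reproduced here; as a rough stand-in, the sketch below simulates a discrete PI speed loop around a first-order approximation of the motor’s mechanical dynamics. The gains, motor constants, limits and sampling time are hypothetical, and the six-step inverter and Hall-sensor commutation are abstracted away.

```python
# Sketch: discrete PI speed control of a first-order motor model (inverter/commutation omitted).
# J, B, Kt, gains and limits are hypothetical values.
import numpy as np

J, B, Kt = 0.002, 0.001, 0.1        # inertia, friction, torque constant (assumed)
Kp, Ki = 0.6, 8.0                   # PI gains (assumed)
dt, t_end = 1e-3, 1.0
w_ref = 150.0                       # speed reference, rad/s

w, integ = 0.0, 0.0
history = []
for _ in range(int(t_end / dt)):
    err = w_ref - w
    u = Kp * err + Ki * integ
    i_cmd = np.clip(u, -10.0, 10.0)                 # current command with saturation
    if i_cmd == u:                                  # simple anti-windup: freeze integrator when saturated
        integ += err * dt
    torque = Kt * i_cmd
    w += dt * (torque - B * w) / J                  # mechanical dynamics, explicit Euler
    history.append(w)

print(f"final speed: {history[-1]:.1f} rad/s (reference {w_ref} rad/s)")
```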
Procedia PDF Downloads 74
13850 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano
Abstract:
A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for a CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the existing data in terms of compression ratio, count time, and locate time, except for evenly distributed data such as protein data. The observation from the experiments is that the distribution of φ is more important for the compression ratio than the alphabet size: unevenly distributed φ gives a better compression effect, and the larger the number of hits, the longer the count and locate times.
Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA
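The PEF-CSA itself is not reconstructed here; the sketch below only shows the baseline structure that such self-indexes represent in compressed form, a plain (uncompressed) suffix array, together with the count query that compressed variants also support. Construction is naive and intended only for illustration on short texts.

```python
# Sketch: plain suffix array with a count(pattern) query (the uncompressed baseline that
# self-indexes such as PEF-CSA, FMI and Sad-CSA encode in compressed form).

def build_suffix_array(text):
    # Naive O(n^2 log n) construction; linear-time algorithms (e.g. SA-IS) exist.
    return sorted(range(len(text)), key=lambda i: text[i:])

def count(text, sa, pattern):
    """Number of occurrences of pattern via binary search on the sorted suffixes."""
    m = len(pattern)
    def search(strict):
        lo, hi = 0, len(sa)
        while lo < hi:
            mid = (lo + hi) // 2
            prefix = text[sa[mid]:sa[mid] + m]
            if prefix < pattern or (strict and prefix == pattern):
                lo = mid + 1
            else:
                hi = mid
        return lo
    # upper bound minus lower bound over suffixes whose m-prefix equals the pattern
    return search(strict=True) - search(strict=False)

text = "abracadabra"
sa = build_suffix_array(text)
print(count(text, sa, "abra"))   # -> 2
```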
Procedia PDF Downloads 252
13849 Analysis Rotor Bearing System Dynamic Interaction with Bearing Supports
Abstract:
Frequently, in the design of machines, some of the parameters that directly affect the rotor dynamics of the machines are not accurately known. In particular, bearing support stiffness is one such parameter. One of the most basic principles to grasp in rotor dynamics is the influence of the bearing stiffness on the critical speeds and mode shapes associated with a rotor-bearing system. Taking a rig shafting as an example, this paper studies the lateral vibration of a multi-degree-of-freedom rotor using the finite element method (FEM). The FEM model is created, and the eigenvalues and eigenvectors are calculated and analyzed to find the natural frequencies, critical speeds and mode shapes. The critical speeds and mode shapes are then analyzed for a set of bearing stiffness changes. The model made it possible to identify the critical speeds and the bearings that have an important influence on the vibration behavior.
Keywords: lateral vibration, finite element method, rig shafting, critical speed
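As a much-reduced stand-in for the rig-shafting FEM, the sketch below solves the eigenvalue problem of a two-degree-of-freedom rotor/bearing model for a sweep of bearing stiffness values, showing how the undamped natural frequencies (and hence critical speeds) shift with support stiffness. All masses and stiffnesses are hypothetical.

```python
# Sketch: effect of bearing stiffness on the natural frequencies of a 2-DOF rotor model
# (rotor mass on a flexible shaft supported by a bearing pedestal). Values are hypothetical.
import numpy as np

m_rotor, m_pedestal = 50.0, 10.0        # kg
k_shaft = 2.0e7                         # N/m

M = np.diag([m_rotor, m_pedestal])
for k_bearing in (1e6, 1e7, 1e8):       # sweep of bearing support stiffness, N/m
    K = np.array([[k_shaft, -k_shaft],
                  [-k_shaft, k_shaft + k_bearing]])
    eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
    wn = np.sqrt(np.sort(eigvals.real))             # natural frequencies, rad/s
    rpm = wn * 60 / (2 * np.pi)                     # undamped critical speeds, rev/min
    print(f"k_bearing = {k_bearing:.0e} N/m -> critical speeds ~ {rpm.round(0)} rpm")
```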
Procedia PDF Downloads 340
13848 Determination of Effect Factor for Effective Parameter on Saccharification of Lignocellulosic Material by Concentrated Acid
Authors: Sina Aghili, Ali Arasteh Nodeh
Abstract:
The use of tamarisk, a new group of lignocellulosic materials, to produce fermentable sugars in the bio-ethanol process was studied. The overall aim of this work was to establish the optimum conditions for acid hydrolysis of this new material and a mathematical model predicting glucose release as a function of the operating variables. Sulfuric acid concentrations in the range of 20 to 60% (w/w), process temperatures between 60 and 95 °C, hydrolysis times from 120 to 240 min and solid contents of 5, 10 and 15% (w/w) were used as hydrolysis conditions. HPLC was used to analyze the product. This analysis indicated that glucose was the main fermentable sugar and that it increased with time, temperature and solid content, while acid concentration had a parabolic influence on glucose production. The process was modeled by a quadratic equation. From the curve study and the model, it was found that 42% acid concentration, 15% solid content and 90 °C were the optimum conditions.
Keywords: fermentable sugar, saccharification, wood, hydrolysis
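The fitted quadratic equation itself is not given in the abstract; purely to illustrate the modeling step, the sketch below fits a second-order polynomial of glucose yield versus acid concentration to made-up data and locates the vertex, mirroring the reported parabolic influence of acid concentration.

```python
# Sketch: quadratic fit of glucose yield vs. sulfuric acid concentration (synthetic data),
# and location of the optimum from the fitted parabola. Data values are hypothetical.
import numpy as np

acid = np.array([20, 30, 40, 50, 60], dtype=float)      # % w/w
glucose = np.array([8.0, 14.5, 18.2, 16.0, 9.5])        # g/L, made-up yields

a, b, c = np.polyfit(acid, glucose, deg=2)              # glucose ~ a*acid^2 + b*acid + c
acid_opt = -b / (2 * a)                                 # vertex of the parabola
print(f"fitted optimum acid concentration ~ {acid_opt:.1f}% w/w")
```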
Procedia PDF Downloads 334
13847 Colour and Curcuminoids Removal from Turmeric Wastewater Using Activated Carbon Adsorption
Authors: Nattawat Thongpraphai, Anusorn Boonpoke
Abstract:
This study aimed to determine the removal of colour and curcuminoids from turmeric wastewater using granular activated carbon (GAC) adsorption. The adsorption isotherms and kinetic behavior of colour and curcuminoids were investigated using batch and fixed-bed column tests. The results indicated that the removal efficiencies of colour and curcuminoids were 80.13 and 78.64%, respectively, at 8 hr of equilibrium time. The adsorption isotherms of colour and curcuminoids were well fitted by the Freundlich adsorption model. The maximum adsorption capacities for colour and curcuminoids were 130 Pt-Co/g and 17 mg/g, respectively. The continuous experiment data showed that the exhaustion concentrations of colour and curcuminoids occurred at 39 hr of operation time. The adsorption characteristics of colour and curcuminoids from turmeric wastewater on GAC can be described by the Thomas model. The maximum adsorption capacities obtained from the kinetic approach were 39954 Pt-Co/g and 0.0516 mg/kg for colour and curcuminoids, respectively. Moreover, the decrease in colour and curcuminoids concentration during the service time showed a similar trend.
Keywords: adsorption, turmeric, colour, curcuminoids, activated carbon
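As an illustration of the isotherm step, the sketch below fits the Freundlich model q_e = K_F * C_e**(1/n) to made-up equilibrium data with scipy; the numbers are placeholders, not the study's measurements.

```python
# Sketch: fitting the Freundlich isotherm q_e = K_F * C_e**(1/n) to hypothetical
# equilibrium data for curcuminoids adsorption on GAC.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])      # mg/L at equilibrium (made-up)
qe = np.array([4.1, 6.3, 8.9, 12.0, 16.5])       # mg/g adsorbed (made-up)

(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[1.0, 2.0])
print(f"K_F = {KF:.2f}, 1/n = {1 / n:.2f}")
```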
Procedia PDF Downloads 425
13846 Robust Fault Diagnosis for Wind Turbine Systems Subjected to Multi-Faults
Authors: Sarah Odofin, Zhiwei Gao, Sun Kai
Abstract:
Operations, maintenance and reliability of wind turbines have received much attention over the years due to the rapid expansion of wind farms. This paper explores an early fault diagnosis technique based on a unique scheme for a 5 MW wind turbine system that is optimized by a genetic algorithm to be very sensitive to faults and resilient to disturbances. A quantitative model-based analysis is practical for primary fault diagnosis and monitoring assessment, to minimize the downtime mostly caused by component breakdown and to maintain consistent productivity. Simulation results are computed to validate the wind turbine model, demonstrating system performance in a practical application with examples of fault types. The results show the satisfactory effectiveness of the approach, investigated in a MATLAB/Simulink/Gatool environment.
Keywords: disturbance robustness, fault monitoring and detection, genetic algorithm, observer technique
Procedia PDF Downloads 380
13845 Human Errors in IT Services, HFACS Model in Root Cause Categorization
Authors: Kari Saarelainen, Marko Jantti
Abstract:
Trending the root causes of service incidents and problems is an important part of proactive problem management and service improvement in IT services. Human-error-related root causes are an important root cause category in IT service management as well, although their proportion among root causes is smaller than in other industries. The research problem in this study is: how should the root causes of incidents related to human errors be categorized in an ITSM organization to effectively support service improvement? Categorization based on IT service management processes and on the Human Factors Analysis and Classification System (HFACS) taxonomy was studied in a case study. HFACS is widely used for human error root cause categorization across many industries. Combining these two categorization models in a two-dimensional matrix was found effective, yet impractical for daily work.
Keywords: IT service management, ITIL, incident, problem, HFACS, Swiss cheese model
Procedia PDF Downloads 489
13844 Fuzzy Logic Control for Flexible Joint Manipulator: An Experimental Implementation
Authors: Sophia Fry, Mahir Irtiza, Alexa Hoffman, Yousef Sardahi
Abstract:
This study presents an intelligent control algorithm for a flexible robotic arm. Fuzzy control is used to control the motion of the arm to maintain the arm tip at the desired position while reducing vibration and increasing the system's speed of response. The fuzzy controller (FC) is based on adding the tip angular position to the arm deflection angle and using their sum as a feedback signal to the control algorithm. This reduces the complexity of the FC in terms of the input variables, number of membership functions, fuzzy rules, and control structure. Also, the design of the fuzzy controller is model-free and uses only our knowledge about the system. To show the efficacy of the FC, the control algorithm is implemented on the flexible joint manipulator (FJM) developed by Quanser. The results show that the proposed control method is effective in terms of response time, overshoot, and vibration amplitude.
Keywords: fuzzy logic control, model-free control, flexible joint manipulators, nonlinear control
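The Quanser implementation is not reproduced; the sketch below is a generic single-input fuzzy PD-style controller over the summed feedback signal (tip angle plus deflection) and its rate, with triangular membership functions and weighted-average defuzzification. The universe ranges, rule table and output levels are assumptions, not the authors' tuning.

```python
# Sketch: minimal fuzzy controller for the summed feedback signal e = tip angle error +
# deflection angle, and its derivative de. Ranges, rules and output levels are assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzify(x, span):
    # Negative / Zero / Positive fuzzy sets over [-span, span].
    return np.array([tri(x, -2 * span, -span, 0.0),
                     tri(x, -span, 0.0, span),
                     tri(x, 0.0, span, 2 * span)])

out_levels = np.array([-5.0, 0.0, 5.0])                 # N, Z, P output singletons (volts)
rules = np.array([[0, 0, 1],                            # e negative
                  [0, 1, 2],                            # e zero
                  [1, 2, 2]])                           # e positive

def fuzzy_control(e, de, e_span=0.5, de_span=2.0):
    mu_e, mu_de = fuzzify(e, e_span), fuzzify(de, de_span)
    w = np.outer(mu_e, mu_de)                           # rule firing strengths (product AND)
    return np.sum(w * out_levels[rules]) / (np.sum(w) + 1e-12)   # weighted-average defuzzification

print(fuzzy_control(e=0.3, de=-0.5))                    # example control action
```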
Procedia PDF Downloads 118
13843 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopted the Waterfall Model
Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman
Abstract:
Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under budget constraints and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to design mid-size enterprise software by using the waterfall model, one of the SDLC (software development life cycle) models, in a cost-effective way. To fulfill the research objectives, in this study we developed mid-sized enterprise software named “BSK Management System” that assists enterprise software clients with information resource management and performs complex organizational tasks. The waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants has been conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.
Keywords: end-user application development, enterprise software design, information resource management, usability
Procedia PDF Downloads 438
13842 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms
Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga
Abstract:
Today, websites contain very interesting applications, but there are only a few methodologies to analyze user navigation through a website and determine whether the website is being put to correct use. Web logs are typically only examined when a major attack or malfunction occurs, although they contain a lot of interesting information about the users of the system. Analyzing web logs has become a challenge due to the huge log volume; finding interesting patterns is not easy because of the size and distribution of the logs and the importance of minor details in each log entry. Web logs thus contain very important data about users and the site that are not being put to good use. Retrieving interesting information from the logs gives an idea of what users need, allows users to be grouped according to their various needs, and supports improving the site to make it effective and efficient. The model we built is able to detect attacks or malfunctioning of the system and to perform anomaly detection. Logs become more complex as the volume of traffic and the size and complexity of the website grow. Unsupervised techniques are used in this fully automated solution; expert knowledge is only used in validation. In our approach, the logs are first cleaned and purified to bring them to a common platform with a standard format and structure. After the cleaning module, the web session builder is executed. It outputs two files, a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices, and the Web Sessions file lists the indices of each web session. Then the DBSCAN and EM algorithms are used iteratively and recursively to obtain the best clustering results for the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as parameters, these algorithms evaluate themselves in order to feed better parameter values into subsequent runs. If a cluster is found to be too large, micro-clustering is used. Using the cluster signature module, the clusters are annotated with a unique signature called a fingerprint. In this module, each cluster is fed to the association rule learning module; if an access sequence is output with confidence and support values of 1, it is a potential signature for the cluster. The occurrences of the access sequence are then checked in the other clusters, and if it is found to be unique to the cluster considered, the cluster is annotated with that signature. These signatures are used in anomaly detection, prevention of cyber attacks, real-time dashboards that visualize users accessing web pages, prediction of user actions, and various other applications in finance, university websites, news and media websites, etc.
Keywords: anomaly detection, clustering, pattern recognition, web sessions
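A minimal sketch of the clustering core only: sessions represented as feature vectors (synthetic stand-ins here for the URL-frequency features built from the Indexed URLs and Web Sessions files), clustered with DBSCAN and an EM-fitted Gaussian mixture, and scored with the silhouette coefficient. The parameter-tuning loop, micro-clustering and signature module of the full system are omitted.

```python
# Sketch: clustering web sessions with DBSCAN and EM (GaussianMixture), evaluated by the
# silhouette coefficient. The session feature vectors are synthetic placeholders.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import DBSCAN
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score

sessions, _ = make_blobs(n_samples=300, n_features=5, centers=3,
                         cluster_std=1.0, random_state=3)

db_labels = DBSCAN(eps=3.0, min_samples=5).fit_predict(sessions)
gm_labels = GaussianMixture(n_components=3, random_state=0).fit_predict(sessions)

for name, labels in (("DBSCAN", db_labels), ("EM/GMM", gm_labels)):
    mask = labels != -1                                 # drop DBSCAN noise points
    if len(set(labels[mask])) > 1:
        score = silhouette_score(sessions[mask], labels[mask])
        print(f"{name}: {len(set(labels[mask]))} clusters, silhouette = {score:.3f}")
    else:
        print(f"{name}: degenerate clustering; adjust eps / min_samples")
```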
Procedia PDF Downloads 288
13841 Effects of Mechanical Test and Shape of Grain Boundary on Martensitic Transformation in Fe-Ni-C Steel
Authors: Mounir Gaci, Salim Meziani, Atmane Fouathia
Abstract:
The purpose of the present paper is to model the behavior of a metal alloy of the TRIP steel type (Transformation Induced Plasticity) during a solid/solid phase transition. A two-dimensional micromechanical model is implemented in a finite element software (ZEBULON) to simulate the martensitic transformation in an Fe-Ni-C steel grain under a mechanical tensile stress of 250 MPa. The effects of a non-uniform grain boundary and of the mechanical shear load criterion on the transformation and on the TRIP value during martensitic transformation are studied. The suggested mechanical criterion is favourable to the influence of the shear phenomenon on the progression of the martensitic transformation (Magee’s mechanism). The obtained results are in satisfactory agreement with experimental ones and show the influence of the grain boundary shape and the chosen mechanical criterion (SMF) on the transformation parameters.
Keywords: martensitic transformation, non-uniform grain boundary, TRIP, shear mechanical force (SMF)
Procedia PDF Downloads 261
13840 Budgetary Performance Model for Managing Pavement Maintenance
Authors: Vivek Hokam, Vishrut Landge
Abstract:
An ideal maintenance program for an industrial road network is one that would maintain all sections at a sufficiently high level of functional and structural condition. However, due to various constraints such as budget, manpower and equipment, it is not possible to carry out maintenance on all the needy industrial road sections within a given planning period. A rational and systematic priority scheme needs to be employed to select and schedule industrial road sections for maintenance. Priority analysis is a multi-criteria process that determines the best ranking list of sections for maintenance based on several factors. In priority setting, difficult decisions must be made when selecting sections for maintenance. It is more important to repair a section with poor functional condition, which includes an uncomfortable ride, or poor structural condition, i.e. sections that are in danger of becoming structurally unsound. It would seem, therefore, that any rational priority-setting approach must consider the relative importance of the functional and structural condition of the section. Maintenance priority indices and pavement performance models tend to focus mainly on pavement condition, traffic criteria, etc. There is a need to develop a model suitable for use with the limited budget provisions for pavement maintenance. Linear programming is one of the most popular and widely used quantitative techniques. A linear programming model provides an efficient method for determining an optimal decision chosen from a large number of possible decisions. The optimum decision is one that meets a specified objective of management, subject to various constraints and restrictions. The objective here is mainly the minimization of the maintenance cost of roads in the industrial area. In order to determine the objective function for the analysis of the distress model, it is necessary to cast realistic data into the formulation. Each type of repair is quantified in a number of stretches by considering 1000 m as one stretch; the road section considered in this study is 3750 m long. These quantities are put into an objective function for maximizing the number of repairs per stretch. The distresses observed in this section are potholes, surface cracks, rutting and ravelling, and the distress data are measured manually by observing each distress level on a stretch of 1000 m. The maintenance and rehabilitation measures currently followed are based on subjective judgment; hence, there is a need to adopt a scientific approach in order to use the limited resources effectively. It is also necessary to determine the pavement performance and deterioration prediction relationships more accurately, together with the economic benefits to road networks with respect to vehicle operating cost. The road network infrastructure should deliver the best results expected from the available funds. In this paper, the objective function for the distress model is determined by linear programming, and a deterioration model considering overloading is discussed.
Keywords: budget, maintenance, deterioration, priority
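To make the linear-programming step concrete, the sketch below maximizes the number of 1000 m repair units across the four distress types subject to a budget constraint, using scipy. The unit costs, budget and available quantities are invented for illustration and are not the study's data.

```python
# Sketch: budget-constrained repair selection as a linear program (scipy.optimize.linprog).
# Decision variables: number of 1000 m units repaired per distress type.
# Unit costs, budget and available quantities are hypothetical.
from scipy.optimize import linprog

distress = ["potholes", "surface cracks", "rutting", "ravelling"]
unit_cost = [120.0, 80.0, 200.0, 60.0]      # cost per 1000 m unit (thousand currency units)
available = [3, 4, 2, 4]                    # units of each distress present on the section
budget = 500.0

# Maximize total repaired units  <=>  minimize the negative sum.
res = linprog(c=[-1.0] * 4,
              A_ub=[unit_cost], b_ub=[budget],
              bounds=list(zip([0] * 4, available)),
              method="highs")

for name, x in zip(distress, res.x):
    print(f"{name:15s} repair {x:.2f} units")
print("total units repaired:", round(-res.fun, 2), "of", sum(available))
```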
Procedia PDF Downloads 207
13839 Simulation of Surface Runoff in Mahabad Dam Basin, Iran
Authors: Leila Khosravi
Abstract:
A major part of the drinking water in the northwest of Iran is supplied from the Mahabad reservoir, 80 km northwest of Mahabad. This reservoir collects water from a 750 km² catchment which is undergoing accelerated changes due to deforestation and urbanization. The main objective of this study is to develop a catchment modeling platform which translates ongoing land-use changes, soil data, precipitation and evaporation into the surface runoff of the river discharging into the reservoir, using the Soil and Water Assessment Tool (SWAT) model along with hydro-meteorological records of 1997–2011. A variety of statistical indices were used to evaluate the simulation results for both calibration and validation periods; among them, the robust Nash–Sutcliffe coefficients were found to be 0.52 and 0.62 in the calibration and validation periods, respectively. This project has developed a reliable modeling platform with the benchmark land physical conditions of the Mahabad dam basin.
Keywords: simulation, surface runoff, Mahabad dam, SWAT model
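For reference, the Nash–Sutcliffe efficiencies quoted above (0.52 for calibration, 0.62 for validation) follow the standard definition computed below; the discharge values in the example are placeholders, not the Mahabad records.

```python
# Nash-Sutcliffe efficiency (NSE): 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
# Example discharge values are placeholders, not the Mahabad records.
import numpy as np

def nse(observed, simulated):
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = [12.0, 18.5, 25.1, 9.7, 14.3]     # m3/s, placeholder monthly discharges
sim = [10.8, 20.0, 23.4, 11.2, 13.1]
print(f"NSE = {nse(obs, sim):.2f}")
```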
Procedia PDF Downloads 206
13838 Method of Estimating Absolute Entropy of Municipal Solid Waste
Authors: Francis Chinweuba Eboh, Peter Ahlström, Tobias Richards
Abstract:
Entropy, as an outcome of the second law of thermodynamics, measures the level of irreversibility associated with any process. The identification and reduction of irreversibility in the energy conversion process helps to improve the efficiency of the system. The entropy of pure substances, known as absolute entropy, is determined at an absolute reference point and is useful in the thermodynamic analysis of chemical reactions; however, municipal solid waste (MSW) is a structurally complicated material with unknown absolute entropy. In this work, an empirical model to calculate the absolute entropy of MSW based on the content of carbon, hydrogen, oxygen, nitrogen, sulphur, and chlorine on a dry ash-free (daf) basis is presented. The proposed model was derived by statistical analysis from 117 relevant organic substances with known standard entropies, which represent the main constituents of MSW. The substances were divided into different waste fractions, namely food, wood/paper, textiles/rubber and plastics waste, and the standard entropies of each waste fraction and of the complete mixture were calculated. The correlation derived for the standard entropy of the complete waste mixture was found to be s°MSW = 0.0101C + 0.0630H + 0.0106O + 0.0108N + 0.0155S + 0.0084Cl (kJ K⁻¹ kg⁻¹). The present correlation can be used for estimating the absolute entropy of MSW from the elemental composition of the fuel within the range of 10.3% ≤ C ≤ 95.1%, 0.0% ≤ H ≤ 14.3%, 0.0% ≤ O ≤ 71.1%, 0.0% ≤ N ≤ 66.7%, 0.0% ≤ S ≤ 42.1%, 0.0% ≤ Cl ≤ 89.7%. The model is also applicable for the efficient modelling of a combustion system in a waste-to-energy plant.
Keywords: absolute entropy, irreversibility, municipal solid waste, waste-to-energy
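The reported correlation can be applied directly; the sketch below wraps it in a small function with the stated validity ranges. The example elemental composition is hypothetical.

```python
# Absolute (standard) entropy of MSW from the correlation reported above,
# s° = 0.0101 C + 0.0630 H + 0.0106 O + 0.0108 N + 0.0155 S + 0.0084 Cl  (kJ K^-1 kg^-1),
# with C, H, O, N, S, Cl in wt% on a dry ash-free basis. Example composition is hypothetical.
def msw_absolute_entropy(C, H, O, N, S, Cl):
    ranges = {"C": (C, 10.3, 95.1), "H": (H, 0.0, 14.3), "O": (O, 0.0, 71.1),
              "N": (N, 0.0, 66.7), "S": (S, 0.0, 42.1), "Cl": (Cl, 0.0, 89.7)}
    for name, (val, lo, hi) in ranges.items():
        if not lo <= val <= hi:
            raise ValueError(f"{name} = {val} wt% is outside the correlation range [{lo}, {hi}]")
    return 0.0101 * C + 0.0630 * H + 0.0106 * O + 0.0108 * N + 0.0155 * S + 0.0084 * Cl

print(msw_absolute_entropy(C=52.0, H=7.1, O=38.0, N=1.5, S=0.4, Cl=1.0), "kJ K^-1 kg^-1")
```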
Procedia PDF Downloads 310
13837 Understanding Chronic Pain: Missing the Mark
Authors: Rachid El Khoury
Abstract:
Chronic pain is perhaps the most burdensome health issue facing the planet. Our understanding of the pathophysiology of chronic pain has increased substantially over the past 25 years, including but not limited to changes in the brain. However, we still do not know why chronic pain develops in some people and not in others. Most of the recent developments in pain science that have direct relevance to clinical management relate to our understanding of the role of the brain, the role of the immune system, or the role of cognitive and behavioral factors. Although the biopsychosocial model of pain management was presented decades ago, the bio-reductionist model unfortunately remains at the heart of many practices across professional and geographic boundaries. A large body of evidence shows that nociception is neither sufficient nor necessary for pain. Pain is a conscious experience that can certainly be, and often is, associated with nociception, but it is always modulated by countless neurobiological, environmental, and cognitive factors. This study clarifies current misconceptions about chronic pain and how they are misperceived by clinicians. It also attempts to bridge the considerable gap between what we already know about pain but have somehow disregarded, developments in pain science, and clinical practice.
Keywords: chronic pain, nociception, biopsychosocial, neuroplasticity
Procedia PDF Downloads 63