Search results for: predictive accuracy
1889 Religious Fundamentalism Prescribes Requirements for Marriage and Reproduction
Authors: Steven M. Graham, Anne V. Magee
Abstract:
Most world religions have sacred texts and traditions that provide instruction about and definitions of marriage, family, and family duties and responsibilities. Given that religious fundamentalism (RF) is defined as the belief that these sacred texts and traditions are literally and completely true to the exclusion of other teachings, RF should be predictive of the attitudes one holds about these topics. The goals of the present research were to: (1) explore the extent to which people think that men and women can be happy without marriage, a significant sexual relationship, a long-term romantic relationship, and having children; (2) determine the extent to which RF is associated with these beliefs; and (3) determine how RF is associated with considering certain elements of a relationship to be necessary for thinking of that relationship as a marriage. In Study 1, participants completed a reliable and valid measure of RF and answered questions about the necessity of various elements for a happy life. Higher RF scores were associated with the belief that both men and women require marriage, a sexual relationship, a long-term romantic relationship, and children in order to have a happy life. In Study 2, participants completed these same measures and the pattern of results replicated when controlling for overall religiosity. That is, RF predicted these beliefs over and above religiosity. Additionally, participants indicated the extent to which a variety of characteristics were necessary to consider a particular relationship to be a marriage. Controlling for overall religiosity, higher RF scores were associated with the belief that the following were required to consider a relationship a marriage: religious sanctification, a sexual component, sexual monogamy, emotional monogamy, family approval, children (or the intent to have them), cohabitation, and shared finances.
Interestingly, and unexpectedly, higher RF scores were correlated with less importance placed on mutual consent in order to consider a relationship a marriage. RF scores were uncorrelated with the importance placed on legal recognition or lifelong commitment, and these null findings do not appear to be attributable to ceiling effects or lack of variability. These results suggest that RF constrains views about both the importance of marriage and family in one's life and the characteristics required to consider a relationship a proper marriage. This could have implications for the mental and physical health of believers high in RF, either positive or negative, depending upon the extent to which their lives correspond to the templates prescribed by RF. Additionally, some of these correlations with RF were substantial enough (> .70) that the relevant items could serve as a brief, unobtrusive measure of RF. Future research will investigate these possibilities.
Keywords: attitudes about marriage, fertility intentions, measurement, religious fundamentalism
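The "over and above religiosity" analyses described here amount to partial correlation. A minimal sketch of the idea on synthetic data (the variable names and effect sizes are illustrative, not the study's):

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after regressing the control variable z out of both."""
    z1 = np.column_stack([np.ones_like(z), z])
    rx = x - z1 @ np.linalg.lstsq(z1, x, rcond=None)[0]  # residuals of x on z
    ry = y - z1 @ np.linalg.lstsq(z1, y, rcond=None)[0]  # residuals of y on z
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
religiosity = rng.normal(size=500)
rf = religiosity + rng.normal(size=500)      # RF overlaps with general religiosity
belief = 0.8 * rf + rng.normal(size=500)     # belief driven by the RF-specific part
r = partial_corr(rf, belief, religiosity)    # association net of religiosity
```

A positive `r` here mirrors the paper's claim that RF predicts the beliefs even with religiosity controlled.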
Procedia PDF Downloads 118
1888 Pallet Tracking and Cost Optimization of the Flow of Goods in Logistics Operations by Serial Shipping Container Code
Authors: Dominika Crnjac Milic, Martina Martinovic, Vladimir Simovic
Abstract:
The case study method in this paper shows the implementation of Information Technology (IT) and the Serial Shipping Container Code (SSCC) in a Croatian company that deals with logistics operations and provides logistics services in the cold chain segment. This company is aware of the sensitivity of the goods entrusted to it by the users of its service, as well as of the importance of speed and accuracy in providing logistics services. To that end, it has implemented and used the latest IT to ensure the highest standard of high-quality logistics services to its customers. Looking for efficiency and optimization of supply chain management, while maintaining a high level of quality of the products that are sold, today's users of outsourced logistics services are open to the implementation of new IT products that ultimately deliver savings. By analysing the positive results and the difficulties that arise when using this technology, we aim to provide an insight into the potential of this approach for the logistics service provider.
Keywords: logistics operations, serial shipping container code, information technology, cost optimization
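For reference, the SSCC itself is an 18-digit GS1 key whose last digit is the standard GS1 mod-10 check digit. A sketch of the checksum rule (the extension digit, company prefix and serial below are made up):

```python
def gs1_check_digit(data_digits: str) -> int:
    """GS1 mod-10 check digit: from the rightmost data digit leftwards,
    weights alternate 3, 1, 3, 1, ...; the check digit completes the
    weighted sum to a multiple of 10."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(data_digits)))
    return (10 - total % 10) % 10

def make_sscc(extension: str, prefix_and_serial: str) -> str:
    body = extension + prefix_and_serial       # 17 data digits
    assert len(body) == 17 and body.isdigit()
    return body + str(gs1_check_digit(body))

# The same rule governs GTINs: 12 data digits '400638133393' -> check digit 1.
gtin_check = gs1_check_digit("400638133393")
sscc = make_sscc("1", "0614141123456789")      # hypothetical prefix/serial
```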
Procedia PDF Downloads 360
1887 Leveraging xAPI in a Corporate e-Learning Environment to Facilitate the Tracking, Modelling, and Predictive Analysis of Learner Behaviour
Authors: Libor Zachoval, Daire O Broin, Oisin Cawley
Abstract:
E-learning platforms such as Blackboard have two major shortcomings: limited data capture, a result of the limitations of SCORM (Shareable Content Object Reference Model), and a lack of incorporation of Artificial Intelligence (AI) and machine learning algorithms that could lead to better course adaptations. With the recent development of the Experience Application Programming Interface (xAPI), many additional types of data can be captured, which opens a window of possibilities from which online education can benefit. In a corporate setting, where companies invest billions in the learning and development of their employees, some learner behaviours can be troublesome, for they can hinder the knowledge development of a learner. Behaviours that hinder knowledge development also raise ambiguity about a learner's knowledge mastery, specifically those related to gaming the system. Furthermore, a company receives little benefit from its investment if employees pass courses without possessing the required knowledge, and potential compliance risks may arise. Using xAPI and rules derived from a state-of-the-art review, we identified three learner behaviours, primarily related to guessing, in a corporate compliance course. The identified behaviours are: trying each option for a question, specifically for multiple-choice questions; selecting a single option for all the questions on the test; and continuously repeating tests upon failing as opposed to going over the learning material. These behaviours were detected in learners who repeated the test at least 4 times before passing the course. These findings suggest that gauging the mastery of a learner from multiple-choice test scores alone is a naive approach.
Thus, next steps will consider the incorporation of additional data points and knowledge estimation models to model the knowledge mastery of a learner more accurately, along with analysis of the data for correlations between knowledge development and the identified learner behaviours. Additional work could explore how learner behaviours could be used to make changes to a course: course content may require modification (certain sections of learning material may prove unhelpful to many learners in mastering the intended learning outcomes), as may course design (such as the type and duration of feedback).
Keywords: artificial intelligence, corporate e-learning environment, knowledge maintenance, xAPI
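One of the behaviours above, selecting a single option for every question, can be flagged from xAPI "answered" statements. A simplified sketch (real xAPI statements use IRIs and a richer result object; the dictionary layout here is a stand-in):

```python
from collections import defaultdict

def flags_single_option_guessing(statements):
    """Flag a learner who picked the same response option for every
    question on one test attempt. `statements` is a simplified stand-in
    for xAPI 'answered' statements."""
    by_attempt = defaultdict(list)
    for s in statements:
        if s["verb"] == "answered":
            by_attempt[s["attempt"]].append(s["response"])
    # one distinct response across more than one question => guessing pattern
    return any(len(set(r)) == 1 and len(r) > 1 for r in by_attempt.values())

stmts = [
    {"verb": "answered", "attempt": 1, "question": "q1", "response": "a"},
    {"verb": "answered", "attempt": 1, "question": "q2", "response": "a"},
    {"verb": "answered", "attempt": 1, "question": "q3", "response": "a"},
]
```

The other two behaviours (cycling through options, repeating failed tests) would be analogous aggregations over attempts.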
Procedia PDF Downloads 121
1886 [Keynote Speech]: Simulation Studies of Pulsed Voltage Effects on Cells
Authors: Jiahui Song
Abstract:
In order to predict or explain a complicated biological process, it is important first to construct mathematical models that can be used to yield analytical solutions. Through numerical simulation, mathematical model results can be used to test scenarios that might not be easily attained in a laboratory experiment, or to predict parameters or phenomena. High-intensity, nanosecond pulse electroporation has been a recent development in bioelectrics. A dynamic pore model can be obtained by including a dynamic aspect and a dependence on the pore population density in the pore-formation energy equation, to analyze and predict such electroporation effects. For greater accuracy, with the inclusion of atomistic details, molecular dynamics (MD) simulations were also carried out during this study. Besides inducing pores in cells, external voltages could also be used in principle to modulate action potential generation in nerves. This could have an application in electrically controlled 'pain management'. In addition, a simple model-based rate-equation treatment of the various cellular biochemical processes has been used to predict the pulse-number-dependent cell survival trends.
Keywords: model, high-intensity, nanosecond, bioelectrics
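For context, the pore-density rate equation underlying such dynamic pore models is commonly written in the following asymptotic form (a standard form from the electroporation literature, cf. DeBruin and Krassowska; not necessarily the exact equation used in this work):

```latex
\frac{dN}{dt} = \alpha\, e^{(V_m/V_{ep})^2}
\left( 1 - \frac{N}{N_0}\, e^{-q\,(V_m/V_{ep})^2} \right)
```

where N is the pore density, V_m the transmembrane voltage, V_ep the characteristic electroporation voltage, N_0 the equilibrium pore density at V_m = 0, and alpha and q are constants.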
Procedia PDF Downloads 226
1885 Determination of the Effective Economic and/or Demographic Indicators in Classification of European Union Member and Candidate Countries Using Partial Least Squares Discriminant Analysis
Authors: Esra Polat
Abstract:
Partial Least Squares Discriminant Analysis (PLSDA) is a statistical method for classification that consists of a classical Partial Least Squares Regression (PLSR) in which the dependent variable is a categorical one expressing the class membership of each observation. PLSDA can be applied in many cases when classical discriminant analysis cannot: for example, when the number of observations is low and the number of independent variables is high. When there are missing values, PLSDA can be applied on the data that is available. Finally, it is suited to cases where multicollinearity between the independent variables is high. The aim of this study is to determine the economic and/or demographic indicators which are effective in grouping the 28 European Union (EU) member countries and 7 candidate countries (including the potential candidates Bosnia and Herzegovina (BiH) and Kosova), using the data set obtained from the World Bank database for 2014. Leaving political issues aside, the analysis is concerned only with the economic and demographic variables that could potentially influence a country's eligibility for EU entrance. Hence, this study analyzes both the performance of the PLSDA method in classifying the countries correctly into their pre-defined groups (candidate or member) and the differences between the EU countries and candidate countries in terms of these indicators. As a result of the PLSDA, a percentage correctness of 100% indicates that all 35 countries are classified correctly. Moreover, the most important variables that determine the status of member and candidate countries in terms of economic indicators are identified as 'external balance on goods and services (% GDP)', 'gross domestic savings (% GDP)' and 'gross national expenditure (% GDP)', meaning that for 2014 the economic structure of a country is the most important determinant of EU membership.
Subsequently, the model was validated, to demonstrate its predictive ability, using the data set for 2015. For the prediction sample, 97.14% of the countries are correctly classified. An interesting result is obtained for BiH, which is still only a potential candidate for the EU but is predicted as an EU member when the 2015 indicator data set is used as a prediction sample. Although BiH has made a significant transformation from a war-torn country to a semi-functional state, ethnic tensions, nationalistic rhetoric and political disagreements are still evident, which inhibit Bosnian progress towards the EU.
Keywords: classification, demographic indicators, economic indicators, European Union, partial least squares discriminant analysis
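A one-component PLS1-DA can be sketched in a few lines of NumPy. A real PLSDA, as used in this study, extracts several components with deflation, so this is only an illustration of the principle on synthetic two-class data:

```python
import numpy as np

def pls1_da_fit(X, y):
    """One-component PLS1 for a two-class problem, y coded as +/-1.
    Returns (weight vector, regression coefficient, feature means, y mean)."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - xm, y - ym
    w = Xc.T @ yc
    w /= np.linalg.norm(w)           # PLS weight vector (covariance direction)
    t = Xc @ w                       # latent scores
    b = (t @ yc) / (t @ t)           # regress y on the scores
    return w, b, xm, ym

def pls1_da_predict(X, w, b, xm, ym):
    return np.where((X - xm) @ w * b + ym >= 0, 1, -1)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (50, 5)), rng.normal(1, 1, (50, 5))])
y = np.array([-1] * 50 + [1] * 50)
w, b, xm, ym = pls1_da_fit(X, y.astype(float))
acc = (pls1_da_predict(X, w, b, xm, ym) == y).mean()
```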
Procedia PDF Downloads 280
1884 A Near-Optimal Domain Independent Approach for Detecting Approximate Duplicates
Authors: Abdelaziz Fellah, Allaoua Maamir
Abstract:
We propose a domain-independent merging-cluster filter approach, complemented with a set of algorithms, for identifying approximate duplicate entities efficiently and accurately within a single data source and across multiple data sources. The near-optimal merging-cluster filter (MCF) approach is based on the well-tuned Monge-Elkan algorithm extended with an affine variant of the Smith-Waterman similarity measure. We then present constant, variable, and function threshold algorithms that work conceptually in a divide-merge filtering fashion for detecting near duplicates as hierarchical clusters along with their corresponding representatives. The algorithms take recursive refinement approaches in the spirit of filtering, merging, and updating cluster representatives to detect approximate duplicates at each level of the cluster tree. Experiments show the high effectiveness and accuracy of the MCF approach in detecting approximate duplicates, outperforming the seminal Monge-Elkan algorithm on several real-world benchmarks and generated datasets.
Keywords: data mining, data cleaning, approximate duplicates, near-duplicates detection, data mining applications and discovery
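The Monge-Elkan measure averages, over the tokens of one string, the best inner-similarity match among the tokens of the other. A sketch with difflib's ratio standing in for the affine Smith-Waterman inner measure used in the paper:

```python
from difflib import SequenceMatcher

def monge_elkan(a_tokens, b_tokens, inner=None):
    """Monge-Elkan similarity: for each token of A, take the best inner
    similarity against any token of B, then average. `inner` defaults to
    difflib's ratio (a stand-in for affine Smith-Waterman)."""
    inner = inner or (lambda s, t: SequenceMatcher(None, s, t).ratio())
    return sum(max(inner(a, b) for b in b_tokens) for a in a_tokens) / len(a_tokens)

s = monge_elkan(["paul", "johnson"], ["johson", "paule"])  # typo-tolerant match
```

Note that the measure is asymmetric in its arguments, which is why duplicate-detection pipelines often symmetrize it or fix a canonical direction.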
Procedia PDF Downloads 387
1883 Investigating Acute and Chronic Pain after Bariatric Surgery
Authors: Patti Kastanias, Wei Wang, Karyn Mackenzie, Sandra Robinson, Susan Wnuk
Abstract:
Obesity is a worldwide epidemic and is recognized as a chronic disease. Pain in the obese individual is a multidimensional issue. An increase in BMI is positively correlated with pain incidence and severity, especially in central obesity where individuals are twice as likely to have chronic pain. Both obesity and chronic pain are also associated with mood disorders. Pain is worse among obese individuals with depression and anxiety. Bariatric surgery provides patients with an effective solution for long-term weight loss and associated health problems. However, not much is known about acute and chronic pain after bariatric surgery and its contributing factors, including mood disorders. Nurse practitioners (NPs) at one large multidisciplinary bariatric surgery centre led two studies to examine acute and chronic pain and pain management over time after bariatric surgery. The purpose of the initial study was to examine the incidence and severity of acute and chronic pain after bariatric surgery. The aim of the secondary study was to further examine chronic pain, specifically looking at psychological factors that influence severity or incidence of both neuropathic and somatic pain as well as changes in opioid use. The initial study was a prospective, longitudinal study where patients having bariatric surgery at one surgical center were followed up to 6 months postop. Data was collected at 7 time points using validated instruments for pain severity, pain interference, and patient satisfaction. In the second study, subjects were followed longitudinally starting preoperatively and then at 6 months and 1 year postoperatively to capture changes in chronic pain and influencing variables over time. Valid and reliable instruments were utilized for all major study outcomes. In the first study, there was a trend towards decreased acute post-operative pain over time. The incidence and severity of chronic pain was found to be significantly reduced at 6 months post bariatric surgery. 
Interestingly, interference of chronic pain in daily life, such as normal work, mood, and walking ability, was significantly improved at 6 months postop; however, this was not the case with sleep. Preliminary results of the secondary study indicate that pain severity, pain interference, anxiety and depression are significantly improved at 6 months postoperatively. In addition, preoperative anxiety, depression and emotional regulation were predictive of pain interference, but not pain severity. The results of our regression analyses provide evidence for the impact of pre-existing psychological factors on pain, particularly anxiety, in obese populations.
Keywords: bariatric surgery, mood disorders, obesity, pain
Procedia PDF Downloads 304
1882 Cooling Profile Analysis of Hot Strip Coil Using Finite Volume Method
Authors: Subhamita Chakraborty, Shubhabrata Datta, Sujay Kumar Mukherjea, Partha Protim Chattopadhyay
Abstract:
The manufacturing of multiphase high-strength steel in hot strip mills has drawn significant attention due to the possibility of forming low-temperature transformation products of austenite under continuous cooling conditions. In such an endeavor, reliable prediction of the temperature profile of the hot strip coil is essential in order to assess the evolution of microstructure at different locations of the coil, on the basis of the corresponding Continuous Cooling Transformation (CCT) diagram. The temperature distribution profile of the hot strip coil has been determined by using the finite volume method (FVM) vis-à-vis the finite difference method (FDM). It has been demonstrated that FVM offers greater computational reliability in the estimation of contact pressure distribution, and hence the temperature distribution, for curved and irregular profiles, owing to the flexibility in the selection of grid geometry and discrete point positions. Moreover, use of the finite volume concept allows enforcing the conservation of mass, momentum and energy, leading to enhanced accuracy of prediction.
Keywords: simulation, modeling, thermal analysis, coil cooling, contact pressure, finite volume method
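A finite-volume cooling calculation can be illustrated on a 1-D slab with convective faces. The paper's model treats curved coil geometry and contact pressure, so this conveys only the flavour of the method; the material values below are generic steel-like numbers, not the authors' inputs:

```python
import numpy as np

def cool_slab_fvm(T0, T_amb, h, k, rho, cp, dx, dt, steps):
    """Explicit finite-volume update for 1-D conduction in a slab with
    convective loss at both faces. Per-cell energy balance:
    rho*cp*dx*dT/dt = conductive fluxes from neighbours (+ convection at faces)."""
    T = np.array(T0, dtype=float)
    a = k * dt / (rho * cp * dx * dx)
    assert a <= 0.5, "explicit scheme stability limit"
    for _ in range(steps):
        Tn = T.copy()
        Tn[1:-1] += a * (T[:-2] - 2 * T[1:-1] + T[2:])
        # face cells: conduction from one neighbour + convection to ambient
        Tn[0]  += a * (T[1] - T[0])   + h * dt / (rho * cp * dx) * (T_amb - T[0])
        Tn[-1] += a * (T[-2] - T[-1]) + h * dt / (rho * cp * dx) * (T_amb - T[-1])
        T = Tn
    return T

T = cool_slab_fvm(np.full(20, 600.0), 25.0, 50.0, 40.0, 7800.0, 500.0, 0.005, 0.5, 2000)
```

Because each update is a convex combination of cell and ambient temperatures (under the stability limit), the scheme cannot overshoot the ambient value, one of the conservation-style guarantees the abstract credits to FVM.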
Procedia PDF Downloads 473
1881 Identifying Diabetic Retinopathy Complication by Predictive Techniques in Indian Type 2 Diabetes Mellitus Patients
Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad
Abstract:
Predicting the risk of diabetic retinopathy (DR) in Indian type 2 diabetes patients is immensely necessary. India has the second largest number of diabetic patients after China, yet to the best of our knowledge not a single risk score for complications has ever been investigated for it. Diabetic retinopathy is a serious complication and is the topmost reason for visual impairment across countries. Any type or form of DR has been taken as the event of interest, be it mild, background, or grade I, II, III, or IV DR. A sample was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patient data such as sex, age, height, weight, body mass index (BMI), blood sugar fasting (BSF), post prandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high density lipoprotein (HDL), low density lipoprotein (LDL), very low density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity and history of DR. Cox proportional hazards regression is used to design risk scores for the prediction of retinopathy. Model calibration and discrimination are assessed with the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso and elastic net regression. The optimal cut-off point is chosen by Youden's index. The five-year probability of DR is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores developed can be applied by doctors and patients themselves for self-evaluation.
Furthermore, the five-year probabilities can be applied to forecast and monitor the condition of patients. This provides immense benefit in the real-world application of DR prediction in T2DM.
Keywords: Cox proportional hazard regression, diabetic retinopathy, ROC curve, type 2 diabetes mellitus
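Youden's index picks the cut-off on the ROC curve that maximizes sensitivity + specificity - 1. A sketch on made-up risk scores (not the study's data):

```python
def youden_cutoff(scores, labels):
    """Return the threshold maximizing Youden's J = sensitivity + specificity - 1.
    `labels` are 1 (event, e.g. developed DR) or 0; a score >= threshold is
    called positive."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        j = tp / pos + (1 - fp / neg) - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

scores = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
labels = [0,   0,   0,   1,   0,   1,   1,   1]
t, j = youden_cutoff(scores, labels)
```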
Procedia PDF Downloads 186
1880 Dynamic Log Parsing and Intelligent Anomaly Detection Method Combining Retrieval Augmented Generation and Prompt Engineering
Authors: Liu Linxin
Abstract:
As system complexity increases, log parsing and anomaly detection become more and more important in ensuring system stability. However, traditional methods often face the problems of insufficient adaptability and decreasing accuracy when dealing with rapidly changing log contents and unknown domains. To this end, this paper proposes LogRAG, an approach that combines Retrieval Augmented Generation (RAG) with prompt engineering for Large Language Models, applied to log analysis tasks to achieve dynamic parsing of logs and intelligent anomaly detection. By combining real-time information retrieval and prompt optimisation, this study significantly improves the adaptive capability of log analysis and the interpretability of results. Experimental results show that the method performs well on several public datasets, especially in the absence of training data, and significantly outperforms traditional methods. This paper provides a technical path for log parsing and anomaly detection, demonstrating significant theoretical value and application potential.
Keywords: log parsing, anomaly detection, retrieval-augmented generation, prompt engineering, LLMs
Procedia PDF Downloads 29
1879 POD and Wavelets Application for Aerodynamic Design Optimization
Authors: Bonchan Koo, Junhee Han, Dohyung Lee
Abstract:
The research attempts to evaluate the accuracy and efficiency of a design optimization procedure which combines a wavelets-based solution algorithm with a proper orthogonal decomposition (POD) database management technique. The aerodynamic design procedure calls for high-fidelity computational fluid dynamics (CFD) simulations and the consideration of a large number of flow conditions and design constraints. Even with significant advancements in computing power, the current level of integrated design process requires substantial computing time and resources. POD reduces the degrees of freedom of the full system by conducting singular value decomposition on various field simulations. For additional efficiency improvement of the procedure, an adaptive wavelet technique is also employed during the POD training period. The proposed design procedure was applied to the optimization of wing aerodynamic performance. Throughout the research, it was confirmed that the POD/wavelets design procedure could significantly reduce the total design turnaround time and is also able to capture all detailed complex flow features, as in full-order analysis.
Keywords: POD (Proper Orthogonal Decomposition), wavelets, CFD, design optimization, ROM (Reduced Order Model)
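Snapshot POD is essentially a thin wrapper around the SVD: collect field snapshots as columns and keep the leading left-singular vectors. A sketch on a synthetic two-mode "flow" (the mode shapes and energy threshold are illustrative):

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """Snapshot POD: columns of `snapshots` are field samples. Keep the
    leading left-singular vectors capturing `energy` of the total variance."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1
    return U[:, :r], s

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)
# synthetic 'flow fields': two coherent spatial modes plus small noise
snaps = np.column_stack([
    np.sin(2 * np.pi * x) * np.cos(0.3 * k)
    + 0.5 * np.sin(4 * np.pi * x) * np.sin(0.3 * k)
    + 0.01 * rng.normal(size=x.size)
    for k in range(40)])
basis, s = pod_basis(snaps, 0.99)
recon = basis @ (basis.T @ snaps)            # project back onto the reduced basis
err = np.linalg.norm(snaps - recon) / np.linalg.norm(snaps)
```

A handful of modes reconstructing the full snapshot set to within noise is exactly the "reduced degrees of freedom" the abstract refers to.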
Procedia PDF Downloads 467
1878 Anomaly Detection with ANN and SVM for Telemedicine Networks
Authors: Edward Guillén, Jeisson Sánchez, Carlos Omar Ramos
Abstract:
In recent years, a wide variety of applications have been developed with Support Vector Machine (SVM) methods and Artificial Neural Networks (ANN). In general, these methods depend on intrusion knowledge databases such as KDD99, ISCX, and CAIDA, among others. New classes of detectors are generated by machine learning techniques, trained and tested over network databases. Thereafter, detectors are employed to detect anomalies in network communication scenarios according to users' connection behavior. The first detector, based on the training dataset, is deployed in different real-world networks with mobile and non-mobile devices to analyze the performance and accuracy of static detection. The vulnerabilities are based on previous work on telemedicine apps developed by the research group. This paper presents the differences in detection results between several network scenarios when traditional detectors built with artificial neural networks and support vector machines are applied.
Keywords: anomaly detection, back-propagation neural networks, network intrusion detection systems, support vector machines
Procedia PDF Downloads 357
1877 A Comparative Study of k-NN and MLP-NN Classifiers Using GA-kNN Based Feature Selection Method for Wood Recognition System
Authors: Uswah Khairuddin, Rubiyah Yusof, Nenny Ruthfalydia Rosli
Abstract:
This paper presents a comparative study between the k-Nearest Neighbour (k-NN) and Multi-Layer Perceptron Neural Network (MLP-NN) classifiers, using a Genetic Algorithm (GA) as feature selector, for a wood recognition system. The features have been extracted from the images using the Grey Level Co-Occurrence Matrix (GLCM). GA-based feature selection is used mainly to ensure that the database used for training the wood species pattern classifier consists of only optimized features. The feature selection process is aimed at selecting only the most discriminating features of the wood species, to reduce confusion for the pattern classifier. This feature selection approach retains the 'good' features: those that minimize the intra-class distance and maximize the inter-class distance. A wrapper GA is used with the k-NN classifier as fitness evaluator (GA-kNN). The results show that k-NN is the best choice of classifier because it uses a very simple distance calculation algorithm and classification tasks can be done in a short time with good classification accuracy.
Keywords: feature selection, genetic algorithm, optimization, wood recognition system
Procedia PDF Downloads 545
1876 Deep Reinforcement Learning and Generative Adversarial Networks Approach to Thwart Intrusions and Adversarial Attacks
Authors: Fabrice Setephin Atedjio, Jean-Pierre Lienou, Frederica F. Nelson, Sachin S. Shetty, Charles A. Kamhoua
Abstract:
Malicious users exploit vulnerabilities in computer systems, significantly disrupting their performance and revealing the inadequacies of existing protective solutions. Even machine learning-based approaches, designed to ensure reliability, can be compromised by adversarial attacks that undermine their robustness. This paper addresses two critical aspects of enhancing model reliability. First, we focus on improving model performance and robustness against adversarial threats. To achieve this, we propose a strategy harnessing deep reinforcement learning. Second, we introduce an approach leveraging generative adversarial networks to counter adversarial attacks effectively. Our results demonstrate substantial improvements over previous works in the literature, with classifiers exhibiting enhanced accuracy in classification tasks, even in the presence of adversarial perturbations. These findings underscore the efficacy of the proposed model in mitigating intrusions and adversarial attacks within the machine-learning landscape.
Keywords: machine learning, reliability, adversarial attacks, deep-reinforcement learning, robustness
Procedia PDF Downloads 9
1875 Experimental and Numerical Analyses of Tehran Research Reactor
Authors: A. Lashkari, H. Khalafi, H. Khazeminejad, S. Khakshourniya
Abstract:
In this paper, a numerical model is presented. The model is used to analyze steady-state thermo-hydraulics and a reactivity-insertion transient in the TRR reference cores. The model predictions are compared with experiments and with PARET code results. The model uses the piecewise constant method and the lumped parameter method for the coupled point kinetics and thermal-hydraulics modules, respectively. The advantages of the piecewise constant method are simplicity, efficiency and accuracy. A main criterion for the applicability range of this model is that the exit coolant temperature remains below the saturation temperature, i.e. no bulk boiling occurs in the core. The calculated values of power and coolant temperature, in the steady-state and positive reactivity insertion scenarios, are in good agreement with the experimental values. The model is thus a useful tool for the transient analysis of most research reactors encountered in practice. The main objective of this work is to use simple calculation methods and benchmark them with experimental data. This model can also be used for training purposes.
Keywords: thermal-hydraulic, research reactor, reactivity insertion, numerical modeling
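The point-kinetics half of such a model can be integrated directly. The sketch below uses plain explicit Euler with a single delayed-neutron group rather than the paper's piecewise constant scheme, and generic (not TRR) kinetics parameters:

```python
def point_kinetics(rho, beta=0.0065, lam=0.08, Lambda=4e-5, dt=1e-4, t_end=1.0):
    """Explicit Euler integration of one-delayed-group point kinetics:
        dP/dt = (rho - beta)/Lambda * P + lam * C
        dC/dt = beta/Lambda * P - lam * C
    started from the critical steady state P = 1, C = beta*P/(lam*Lambda)."""
    P = 1.0
    C = beta * P / (lam * Lambda)
    for _ in range(int(t_end / dt)):
        dP = ((rho - beta) / Lambda * P + lam * C) * dt
        dC = (beta / Lambda * P - lam * C) * dt
        P += dP
        C += dC
    return P

P_crit = point_kinetics(0.0)    # zero reactivity: power should stay at 1
P_up = point_kinetics(0.001)    # +100 pcm insertion: power rises
```

A positive insertion below beta gives the familiar prompt jump followed by a slow rise on the delayed-neutron time scale, which is the regime the transient experiments probe.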
Procedia PDF Downloads 401
1874 Human Identification and Detection of Suspicious Incidents Based on Outfit Colors: Image Processing Approach in CCTV Videos
Authors: Thilini M. Yatanwala
Abstract:
CCTV (Closed-Circuit Television) surveillance systems have been used in public places for decades, and a large variety of data is being produced every moment. However, most CCTV data is stored in isolation, without integrity, and as a result identifying the behavior of suspicious people along with their location has become difficult. This research was conducted to acquire more accurate and reliable timely information from CCTV video records. The implemented system can identify human objects in public places based on outfit colors. Inter-process communication technologies were used to implement the CCTV camera network to track people on the premises. The research was conducted in three stages. In the first stage, human objects were filtered from other movable objects present in public places. In the second stage, people were uniquely identified based on their outfit colors, and in the third stage an individual was continuously tracked across the CCTV network. A face detection algorithm was implemented using a cascade classifier based on the training model to detect human objects. A Haar-feature-based two-dimensional convolution operator was introduced to identify features of the human face, such as the region of the eyes, the region of the nose and the bridge of the nose, based on the darkness and lightness of facial areas. In the second stage, the outfit colors of human objects were analyzed by dividing the body area into upper left, upper right, lower left and lower right parts. The mean color, mode color and standard deviation of each area were extracted as crucial factors for uniquely identifying a human object using a histogram-based approach. Color-based measurements were written to XML files, and separate directories were maintained to store the XML files related to each camera, according to time stamp.
As the third stage of the approach, inter-process communication techniques were used to implement an acknowledgement-based CCTV camera network to continuously track individuals across a network of cameras. Real-time analysis of the XML files generated by each camera can determine the path of an individual and monitor the full activity sequence. Higher efficiency was achieved by sending and receiving acknowledgements only among adjacent cameras. Suspicious incidents, such as a person staying in a sensitive area for a longer period or a person disappearing from camera coverage, can be detected with this approach. The system was tested on 150 people with an accuracy level of 82%. However, this approach was unable to produce the expected results in the presence of groups of people wearing similar types of outfits. The approach can be applied to any existing camera network without changing the physical arrangement of the CCTV cameras. The study of human identification and suspicious incident detection using outfit color analysis can achieve a higher level of accuracy, and the project will be continued by integrating motion and gait feature analysis techniques to derive more information from CCTV videos.
Keywords: CCTV surveillance, human detection and identification, image processing, inter-process communication, security, suspicious detection
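The per-quadrant colour features described above (mean, mode and standard deviation per channel) can be sketched with NumPy. The 32-level histogram used for the mode is an assumption for illustration, not the paper's binning:

```python
import numpy as np

def quadrant_color_features(img):
    """Split an HxWx3 image into upper-left/upper-right/lower-left/lower-right
    quadrants and return per-quadrant mean, mode (peak of a coarse 32-bin
    per-channel histogram) and standard deviation of each colour channel."""
    h, w = img.shape[0] // 2, img.shape[1] // 2
    quads = [img[:h, :w], img[:h, w:], img[h:, :w], img[h:, w:]]
    feats = []
    for q in quads:
        px = q.reshape(-1, 3).astype(float)
        mean = px.mean(axis=0)
        std = px.std(axis=0)
        # mode: bin each channel into 32 levels of width 8, take the bin centre
        mode = np.array([np.bincount((px[:, c] // 8).astype(int),
                                     minlength=32).argmax() * 8 + 4
                         for c in range(3)])
        feats.append({"mean": mean, "mode": mode, "std": std})
    return feats

img = np.zeros((100, 100, 3), dtype=np.uint8)
img[:50, :50] = (200, 30, 30)     # red 'shirt' filling the upper-left quadrant
feats = quadrant_color_features(img)
```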
Procedia PDF Downloads 183
1873 Internal and External Overpressure Calculation for Vented Gas Explosion by Using a Combined Computational Fluid Dynamics Approach
Abstract:
Recent oil and gas accidents have reminded us of the severe consequences of gas explosions for structural damage and financial loss. In order to protect structures and personnel, engineers and researchers have been working on numerous explosion mitigation methods. Among these, venting is the most economical approach to mitigating gas explosion overpressure. In this paper, venting is used as the overpressure alleviation method. A theoretical method and a numerical technique are presented to predict the internal and external pressure from a vented gas explosion in a large enclosure. Under idealized conditions, a number of experiments are used to calibrate the accuracy of the theoretically calculated data, and good agreement between the theoretical results and experimental data is seen. However, for realistic scenarios, the theoretical method over-estimates internal pressures and is incapable of predicting external pressures. Therefore, a CFD simulation procedure is proposed in this study to estimate both the internal and external overpressure from a large-scale vented explosion. Satisfactory agreement between the CFD simulation results and experimental data is achieved.
Keywords: vented gas explosion, internal pressure, external pressure, CFD simulation, FLACS, ANSYS Fluent
Procedia PDF Downloads 161
1872 Modeling the Effect of Scale Deposition on Heat Transfer in Desalination Multi-Effect Distillation Evaporators
Authors: K. Bourouni, M. Chacha, T. Jaber, A. Tchantchane
Abstract:
In Multi-Effect Distillation (MED) desalination evaporators, the scale deposited on the outside of the tubes presents a barrier to heat transfer, reducing the global heat transfer coefficient and causing a decrease in water production; hence a loss of efficiency and an increase in operating and maintenance costs. Scale removal (by acid cleaning) is the main maintenance operation and constitutes the major reason for periodic plant shutdowns. A better understanding of scale deposition mechanisms will lead to an accurate determination of the variation of scale thickness around the tubes and improved accuracy of the overall heat transfer coefficient calculation. In this paper, a coupled heat transfer and calcium carbonate scale deposition model for a horizontal tube bundle is presented. The developed tool is used to determine the heat transfer area precisely, leading to a significant cost reduction for a given water production capacity. Simulations are carried out to investigate the influence of different parameters, such as water salinity and temperature, on the heat transfer.
Keywords: multi-effect evaporator, scale deposition, water desalination, heat transfer coefficient
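The role the scale layer plays in the overall heat transfer coefficient can be illustrated with a resistances-in-series sketch. This is a flat-wall approximation with assumed numerical values, not the paper's coupled deposition model:

```python
# Illustrative only: effect of a CaCO3 scale layer on the overall heat
# transfer coefficient U of an evaporator tube wall, treating the inside
# film, wall, scale deposit, and outside film as thermal resistances in
# series. All numbers below are assumed, not taken from the paper.

def overall_htc(h_inside, h_outside, wall_thickness, k_wall,
                scale_thickness=0.0, k_scale=2.2):
    """Overall heat transfer coefficient U [W/m^2.K] (flat-wall approximation)."""
    resistance = (1.0 / h_inside
                  + wall_thickness / k_wall
                  + scale_thickness / k_scale   # fouling resistance of the deposit
                  + 1.0 / h_outside)
    return 1.0 / resistance

u_clean = overall_htc(5000.0, 8000.0, 0.001, 16.0)
u_scaled = overall_htc(5000.0, 8000.0, 0.001, 16.0, scale_thickness=0.5e-3)
print(f"clean U = {u_clean:.0f} W/m2K, scaled U = {u_scaled:.0f} W/m2K")
```

Even a half-millimetre deposit with the low conductivity typical of calcium carbonate cuts U substantially, which is the loss of production the abstract describes.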
Procedia PDF Downloads 151
1871 Evaluating Forecasts Through Stochastic Loss Order
Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio
Abstract:
We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is the way it is customarily performed in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests and are robust to the correlation, autocorrelation, and heteroskedasticity settings they consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution rather than just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
Keywords: forecast evaluation, stochastic order, multiple comparison, non-parametric test
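The core idea of stochastic loss order can be sketched with empirical CDFs: forecast A's losses are stochastically smaller than B's if A's empirical CDF lies everywhere at or above B's. The numpy sketch below illustrates that ordering on synthetic data; it is not the paper's test statistic, and the loss samples are invented.

```python
# A hedged illustration of (first-order) stochastic loss order using
# empirical CDFs. The samples are synthetic; this is the ordering concept,
# not the paper's inferential procedure.
import numpy as np

def ecdf(sample, grid):
    """Empirical CDF of `sample` evaluated at each point of `grid`."""
    sample = np.sort(np.asarray(sample))
    return np.searchsorted(sample, grid, side="right") / len(sample)

def stochastically_smaller(losses_a, losses_b):
    """True if losses_a <=_st losses_b, i.e. ECDF of A >= ECDF of B everywhere."""
    grid = np.union1d(losses_a, losses_b)
    return bool(np.all(ecdf(losses_a, grid) >= ecdf(losses_b, grid)))

rng = np.random.default_rng(0)
small = rng.exponential(1.0, 500)   # hypothetical losses of the better forecast
large = small + 0.5                 # a pure location shift: strictly larger losses
print(stochastically_smaller(small, large))  # a shift implies stochastic order
```

Note that `ecdf` makes no same-sample-size assumption, matching the abstract's remark that the approach does not require samples of equal size.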
Procedia PDF Downloads 89
1870 Labview-Based System for Fiber Links Events Detection
Authors: Bo Liu, Qingshan Kong, Weiqing Huang
Abstract:
With the rapid development of modern communication, diagnosing fiber-optic quality and faults in real time has attracted wide attention. In this paper, a LabVIEW-based system is proposed for fiber-optic fault detection. The wavelet threshold denoising method combined with Empirical Mode Decomposition (EMD) is applied to denoise the optical time domain reflectometer (OTDR) signal. A method based on the Gabor representation is then used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average, compared with using the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy. The maximum detectable fiber length of the proposed LabVIEW-based system is 65 km.
Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising
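Wavelet threshold denoising can be illustrated in miniature with a one-level Haar transform and soft thresholding. The paper additionally combines thresholding with EMD and works on real OTDR traces; the step-like synthetic trace, noise level, and threshold below are assumptions for demonstration only.

```python
# A simplified sketch of wavelet threshold denoising: one-level Haar
# decomposition, soft-threshold the detail coefficients, reconstruct.
# Synthetic data; not the paper's EMD-augmented pipeline.
import numpy as np

def soft_threshold(coeffs, thr):
    """Shrink coefficients toward zero by thr (soft thresholding)."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def haar_denoise(signal, thr):
    """One-level Haar transform, threshold the detail band, invert."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    detail = soft_threshold(detail, thr)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.0], 200)            # step-like, OTDR-style trace
noisy = clean + rng.normal(0.0, 0.1, clean.size)
denoised = haar_denoise(noisy, thr=0.2)            # threshold ~2x the noise std
```

On a piecewise-constant trace the detail band is almost pure noise, so thresholding it raises the SNR, which is the effect the abstract quantifies at 1.34 dB on average.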
Procedia PDF Downloads 123
1869 DGA Data Interpretation Using Extension Theory for Power Transformer Diagnostics
Authors: O. P. Rahi, Manoj Kumar
Abstract:
Power transformers are essential and expensive equipment in electrical power systems. Dissolved gas analysis (DGA) is one of the most useful techniques to detect incipient faults in power transformers. However, identifying the fault by conventional methods is not always an easy task due to the variability of gas data and operational variables. In this paper, an extension theory based power transformer fault diagnosis method is presented. Extension theory tries to solve contradiction and incompatibility problems. This paper first briefly introduces the basic concept of matter-element theory and establishes the matter-element models for the three-ratio method, and then briefly discusses extension set theory. Detailed analysis is carried out on the extended relation function (ERF) adopted in this paper for transformer fault diagnosis, and the detailed diagnosis steps are given. Simulation proves that the proposed method can overcome drawbacks of the conventional three-ratio method, such as codes that match no fault class and failure to diagnose multiple faults, and it enhances diagnosis accuracy.
Keywords: DGA, extension theory, ERF, fault diagnosis, power transformers, fuzzy logic
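The extended relation function at the heart of such methods can be sketched as follows: for a value x, a classical interval X0 = [a, b] and a wider neighborhood interval X, the extension distance and relation degree are computed as below. The gas-ratio intervals in the example are invented placeholders, not the paper's fault classes.

```python
# A hedged sketch of the extension-theory relation (correlation) function.
# Interval values are hypothetical; the paper's matter-element models for
# the three-ratio method are not reproduced here.

def ext_distance(x, interval):
    """Extension distance rho(x, [a, b])."""
    a, b = interval
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def relation_degree(x, classical, neighborhood):
    """Extended relation function K(x); K > 0 means x belongs to the class."""
    rho0 = ext_distance(x, classical)
    if rho0 <= 0:                      # x lies inside the classical interval X0
        a, b = classical
        return -rho0 / (b - a)
    rho1 = ext_distance(x, neighborhood)
    return rho0 / (rho1 - rho0)

# Hypothetical gas-ratio interval for one fault class, inside a wider domain:
print(relation_degree(0.15, (0.1, 0.3), (0.0, 1.0)))   # inside  -> positive
print(relation_degree(0.80, (0.1, 0.3), (0.0, 1.0)))   # outside -> negative
```

Because K(x) varies continuously and can be negative, a ratio that matches no class under the crisp three-ratio coding still gets a graded degree for every class, which is how this family of methods avoids the "no matching code" failure.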
Procedia PDF Downloads 412
1868 The Modeling and Effectiveness Evaluation for Vessel Evasion to Acoustic Homing Torpedo
Authors: Li Minghui, Min Shaorong, Zhang Jun
Abstract:
This paper studies the operational efficiency of a surface warship's motorized evasion of an acoustic homing torpedo. It develops, in turn, a trajectory model, a self-guidance detection model, a vessel evasion model, and an anti-torpedo error model in three-dimensional space, making up for the deficiency of previous research that analyzed confrontation models only two-dimensionally. Then, using the Monte Carlo method, the confrontation process of evasion is simulated in the MATLAB environment. Finally, the main factors determining the vessel's survival probability are analyzed quantitatively. The results show that the evasion relative bearing and speed affect the vessel's survival probability significantly. Thus, choosing an appropriate evasion relative bearing and speed according to the torpedo's alarm range and alarm relative bearing, improving the alarm range and positioning accuracy, and reducing the response time against the torpedo will improve the vessel's survival probability significantly.
Keywords: acoustic homing torpedo, vessel evasion, Monte Carlo method, torpedo defense, vessel's survival probability
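The Monte Carlo evaluation idea can be sketched in miniature: draw random alarm ranges, speeds, and reaction times, apply an escape criterion, and estimate survival probability as the fraction of surviving runs. The kinematic rule and every numerical value below are placeholders, far simpler than the paper's three-dimensional trajectory and self-guidance models.

```python
# A toy Monte Carlo sketch of survival-probability estimation. The escape
# criterion is a hypothetical straight-line tail chase, not the paper's model.
import random

def survives(alarm_range_m, vessel_speed_ms, torpedo_speed_ms,
             reaction_time_s, torpedo_run_m=8000.0):
    """Placeholder criterion: the vessel escapes if the torpedo exhausts its
    run distance before closing the (reaction-delayed) gap."""
    closing = torpedo_speed_ms - vessel_speed_ms
    if closing <= 0:
        return True
    # torpedo closes at full speed while the vessel is still reacting
    gap = alarm_range_m - torpedo_speed_ms * reaction_time_s
    if gap <= 0:
        return False
    time_to_catch = gap / closing
    total_run = torpedo_speed_ms * (reaction_time_s + time_to_catch)
    return total_run > torpedo_run_m

def survival_probability(n_runs=20000, seed=42):
    """Fraction of random engagements the vessel survives."""
    rng = random.Random(seed)
    wins = sum(
        survives(alarm_range_m=rng.uniform(2000.0, 6000.0),
                 vessel_speed_ms=rng.uniform(10.0, 15.0),
                 torpedo_speed_ms=25.0,
                 reaction_time_s=rng.uniform(10.0, 60.0))
        for _ in range(n_runs)
    )
    return wins / n_runs

p = survival_probability()
```

Even this crude sketch reproduces the qualitative conclusions of the abstract: a longer alarm range and a shorter reaction time both enlarge the surviving region of the parameter space.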
Procedia PDF Downloads 455
1867 A Dynamic Approach for Evaluating the Climate Change Risks on Building Performance
Authors: X. Lu, T. Lu, S. Javadi
Abstract:
A simple dynamic approach is presented for analyzing the thermal and moisture dynamics of buildings, which is of particular relevance to understanding climate change impacts on buildings, including the assessment of risks and the application of resilience strategies. To demonstrate the proposed modeling methodology, verify the model, and show that wooden materials provide a mechanism that can facilitate the reduction of moisture risks and be more resilient to global warming, a wooden church equipped with high-precision measurement systems was taken as a test building for full-scale time-series measurements. Sensitivity analyses indicate a high degree of accuracy in the model's prediction of the indoor environment. The model is then applied to a future projection of the indoor climate, aiming to identify the significant environmental factors, the changing temperature and humidity, and effective responses to climate change impacts. The paper suggests that wooden building materials offer an effective and resilient response to anticipated future climate changes.
Keywords: dynamic model, forecast, climate change impact, wooden structure, buildings
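The kind of simple dynamic balance such an approach rests on can be sketched as a lumped (RC) model: indoor temperature driven by outdoor temperature through an overall loss coefficient, buffered by interior thermal mass. The parameter values are invented, and the paper's model also couples moisture transport, which is omitted here.

```python
# A minimal lumped-capacitance sketch: C dT/dt = UA * (T_out - T),
# marched with explicit Euler. All parameters are assumed values for
# illustration, not the paper's calibrated church model.

def simulate_indoor(t_out, t0=18.0, ua=250.0, capacity=5.0e7, dt=3600.0):
    """Hourly indoor temperature series for a list of outdoor temperatures.

    ua       -- overall loss coefficient [W/K] (assumed)
    capacity -- interior thermal mass [J/K] (assumed)
    """
    temps = [t0]
    for t_outdoor in t_out:
        t = temps[-1]
        temps.append(t + dt * ua * (t_outdoor - t) / capacity)
    return temps

series = simulate_indoor([5.0] * 48)   # 48 h of constant 5 degC outdoors
```

The large capacity-to-UA ratio makes the indoor temperature relax slowly toward the outdoor value, the damping behaviour that full-scale time-series measurements are used to calibrate.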
Procedia PDF Downloads 151
1866 Information System Development for Online Journal System Using Online Journal System for Journal Management of Suan Sunandha Rajabhat University
Authors: Anuphan Suttimarn, Natcha Wattanaprapa, Suwaree Yordchim
Abstract:
The aim of this study is to develop an online journal system using a web application to manage the journal service of Suan Sunandha Rajabhat University and thereby improve the university's journal management. The main structures of the system process consist of: 1. a journal content management system; 2. a journal membership system; and 3. an online submission and review process. The investigators developed the system as a web application using the open-source OJS software and phpMyAdmin to manage the research database. The system test showed that this Online Journal System (OJS) could shorten the time from article submission to the journal and helped manage journal procedures efficiently and accurately. The quality evaluation of the Suan Sunandha Rajabhat online journal system (SSRUOJS), undertaken by experts and researchers in five aspects (design, usability, security, time reduction, and accuracy), showed the highest average value (X=4.30) on the aspect of time reduction. Meanwhile, the system efficiency evaluation was at an excellent level (X=4.13).
Keywords: online journal system, journal management, information system development, OJS
Procedia PDF Downloads 175
1865 Comparison between Hardy-Cross Method and Water Software to Solve a Pipe Networking Design Problem for a Small Town
Authors: Ahmed Emad Ahmed, Zeyad Ahmed Hussein, Mohamed Salama Afifi, Ahmed Mohammed Eid
Abstract:
Water is of great importance to life. In order to deliver water from its sources to users, many procedures must be undertaken by water engineers, one of the main ones being the design of pressurized pipe networks. The main aim of this work is to calculate the water demand of a small town and then design a simple water network to distribute the water among the town with the smallest losses. The literature review covers the main points related to water distribution. The methodology introduces two approaches to solve the research problem: one using the iterative Hardy-Cross method and the other using the water software Pipe Flow. The results present two designs satisfying the same research requirements. Finally, the researchers conclude that the use of water software provides more capabilities and options for water engineers.
Keywords: looping pipe networks, Hardy-Cross networks accuracy, relative error of Hardy-Cross method
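The iterative Hardy-Cross method mentioned above can be sketched for a single loop: assume flows, compute the head-loss imbalance around the loop, and apply the correction dQ = -sum(K*Q*|Q|^(n-1)) / sum(n*K*|Q|^(n-1)) until it balances. The pipe K coefficients and initial flows below are assumed values for illustration, not the town network of the paper.

```python
# A minimal single-loop Hardy-Cross sketch with head loss h = K*Q*|Q|^(n-1)
# (n = 2 for a Darcy-Weisbach-style law). Clockwise flows are positive.
# Pipe data are hypothetical.

def hardy_cross_loop(k_vals, flows, n=2.0, tol=1e-8, max_iter=100):
    """Iteratively correct the signed loop flows until head losses balance."""
    flows = list(flows)
    for _ in range(max_iter):
        head = sum(k * q * abs(q) ** (n - 1) for k, q in zip(k_vals, flows))
        grad = sum(n * k * abs(q) ** (n - 1) for k, q in zip(k_vals, flows))
        dq = -head / grad          # loop correction applied to every pipe
        flows = [q + dq for q in flows]
        if abs(dq) < tol:
            break
    return flows

# Two parallel pipes forming one loop: 1.0 m3/s in, split as assumed 0.6/0.4.
balanced = hardy_cross_loop([1.0, 2.0], [0.6, -0.4])
```

At convergence the two head losses are equal (K1*Q1^2 = K2*Q2^2) while the total throughflow is preserved, which is exactly the balance condition the method enforces loop by loop in a full network.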
Procedia PDF Downloads 165
1864 Cost-Effective Indoor-Air Quality (IAQ) Monitoring via Cavity Enhanced Photoacoustic Technology
Authors: Jifang Tao, Fei Gao, Hong Cai, Yuan Jin Zheng, Yuan Dong Gu
Abstract:
Photoacoustic technology measures the effective absorption of light by means of acoustic detection, which provides a highly sensitive, low cross-response, cost-effective solution for gas molecule detection. In this paper, we propose an integrated photoacoustic sensor for indoor-air quality (IAQ) monitoring. The sensor consists of an acoustically resonant cavity, a high-sensitivity silicon acoustic transducer chip, and a low-cost light source. The light is modulated at the resonant frequency of the cavity to create enhanced periodic heating, resulting in an amplified acoustic pressure wave. The pressure is read out by a novel low-noise acoustic transducer. Based on this photoacoustic sensor, typical indoor gases, including CO2, CO, O2, and H2O, have been successfully detected, and their concentrations evaluated with very high accuracy. The sensor has wide potential applications in IAQ monitoring for agriculture, the food industry, and ventilation control systems used in public places such as schools, hospitals, and airports.
Keywords: indoor-air quality (IAQ) monitoring, photoacoustic gas sensor, cavity enhancement, integrated gas sensor
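The choice of modulation frequency can be illustrated with a textbook acoustic-resonance estimate. The cavity geometry below is an assumed example, not the paper's design; for an open-open (half-wave) cylindrical cavity the fundamental is f = c / (2L).

```python
# Illustrative only: fundamental acoustic resonance of an assumed half-wave
# cavity, the frequency at which the light source would be modulated.
SOUND_SPEED = 343.0  # m/s in air at ~20 degC

def resonant_frequency(length_m, mode=1):
    """Resonance of a half-wave cavity: f_m = m * c / (2 * L)."""
    return mode * SOUND_SPEED / (2.0 * length_m)

f0 = resonant_frequency(0.04)   # hypothetical 4 cm cavity -> ~4.3 kHz
```

Driving the heating at this frequency lets the standing pressure wave build up over many cycles, which is the cavity-enhancement effect the title refers to.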
Procedia PDF Downloads 658
1863 An Innovative Green Cooling Approach Using Peltier Chip in Milling Operation for Surface Roughness Improvement
Authors: Md. Anayet U. Patwari, Mohammad Ahsan Habib, Md. Tanzib Ehsan, Md Golam Ahnaf, Md. S. I. Chowdhury
Abstract:
Surface roughness is one of the key quality parameters of a finished product. During any machining operation, high temperatures are generated at the tool-chip interface, impairing the surface quality and dimensional accuracy of products. Cutting fluids are generally applied during machining to reduce the temperature at the tool-chip interface. However, the use of cutting fluids gives rise to problems such as waste disposal, pollution, high cost, and human health hazards. Researchers nowadays are opting for dry machining and other cooling techniques to minimize the use of coolants during machining while keeping the surface roughness of products within desirable limits. In this paper, a concept of using the Peltier cooling effect during aluminium milling is presented and adopted with the aim of improving the surface roughness of the machined surface. Experimental evidence shows that the Peltier cooling effect provides better surface roughness of the machined surface compared to dry machining.
Keywords: aluminium, milling operation, Peltier cooling effect, surface roughness
Procedia PDF Downloads 337
1862 Neural Network Approach for Solving Integral Equations
Authors: Bhavini Pandya
Abstract:
This paper considers Hη: T² → T², the perturbed Cerbelli-Giona map. This is a family of two-dimensional nonlinear area-preserving transformations on the torus T² = [0,1] × [0,1] = ℝ²/ℤ². A single parameter η varies between 0 and 1, taking the transformation from a hyperbolic toral automorphism to the Cerbelli-Giona map, a system known to exhibit multifractal properties. Here we study the multifractal properties of this family of maps. We apply a box-counting method, defining a grid of boxes Bi(δ), where i is the index and δ is the size of the boxes, to quantify the distribution of the stable and unstable manifolds of the map. When the parameter is in the ranges 0.51 < η < 0.58 and 0.68 < η < 1, the map is ergodic; i.e., the unstable and stable manifolds eventually cover the whole torus, although not in a uniform distribution. Accurate numerical results require a correspondingly accurate construction of the stable and unstable manifolds. Here we exploit the piecewise linearity of the map to achieve this, computing the endpoints of the line segments which define the global stable and unstable manifolds. This allows the generalized fractal dimension Dq and the spectrum of dimensions f(α) to be computed with accuracy. Finally, the intersection of the unstable and stable manifolds of the map is investigated and compared with the distribution of periodic points of the system.
Keywords: feed forward, gradient descent, neural network, integral equation
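The box-counting step described above can be sketched as follows: cover the unit square with boxes of size δ, count the occupied boxes N(δ), and estimate the dimension from the slope of log N(δ) against log(1/δ). The sketch is tested on points along a diagonal line segment (box-counting dimension 1) rather than on the map's manifolds, which are not reconstructed here.

```python
# A hedged numpy sketch of box counting on the unit square. The test set
# (a diagonal line segment) is a stand-in for the map's invariant manifolds.
import numpy as np

def box_count(points, delta):
    """Number of delta-boxes of the unit square containing >= 1 point."""
    idx = np.floor(np.asarray(points) / delta).astype(int)
    return len(set(map(tuple, idx)))

def box_dimension(points, deltas):
    """Least-squares slope of log N(delta) versus log(1/delta)."""
    counts = [box_count(points, d) for d in deltas]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(deltas)), np.log(counts), 1)
    return slope

t = np.linspace(0.0, 0.999, 20000)
diagonal = np.column_stack([t, t])      # a line segment inside the torus
dim = box_dimension(diagonal, [1/8, 1/16, 1/32, 1/64])
```

The same counting over a weighted partition function sum gives the generalized dimensions Dq; the q = 0 case reduces to the plain occupied-box count used here.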
Procedia PDF Downloads 189
1861 Numerical Investigation for External Strengthening of Dapped-End Beams
Authors: A. Abdel-Moniem, H. Madkour, K. Farah, A. Abdullah
Abstract:
The reduction in the depth of dapped-end beams near the supports tends to produce stress concentrations and hence shear cracks if the dapped end does not have adequate reinforcement detailing. This study numerically investigates the efficiency of applying different external strengthening techniques to the dapped ends of such beams. A two-dimensional finite element model was built to predict the structural behavior of dapped ends strengthened with different techniques. The techniques included external bonding of a steel angle at the re-entrant corner, unbonded bolt anchoring, external steel plate jacketing, exterior carbon fiber wrapping and/or strips, and external inclined steel plates. The FE analysis results are presented in terms of the ultimate load capacities, load-deflection curves, and crack patterns at failure. The results showed that the FE model, at various stages, was comparable to the available test data. Moreover, it enabled the failure progression to be captured with acceptable accuracy, which is very difficult to do in a laboratory test.
Keywords: dapped-end beams, finite element, shear failure, strengthening techniques, reinforced concrete, numerical investigation
Procedia PDF Downloads 117
1860 A Family of Second Derivative Methods for Numerical Integration of Stiff Initial Value Problems in Ordinary Differential Equations
Authors: Luke Ukpebor, C. E. Abhulimen
Abstract:
Stiff initial value problems in ordinary differential equations are problems whose typical solutions decay rapidly and exponentially, and their numerical investigation is very tedious. Conventional numerical integration solvers cannot cope effectively with stiff problems because they lack adequate stability characteristics. In this article, we develop a new family of four-step second-derivative exponentially fitted methods of order six for the numerical integration of stiff initial value problems of general first-order differential equations. In deriving our method, we break the general multi-derivative multistep method down into predictor and corrector schemes that possess free parameters allowing automatic fitting to exponential functions. The stability analysis of the method is discussed, and the method is implemented on numerical examples. The results show that the method is A-stable and competes favorably with existing methods in terms of efficiency and accuracy.
Keywords: A-stable, exponentially fitted, four-step, predictor-corrector, second derivative, stiff initial value problems
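The predictor-corrector structure the derivation above builds on can be illustrated with the simplest such pair: an explicit Euler predictor followed by trapezoidal corrector iterations. This generic sketch is not the paper's four-step, sixth-order, exponentially fitted second-derivative scheme; it only shows the predict-then-correct mechanism, on an assumed mildly stiff test problem.

```python
# A generic predictor-corrector illustration: Euler predictor, trapezoidal
# corrector applied by fixed-point iteration. Not the paper's method.
import math

def predictor_corrector(f, t0, y0, h, n_steps, corrector_iters=2):
    """March y' = f(t, y) from (t0, y0) with predict-then-correct steps."""
    t, y = t0, y0
    for _ in range(n_steps):
        y_pred = y + h * f(t, y)                       # predictor (explicit Euler)
        y_next = y_pred
        for _ in range(corrector_iters):               # corrector (trapezoidal rule)
            y_next = y + 0.5 * h * (f(t, y) + f(t + h, y_next))
        t, y = t + h, y_next
    return y

# Assumed test problem y' = -50*y, y(0) = 1, exact solution exp(-50*t):
y_end = predictor_corrector(lambda t, y: -50.0 * y, 0.0, 1.0, 0.001, 1000)
exact = math.exp(-50.0)
```

With a fixed-point corrector the step size is still restricted by stiffness, which is precisely why the paper fits the free parameters to exponentials and aims for an A-stable scheme instead.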
Procedia PDF Downloads 258