Search results for: validation process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16259


15839 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and applied in industrial processes, including multivariate statistical methods, representations in reduced spaces, and kernel-based nonlinear techniques. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are therefore difficult to collect. In this work, noise filtering steps are added to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data, and the results showed that monitoring performance improved significantly in terms of the detection success rate for process faults.
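The monitoring idea can be illustrated with a simplified, linear stand-in: the paper uses kernel-based nonlinear methods, but a plain PCA model with Hotelling's T² statistic shows the same fit-on-normal-data, flag-large-statistic workflow. The data below are hypothetical, not from the study.

```python
import numpy as np

def fit_pca_monitor(X_normal, n_components=1):
    """Fit a PCA monitoring model on normal operating data (rows = samples)."""
    mu = X_normal.mean(axis=0)
    sigma = X_normal.std(axis=0, ddof=1)
    Z = (X_normal - mu) / sigma                     # autoscale
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                         # principal loadings
    lam = S[:n_components] ** 2 / (len(Z) - 1)      # retained component variances
    return {"mu": mu, "sigma": sigma, "P": P, "lam": lam}

def t2_statistic(model, x):
    """Hotelling's T^2 for a new sample; a large value flags a possible fault."""
    z = (x - model["mu"]) / model["sigma"]
    scores = z @ model["P"]
    return float(np.sum(scores ** 2 / model["lam"]))

# Hypothetical normal data: five sensors driven by one latent batch factor
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
X_normal = latent @ np.ones((1, 5)) + 0.1 * rng.normal(size=(200, 5))
model = fit_pca_monitor(X_normal)
t2_fault = t2_statistic(model, model["mu"] + 6.0)   # gross shift in all sensors
```

In a kernel variant, the same statistics are computed on kernel principal component scores instead of linear ones.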

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 323
15838 Artificial Neural Network for Forecasting of Daily Reservoir Inflow: Case Study of the Kotmale Reservoir in Sri Lanka

Authors: E. U. Dampage, Ovindi D. Bandara, Vinushi S. Waraketiya, Samitha S. R. De Silva, Yasiru S. Gunarathne

Abstract:

The knowledge of water inflow figures is paramount in decision making on allocation for consumption for numerous purposes: irrigation, hydropower, domestic and industrial usage, and flood control. Understanding how reservoir inflows are affected by different climatic and hydrological conditions is crucial to enable effective water management and downstream flood control. In this research, we propose a method using a Long Short-Term Memory (LSTM) Artificial Neural Network (ANN) to assist the aforesaid decision-making process. The Kotmale reservoir, the uppermost reservoir in the Mahaweli reservoir complex in Sri Lanka, was used as the test bed for this research. The ANN uses the runoff in the Kotmale reservoir catchment area and the effect of Sea Surface Temperatures (SST) to make a forecast seven days ahead. Three types of ANN were tested: the Multi-Layer Perceptron (MLP), the Convolutional Neural Network (CNN), and the LSTM. Extensive field trials and validation found that the LSTM ANN provides superior performance in terms of accuracy and latency.
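The abstract does not detail how the two input series are framed for the network; the sketch below shows one plausible way to window daily runoff and SST series (both hypothetical here, as are the lookback and horizon values) into supervised samples for a seven-day-ahead forecaster such as the LSTM described.

```python
import numpy as np

def make_windows(runoff, sst, lookback=14, horizon=7):
    """Frame two daily series as supervised samples: `lookback` past days of
    (runoff, SST) as input, the inflow `horizon` days ahead as the target."""
    X, y = [], []
    for t in range(lookback, len(runoff) - horizon):
        past = np.stack([runoff[t - lookback:t], sst[t - lookback:t]], axis=1)
        X.append(past)
        y.append(runoff[t + horizon - 1])
    return np.array(X), np.array(y)

runoff = np.linspace(10.0, 20.0, 100)   # hypothetical daily inflow series
sst = np.linspace(26.0, 28.0, 100)      # hypothetical daily SST series
X, y = make_windows(runoff, sst)        # X: (samples, lookback, 2), y: (samples,)
```

Each `(lookback, 2)` sample is the shape of input an LSTM layer consumes directly; an MLP would flatten it first.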

Keywords: convolutional neural network, CNN, inflow, long short-term memory, LSTM, multi-layer perceptron, MLP, neural network

Procedia PDF Downloads 151
15837 Storage System Validation Study for Raw Cocoa Beans Using Minitab® 17 and R (R-3.3.1)

Authors: Anthony Oppong Kyekyeku, Sussana Antwi-Boasiako, Emmanuel De-Graft Johnson Owusu Ansah

Abstract:

In this observational study, the performance of a known conventional storage system was tested and evaluated for fitness for its intended purpose. The system has a scope extended to the storage of dry cocoa beans; its sensitivity, reproducibility, and uncertainties are not known in detail. This study discusses the system performance in the context of the existing literature on factors that influence the quality of cocoa beans during storage. Controlled conditions were defined precisely for the system to give a reliable baseline within specific established procedures. Minitab® 17 and the R statistical software (R-3.3.1) were used for the statistical analyses. The approach to the storage system testing was to observe and compare, through laboratory test methods, the quality of the cocoa bean samples before and after storage. The samples were kept in Kilner jars, and the temperature of the storage environment was controlled and monitored over a period of 408 days. Standard test methods used in the international cocoa trade, such as the cut test analysis, moisture determination with an Aqua boy KAM III model, and bean count determination, were used for quality assessment. The data analysis treated the entire population as the sample in order to establish a reliable baseline for the data collected. The study found a statistically significant mean value at the 95% confidence interval (CI) for the performance data analysed before and after storage for all variables observed. Correlational graphs showed a strong positive correlation for all variables investigated, with the exception of All Other Defects (AOD). The weak relationship between the before and after data for AOD had an explained variability of 51.8%, with the unexplained variability attributable to the uncontrolled condition of hidden infestation before storage. The current study concluded that the storage system met a high performance criterion.
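The "explained variability" reported for AOD is the squared Pearson correlation between the before- and after-storage data, expressed as a percentage; a minimal sketch with hypothetical paired readings:

```python
import numpy as np

def explained_variability(before, after):
    """Pearson r between paired before/after measurements, plus the explained
    variability (r^2 as a percent) used to judge the relationship strength."""
    r = np.corrcoef(before, after)[0, 1]
    return r, 100.0 * r ** 2

# Hypothetical paired readings for one quality variable (not the study's data)
before = np.array([4.1, 5.0, 6.2, 7.1, 8.0])
after = np.array([4.0, 5.2, 6.1, 7.3, 8.1])
r, ev = explained_variability(before, after)
```

An explained variability of 51.8%, as reported for AOD, corresponds to |r| of about 0.72.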

Keywords: benchmarking performance data, cocoa beans, hidden infestation, storage system validation

Procedia PDF Downloads 174
15836 External Validation of Risk Prediction Score for Candidemia in Critically Ill Patients: A Retrospective Observational Study

Authors: Nurul Mazni Abdullah, Saw Kian Cheah, Raha Abdul Rahman, Qurratu 'Aini Musthafa

Abstract:

Purpose: Candidemia is associated with high mortality in critically ill patients, and early prediction of candidemia is imperative for preemptive antifungal treatment. This study aimed to externally validate the candidemia risk prediction score of Jameran et al. (2021), which is based on the risk factors of acute kidney injury, renal replacement therapy, parenteral nutrition, and multifocal candida colonization. Methods: This single-center, retrospective observational study included all critically ill patients admitted to the intensive care unit (ICU) of a tertiary referral center from January 2018 to December 2023. The study evaluated the performance of the candidemia risk prediction score by analyzing the occurrence of candidemia within the study period. Patients' demographic characteristics, comorbidities, SOFA scores, and ICU outcomes were analyzed. Patients diagnosed with candidemia prior to ICU admission were excluded. Results: A total of 500 patients were analyzed, with two excluded due to incomplete data. Validation analysis showed that the candidemia risk prediction score has a sensitivity of 75.00% (95% CI: 59.66-86.81), a specificity of 65.35% (95% CI: 60.78-69.72), a positive predictive value of 17.28%, and a negative predictive value of 96.44%. The incidence of candidemia was 8.86%, with no significant differences in demographics or comorbidities except for higher SOFA scores in the candidemia group. The candidemia group showed significantly longer ICU and hospital lengths of stay and higher in-hospital mortality. Conclusion: This study concluded that the candidemia risk prediction score of Jameran et al. (2021) had good sensitivity and a high negative predictive value. Thus, the risk prediction score was validated for candidemia prediction in critically ill patients.
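The four validation metrics reported come straight from a standard 2×2 table. The counts below are illustrative reconstructions roughly consistent with the reported rates, not the study's actual table:

```python
def prediction_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table."""
    sens = tp / (tp + fn)   # true positives among all with candidemia
    spec = tn / (tn + fp)   # true negatives among all without candidemia
    ppv = tp / (tp + fp)    # candidemia among score-positive patients
    npv = tn / (tn + fn)    # no candidemia among score-negative patients
    return sens, spec, ppv, npv

# Illustrative counts only (approximated from the reported rates and n = 498)
sens, spec, ppv, npv = prediction_metrics(tp=33, fp=157, fn=11, tn=297)
```

The pattern of the results, modest PPV but high NPV at an incidence below 10%, is typical of screening scores applied to a low-prevalence outcome.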

Keywords: candidemia, intensive care, acute kidney injury, clinical prediction rule, incidence

Procedia PDF Downloads 7
15835 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under two ANN structural configurations: (1) a single-hidden-layer and (2) a double-hidden-layer feedforward backpropagation network. The results revealed that the GDM algorithm, with its adaptive learning capability, required a relatively shorter time in both the training and validation phases than the LM and Br algorithms, although learning was not always consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day-ahead and 5-day-ahead forecasts. In specific statistical terms, the average model performance efficiency by the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96% for the training and validation phases, respectively. However, on the basis of the relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, real-time forecasting with ANNs should employ training algorithms without the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure, forecast quality, and the mitigation of network overfitting. On the whole, it is recommended that evaluation should also consider the implications of (i) data quality and quantity and (ii) transfer functions for overall network forecast performance.
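The CE statistic quoted is the Nash-Sutcliffe coefficient of efficiency; together with the relative error statistics named (MAE, MAPE, MSRE), it can be sketched as follows, using made-up observed and simulated flows:

```python
import numpy as np

def forecast_stats(obs, sim):
    """Coefficient of efficiency (Nash-Sutcliffe) plus the relative error
    statistics named in the abstract: MAE, MAPE (%), and MSRE."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    ce = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
    mae = np.mean(np.abs(obs - sim))
    mape = 100.0 * np.mean(np.abs((obs - sim) / obs))
    msre = np.mean(((obs - sim) / obs) ** 2)
    return ce, mae, mape, msre

obs = np.array([10.0, 12.0, 15.0, 11.0, 9.0])   # hypothetical observed flows
sim = np.array([10.5, 11.5, 14.0, 11.5, 9.5])   # hypothetical simulated flows
ce, mae, mape, msre = forecast_stats(obs, sim)
```

CE = 1 is a perfect fit; CE near 0 means the model is no better than the observed mean, which is why the 94-96% validation figures reported indicate strong fits.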

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 152
15834 Computer Simulation Approach in the 3D Printing Operations of Surimi Paste

Authors: Timilehin Martins Oyinloye, Won Byong Yoon

Abstract:

Simulation technology is being adopted in many industries, with research focusing on the development of new ways in which technology becomes embedded within production, services, and society in general. 3D printing (3DP) technology is fast developing in the food industry; however, the limited processability of high-performance materials restricts the robustness of the process in some cases. Significantly, the printability of materials is the foundation of extrusion-based 3DP, with residual stress being a major challenge in the printing of complex geometry. In many situations, a trial-and-error method is used to determine the optimum printing conditions, which wastes time and resources. In this report, three moisture levels of surimi paste were investigated to identify the optimum 3DP material and printing conditions by probing the paste's rheology, its flow characteristics in the nozzle, and the post-deposition process using a finite element method (FEM) model. Rheological tests revealed that surimi paste with 82% moisture is suitable for 3DP. According to the FEM model, decreasing the nozzle diameter from 1.2 mm to 0.6 mm increased the die swell from 9.8% to 14.1%. The die swell ratio increased due to an increase in the pressure gradient (1.15 × 10⁷ Pa to 7.80 × 10⁷ Pa) at the nozzle exit. The nozzle diameter influenced the fluid properties, i.e., the shear rate, velocity, and pressure in the flow field, as well as the residual stress and the deformation of the printed sample, according to the FEM simulation. The post-printing stability of the model was investigated using the additive layer manufacturing (ALM) model. The ALM simulation revealed that the residual stress and total deformation of the sample depended on the nozzle diameter: a small nozzle diameter (0.6 mm) resulted in a greater total deformation (0.023), particularly at the top part of the model, which eventually caused the sample to collapse. As the nozzle diameter increased, the accuracy of the model improved up to the optimum nozzle size (1.0 mm). Validation with 3D-printed surimi products confirmed that the nozzle diameter is a key parameter affecting the geometric accuracy of 3DP of surimi paste.
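The die swell figures quoted are percentage diameter increases of the extrudate over the nozzle exit; a minimal calculation (the extrudate diameters below are chosen to reproduce the reported percentages, not taken from the FEM output):

```python
def die_swell_percent(extrudate_diameter, nozzle_diameter):
    """Die swell as the percentage diameter increase over the nozzle exit."""
    return 100.0 * (extrudate_diameter - nozzle_diameter) / nozzle_diameter

# Diameters in mm, picked to match the trend reported above
small_nozzle = die_swell_percent(0.6846, 0.6)   # ~14.1% at the 0.6 mm nozzle
large_nozzle = die_swell_percent(1.3176, 1.2)   # ~9.8% at the 1.2 mm nozzle
```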

Keywords: 3D printing, deformation analysis, die swell, numerical simulation, surimi paste

Procedia PDF Downloads 67
15833 Information Technology for Business Process Management in Insurance Companies

Authors: Vesna Bosilj Vukšić, Darija Ivandić Vidović, Ljubica Milanović Glavan

Abstract:

Information technology plays an irreplaceable role in introducing and improving business process orientation in a company. It enables implementation of the theoretical concept, measurement of results achieved and undertaking corrective measures aimed at improvements. Information technology is a key concept in the development and implementation of the business process management systems as it establishes a connection to business operations. Both in the literature and practice, insurance companies are often seen as highly process oriented due to the nature of their business and focus on customers. They are also considered leaders in using information technology for business process management. The research conducted aimed to investigate whether the perceived leadership status of insurance companies is well deserved, i.e. to establish the level of process orientation and explore the practice of information technology use in insurance companies in the region. The main instrument for primary data collection within this research was an electronic survey questionnaire sent to the management of insurance companies in the Republic of Croatia, Bosnia and Herzegovina, Slovenia, Serbia and Macedonia. The conducted research has shown that insurance companies have a satisfactory level of process orientation, but that there is also a huge potential for improvement, especially in the segment of information technology and its connection to business processes.

Keywords: business processes management, process orientation, information technology, insurance companies

Procedia PDF Downloads 381
15832 Design and Validation of Cutting Performance of Ceramic Matrix Composites Using FEM Simulations

Authors: Zohaib Ellahi, Guolong Zhao

Abstract:

Ceramic matrix composite (CMC) materials possess high strength, wear resistance, and anisotropy; machining them is therefore very difficult and costly. In this research, FEM simulations and physical experiments were carried out to assess the machinability of carbon fiber reinforced silicon carbide (C/SiC) using a polycrystalline diamond (PCD) tool in a slot milling process. A finite element model was generated in the Abaqus/CAE software, and the milling operation was performed using a user-defined material subroutine. The effect of different milling parameters on cutting forces and stresses was calculated through FEM simulations and compared with experimental results to validate the finite element model. Cutting forces in the x and y directions were obtained from both the experiments and the finite element model, and good agreement was found between them. The resultant cutting forces decreased with increasing cutting speed and increased with increasing feed per tooth and depth of cut. When machining was performed along the fiber direction, the stresses generated near the tool edge were at a minimum and increased with the fiber cutting angle.
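The resultant cutting force referred to above is simply the vector magnitude of the measured x and y components; with hypothetical force readings:

```python
import math

def resultant_force(fx, fy):
    """Resultant in-plane cutting force from the x and y components (N)."""
    return math.hypot(fx, fy)

# Hypothetical measured components from one milling pass, in newtons
fr = resultant_force(30.0, 40.0)
```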

Keywords: experimental & numerical investigation, C/SiC cutting performance analysis, milling of CMCs, CMC composite stress analysis

Procedia PDF Downloads 86
15831 The Energy Efficient Water Reuse by Combination of Nano-Filtration and Capacitive Deionization Processes

Authors: Youngmin Kim, Jae-Hwan Ahn, Seog-Ku Kim, Hye-Cheol Oh, Bokjin Lee, Hee-Jun Kang

Abstract:

High-energy-consuming processes such as advanced oxidation and reverse osmosis are commonly used for water reuse. This study aims at developing an energy-efficient reuse process combining nanofiltration (NF) and capacitive deionization (CDI). Lab-scale experiments were conducted using effluents from a wastewater treatment plant located in Koyang city, Korea. A commercial NF membrane (NE4040-70, Toray Ltd.) and a CDI module (E40, Siontech Inc.) were tested in series. Pollutant removal efficiencies were evaluated on the basis of the Korean water quality criteria for water reuse, and the energy consumption was also calculated. As a result, the hybrid process showed lower energy consumption than a conventional reverse osmosis process while its effluent still met the Korean standard. Consequently, this study suggests that the hybrid process is feasible for energy-efficient water reuse.

Keywords: capacitive deionization, energy efficient process, nanofiltration, water reuse

Procedia PDF Downloads 182
15830 Stealth Laser Dicing Process Improvement via Shuffled Frog Leaping Algorithm

Authors: Pongchanun Luangpaiboon, Wanwisa Sarasang

Abstract:

In this paper, the performance of the shuffled frog leaping algorithm was investigated on the stealth laser dicing process. The problem concerns the tolerance of the meandering data: the customer specification requires less than five microns, with a target of zero microns, and the current meandering levels are unsatisfactory compared to this specification. Firstly, a two-level factorial design was applied to preliminarily identify the statistically significant effects of five process variables, one of which is integer-valued. From the experimental results, the new operating condition found by the algorithm was superior to the current manufacturing condition.
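The shuffled frog leaping algorithm itself is not detailed in the abstract; below is a minimal sketch of the standard scheme (sort frogs by fitness, deal them into memeplexes, leap each memeplex's worst frog toward its best, shuffle, repeat), demonstrated on a toy sphere function rather than the actual meandering response surface. All parameter values are illustrative.

```python
import numpy as np

def sfla(objective, bounds, n_memeplexes=4, frogs_per_memeplex=5,
         memetic_steps=5, shuffles=30, seed=1):
    """Minimal shuffled frog leaping algorithm for minimisation."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    n = n_memeplexes * frogs_per_memeplex
    frogs = rng.uniform(lo, hi, size=(n, len(lo)))   # random initial population
    for _ in range(shuffles):
        order = np.argsort([objective(f) for f in frogs])
        frogs = frogs[order]                         # best frog first
        best_global = frogs[0].copy()
        for m in range(n_memeplexes):
            idx = np.arange(m, n, n_memeplexes)      # deal frogs round-robin
            plex = frogs[idx]
            for _ in range(memetic_steps):
                fit = [objective(f) for f in plex]
                worst, best = np.argmax(fit), np.argmin(fit)
                # leap the worst frog toward the memeplex best
                step = rng.uniform(0, 1) * (plex[best] - plex[worst])
                trial = np.clip(plex[worst] + step, lo, hi)
                if objective(trial) >= fit[worst]:   # no gain: leap to global best
                    step = rng.uniform(0, 1) * (best_global - plex[worst])
                    trial = np.clip(plex[worst] + step, lo, hi)
                if objective(trial) >= fit[worst]:   # still no gain: random reset
                    trial = rng.uniform(lo, hi)
                plex[worst] = trial
            frogs[idx] = plex                        # shuffle memeplex back
    best = min(frogs, key=objective)
    return best, objective(best)

# Toy stand-in for the dicing response: minimise a 2-variable sphere function
best, best_val = sfla(lambda x: float(np.sum(x ** 2)),
                      bounds=([-5.0, -5.0], [5.0, 5.0]))
```

In the actual application, `objective` would map the five process variables to the measured meandering level, with the integer variable rounded before evaluation.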

Keywords: stealth laser dicing process, meandering, meta-heuristics, shuffled frog leaping algorithm

Procedia PDF Downloads 341
15829 A Gamification Teaching Method for Software Measurement Process

Authors: Lennon Furtado, Sandro Oliveira

Abstract:

The importance of an effective measurement program lies in the ability to control and predict what can be measured. Thus, a measurement program provides a basis for decision-making in support of the interests of an organization. An effective measurement program is therefore only possible with a team of software engineers well trained in measurement. However, the literature indicates that few computer science courses include the software measurement process in their programs, and even those that do generally present only basic theoretical concepts of the process, with little or no measurement practice, which results in a lack of student motivation to learn the measurement process. In this context, according to some experts in software process improvement, one of the most widely used approaches to maintaining motivation and commitment to a software process improvement program is gamification. Therefore, this paper presents a proposal for teaching the measurement process through gamification, which seeks to improve student motivation and performance in the assimilation of tasks related to software measurement by incorporating game elements into the practice of the measurement process, making it more attractive for learning. To validate the proposal, a comparison will be made between two distinct groups of 20 students from a Software Quality class: a control group, whose students will not use the gamification proposal to learn the software measurement process, and an experiment group, whose students will. The paper will analyze the objective and subjective results for each group: the objective result is the grade each student achieves at the end of the course, and the subjective results come from a post-course questionnaire on each student's opinion of the teaching method. Finally, this paper aims to confirm or refute the following hypothesis: the gamification proposal to teach the software measurement process adequately motivates students and equips them with the competence necessary for the practical application of the measurement process.

Keywords: education, gamification, software measurement process, software engineering

Procedia PDF Downloads 314
15828 Review on Optimization of Drinking Water Treatment Process

Authors: M. Farhaoui, M. Derraz

Abstract:

In drinking water treatment processes, the optimization of the treatment is an issue of particular concern. In general, the process consists of many units, such as settling, coagulation, flocculation, sedimentation, filtration, and disinfection. The optimization of the process consists of measures to decrease managing and monitoring expenses and improve the quality of the produced water. The objective of this study is to provide water treatment operators with methods and practices that enable the most effective use of the facility and, in consequence, optimize the cubic meter price of the treated water. This paper proposes a review of the optimization of the drinking water treatment process by analyzing all of the water treatment units and gives some solutions to maximize water treatment performance without compromising water quality standards. Some of these solutions and methods were implemented at the water treatment plant located in the middle of Morocco (Meknes).

Keywords: coagulation process, optimization, turbidity removal, water treatment

Procedia PDF Downloads 422
15827 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time

Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla

Abstract:

Society demands more reliable manufacturing processes capable of producing high-quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, among which Fault-Tolerant Control (FTC) plays a significant role: it is able to detect, isolate, and adapt a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is given, highlighting the properties a system must ensure to be considered fault-tolerant. In addition, the main FTC techniques are identified and classified by their characteristics into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprises the algorithms robust enough to bypass the fault without further modifications. The mentioned re-configuration requires two stages: one focused on the detection, isolation, and identification of the fault source, and another in charge of re-designing the control algorithm through two approaches, fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows testing the FTC algorithms and analysing how the system responds when a fault arises under conditions similar to those a machine experiences on the factory floor. One AFTC approach has been selected as the methodology the system follows in the fault recovery process. In a first instance, the fault is detected, isolated, and identified by means of a neural network. In a second instance, the control algorithm is re-configured to overcome the fault and continue working without human interaction.
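The detection stage described (a neural network in the paper) can be caricatured by its simplest form, residual thresholding against a plant model; the signals and threshold below are hypothetical:

```python
import numpy as np

def detect_fault(measured, predicted, threshold):
    """Flag a fault wherever the residual between the plant measurement and
    the model prediction exceeds a fixed threshold (a simplified stand-in
    for the neural-network detector described above)."""
    residual = np.abs(np.asarray(measured) - np.asarray(predicted))
    return residual > threshold

# Hypothetical hydraulic pressure trace (bar): the last two samples drift
pressure_meas = np.array([100.0, 101.0, 99.5, 120.0, 121.0])
pressure_pred = np.array([100.0, 100.5, 100.0, 100.5, 100.0])
flags = detect_fault(pressure_meas, pressure_pred, threshold=5.0)
```

Once a flag is raised, an AFTC would then isolate the source and re-configure the controller, whereas a PFTC design would need no reaction at all.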

Keywords: fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time

Procedia PDF Downloads 177
15826 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications involving human experiments are frequently limited by reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a few trials. To address this problem, several resampling approaches are often used during the data preparation phase, a critical step in any data science analysis process. One naive approach usually applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady-State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
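The leakage mechanism described, scaling the entire database before splitting, can be demonstrated in a few lines with synthetic data standing in for EEG features:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(5.0, 2.0, size=(100, 3))      # stand-in for EEG feature vectors
train, test = X[:80], X[80:]

# Leaky: scaling statistics computed on the FULL data set before the split,
# so the "unseen" test samples have influenced the transformation
mu_all, sd_all = X.mean(axis=0), X.std(axis=0)
test_leaky = (test - mu_all) / sd_all

# Correct: statistics computed on the training split only
mu_tr, sd_tr = train.mean(axis=0), train.std(axis=0)
test_proper = (test - mu_tr) / sd_tr

# The two scaled test sets differ: the leaky version has peeked at test data
gap = np.abs(test_leaky - test_proper).max()
```

With only tens of trials per subject, as is typical in BCI, this gap is large enough to inflate cross-validated accuracy noticeably.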

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 153
15825 A Unique Exact Approach to Handle a Time-Delayed State-Space System: The Extraction of Juice Process

Authors: Mohamed T. Faheem Saidahmed, Ahmed M. Attiya Ibrahim, Basma GH. Elkilany

Abstract:

This paper discusses the application of a Time Delay Control (TDC) compensation technique to the juice extraction process in a sugar mill. The objective is to improve the control performance of the process and increase extraction efficiency. The paper presents the mathematical model of the juice extraction process and the design of the TDC compensation controller. Simulation results show that the TDC compensation technique can effectively suppress the time-delay effect in the process and improve control performance. Extraction efficiency is also significantly increased with the application of the TDC compensation technique. The proposed approach, simulated in MATLAB, provides a practical solution for improving the juice extraction process in sugar mills.
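The paper's TDC design is not reproduced here; a generic Smith predictor sketch (PI control of a first-order-plus-dead-time plant with a perfect internal model, all parameter values invented) illustrates the delay-compensation idea in a few lines:

```python
import numpy as np

def simulate_smith_pi(kp=1.0, ki=0.2, gain=1.0, tau=5.0, delay=2.0,
                      dt=0.05, t_end=60.0, setpoint=1.0):
    """Discrete simulation of a PI controller with a Smith predictor on a
    first-order-plus-dead-time plant (perfect model assumed)."""
    n_delay = int(round(delay / dt))
    buf = [0.0] * n_delay           # transport-delay buffer for the plant
    buf_m = [0.0] * n_delay         # the same delay inside the internal model
    x = xm = integral = 0.0         # plant state, model state, PI integrator
    y_hist = []
    for _ in range(int(t_end / dt)):
        y = buf[0]                  # delayed plant output
        ym_delayed = buf_m[0]
        # Smith predictor: feed back y plus the model's undelayed prediction,
        # so the controller effectively sees a delay-free plant
        y_fb = y + (xm - ym_delayed)
        error = setpoint - y_fb
        integral += error * dt
        u = kp * error + ki * integral
        # first-order plant and model dynamics (forward Euler)
        x += dt / tau * (gain * u - x)
        xm += dt / tau * (gain * u - xm)
        buf = buf[1:] + [x]
        buf_m = buf_m[1:] + [xm]
        y_hist.append(y)
    return np.array(y_hist)

y = simulate_smith_pi()
```

With a perfect model, the delay disappears from the loop and the PI gains can be tuned as if the extraction process had no dead time.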

Keywords: time delay control (TDC), exact and unique state space model, delay compensation, Smith predictor

Procedia PDF Downloads 92
15824 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decrease of global fossil fuel deposits and the environmental air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding, costly, and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy was studied; the study was performed in an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module of the iC IR 7.0 software. Fifteen samples of known concentrations, taken in duplicate, were used for model calibration and cross-validation; the data were pre-processed using mean centering and variance scaling, a spectrum math square root transform, and solvent subtraction. These pre-processing methods improved the performance indexes from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72, and 0.9416 to 0.9999 for RMSEC, RMSECV, RMSEP, and cumulative R², respectively. The R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model built. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two were quite close at concentrations above 18%. The software eliminated the complexity of the Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and laboratory scale.
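Two of the building blocks named, mean centering with variance scaling and the RMSE-family indexes (RMSEC/RMSECV/RMSEP), can be sketched directly; the arrays below are placeholders, not the study's spectra:

```python
import numpy as np

def autoscale(X):
    """Mean centering plus variance scaling, the pre-processing named above."""
    mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
    return (X - mu) / sd, mu, sd

def rmse(y_true, y_pred):
    """Root-mean-square error; applied to calibration, cross-validation or
    prediction residuals it gives RMSEC, RMSECV or RMSEP respectively."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Placeholder "spectral" matrix and concentration predictions
X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0], [4.0, 40.0]])
Z, mu, sd = autoscale(X)
err = rmse([10.0, 18.0, 25.0], [11.0, 17.0, 25.0])
```

In the workflow described, the autoscaled spectra would feed a PLS regression, and the three RMSE variants would be computed on the calibration, cross-validation, and independent prediction sets.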

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 148
15823 Translation and Validation of the Thai Version of the Japanese Sleep Questionnaire for Preschoolers

Authors: Natcha Lueangapapong, Chariya Chuthapisith, Lunliya Thampratankul

Abstract:

Background: There is a need for an appropriate tool to help healthcare providers determine sleep problems in children for early diagnosis and management. The Japanese Sleep Questionnaire for Preschoolers (JSQ-P) is a parent-reported sleep questionnaire that has good psychometric properties and can be used in the context of Asian culture, which makes it likely suitable for Thai children. Objectives: This study aimed to translate and validate the Japanese Sleep Questionnaire for Preschoolers (JSQ-P) into a Thai version and to evaluate factors associated with sleep disorders in preschoolers. Methods: After approval by the original developer, the cross-cultural adaptation process of the JSQ-P was performed, including forward translation, reconciliation, backward translation, and final approval of the Thai version of the JSQ-P (TH-JSQ-P) by the original creator. The study was conducted between March 2021 and February 2022. The TH-JSQ-P was completed by 2,613 guardians of children aged 2-6 years, twice within 10-14 days, to assess its reliability and validity. Content validity was measured by the index of item-objective congruence (IOC) and a content validity index (CVI). Face validity, content validity, structural validity, construct validity (discriminant validity), criterion validity, and predictive validity were assessed. The sensitivity and specificity of the TH-JSQ-P were also measured, using the total JSQ-P score cutoff point of 84 recommended by the original JSQ-P as well as each subscale score, among clinical samples with obstructive sleep apnea syndrome. Results: Internal consistency reliability, evaluated by Cronbach's α coefficient, was acceptable for all subscales of the JSQ-P. The questionnaire also had good test-retest reliability, as the intraclass correlation coefficient (ICC) for all items ranged between 0.42 and 0.84. The content validity was acceptable. For structural validity, our results indicated that the final factor solution for the TH-JSQ-P was comparable to that of the original JSQ-P. For construct validity, age group was one of the clinical parameters associated with some sleep problems: parasomnias, insomnia, excessive daytime sleepiness, and sleep-habit problems significantly decreased as the children got older, while insufficient sleep significantly increased with age. For criterion validity, all subscales showed a correlation with the Epworth Sleepiness Scale (r = -0.049 to 0.349). For predictive validity, the Epworth Sleepiness Scale was a significant, strong factor correlated with sleep problems in all subscales of the JSQ-P except the sleep-habit subscale. The sensitivity and specificity of the total JSQ-P score were 0.72 and 0.66, respectively. Conclusion: The Thai version of the JSQ-P has good internal consistency reliability and test-retest reliability. It passed six validity tests and can be used to evaluate sleep problems in preschool children in Thailand. Furthermore, it has satisfactory general psychometric properties and good reliability and validity. The data collected in examining the sensitivity of the Thai version revealed that the JSQ-P could detect differences in sleep problems among children with obstructive sleep apnea syndrome. This confirmed that the measure is sensitive and can be used to discriminate sleep problems among different children.
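The internal consistency reported is Cronbach's α; a minimal implementation with hypothetical item scores (not the TH-JSQ-P data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical subscale scores from five respondents on four items
scores = np.array([[3, 3, 2, 3],
                   [4, 4, 4, 5],
                   [2, 2, 1, 2],
                   [5, 4, 5, 5],
                   [1, 2, 1, 1]])
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a subscale.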

Keywords: preschooler, questionnaire, validation, Thai version

Procedia PDF Downloads 104
15822 Like Making an Ancient Urn: Metaphor Conceptualization of L2 Writing

Authors: Muhalim Muhalim

Abstract:

Drawing on Lakoff’s theory of metaphor conceptualization, this article explores how ten student-teachers in Indonesia conceptualize second language writing (L2W) via metaphors. The ten postgraduate English language teaching students, who are also (former) English teachers, received seven days of intervention in teaching and learning L2. Using introspective logs and a focus group discussion, the results show that all participants unanimously perceive L2W as a process-oriented rather than product-oriented activity. Specifically, the metaphor conceptualizations exhibit three categories of process-oriented L2W: a deliberate process, a learning process, and a problem-solving process. It must be clarified from the outset, however, that this categorization is not rigid, because some properties of the metaphors might belong to other categories. Results of the study and implications for English language teaching are discussed further.

Keywords: metaphor conceptualisation, second language, learning writing, teaching writing

Procedia PDF Downloads 413
15821 Skid-Mounted Gathering System Hydrate Control and Process Simulation Optimization

Authors: Di Han, Lingfeng Li, Peixue Zhang, Yuzhuo Zhang

Abstract:

Natural gas extracted at the wellhead of a gas well undergoes a rapid temperature drop, along with the pressure drop, as it passes through the throttle valve, which creates the conditions for hydrate formation. To solve the problem of hydrate formation during wellhead gathering, effective preventive measures must be taken. In this paper, we first introduce the principle of the natural gas throttling temperature drop and the theoretical basis for calculating hydrate inhibitor injection, and then use HYSYS software to simulate the three candidate processes and determine the key process parameters. The hydrate control process best suited to the skid design of natural gas wellhead gathering skids was determined by comparing hydrate control effectiveness, the energy consumption of key equipment, and process adaptability.
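The throttling temperature drop that drives hydrate formation can be sketched with a constant Joule-Thomson coefficient approximation. The coefficient value and pressures below are assumed typical figures, not parameters from this paper; HYSYS resolves this rigorously with an equation of state.

```python
# Hedged sketch of the choke temperature drop behind hydrate formation.
# mu_jt = 4.5 K/MPa is an assumed typical Joule-Thomson coefficient for
# natural gas at wellhead conditions, not a value from this paper.

def throttle_temperature_drop(p_in_mpa, p_out_mpa, mu_jt=4.5):
    """Approximate the isenthalpic temperature drop across a throttle
    valve as dT = mu_JT * dP (constant-coefficient approximation)."""
    return mu_jt * (p_in_mpa - p_out_mpa)

# Hypothetical wellhead at 10 MPa choked down to 4 MPa:
dt = throttle_temperature_drop(10.0, 4.0)   # 27 K of cooling
```

A drop of this size can easily push the gas below its hydrate equilibrium temperature, which is why inhibitor injection or heating is considered upstream of the choke.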

Keywords: natural gas, hydrate control, skid design, HYSYS

Procedia PDF Downloads 91
15820 Multiloop Fractional Order PID Controller Tuned Using Cuckoo Algorithm for Two Interacting Conical Tank Process

Authors: U. Sabura Banu, S. K. Lakshmanaprabu

Abstract:

Improvements in meta-heuristic algorithms encourage control engineers to design optimal controllers for industrial processes. Most real-world industrial processes are non-linear multivariable processes with high interaction. Even within a single sub-process unit, thousands of loops may be present, most of them interacting in nature. Optimal controller design for such processes remains a challenging task. Closed-loop design with multiloop PID involves a tedious procedure: performing an interaction study, auto-tuning the loop with the highest interaction, and finally detuning the controller to accommodate the effects of the other process variables. Fractional order PID controllers have recently been replacing integer order PID controllers, and designing a Multiloop Fractional Order (MFO) PID controller is more complicated still. The cuckoo algorithm, a swarm intelligence technique, is used to tune the MFO PID controller optimally by minimizing the Integral Time Absolute Error (ITAE). The closed-loop performance is tested under servo, regulatory, and servo-regulatory conditions.
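The tuning loop described above can be sketched with a minimal cuckoo search. The real objective would be a closed-loop ITAE simulation of the two-tank process; a quadratic surrogate with a hypothetical optimal gain vector stands in for it here, and all algorithm settings are illustrative assumptions.

```python
# Hedged sketch of cuckoo search (Levy flights + nest abandonment)
# minimizing a surrogate for the ITAE cost of an FOPID gain vector
# (Kp, Ki, Kd, lambda, mu). The target gains are hypothetical.
import math
import random

random.seed(1)

def surrogate_itae(x):
    # Stand-in cost: minimum at an assumed optimal gain vector.
    target = [2.0, 0.5, 1.0, 0.9, 1.1]
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Levy-distributed step length.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta *
              2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(cost, dim=5, n_nests=15, iters=200, pa=0.25, alpha=0.1):
    nests = [[random.uniform(0, 3) for _ in range(dim)]
             for _ in range(n_nests)]
    fitness = [cost(n) for n in nests]
    best = min(range(n_nests), key=lambda k: fitness[k])
    for _ in range(iters):
        # New candidate via a Levy flight around a random nest.
        i = random.randrange(n_nests)
        new = [x + alpha * levy_step() for x in nests[i]]
        f_new = cost(new)
        j = random.randrange(n_nests)
        if f_new < fitness[j]:
            nests[j], fitness[j] = new, f_new
        # Abandon a fraction pa of nests (keeping the best one).
        for k in range(n_nests):
            if k != best and random.random() < pa:
                nests[k] = [random.uniform(0, 3) for _ in range(dim)]
                fitness[k] = cost(nests[k])
        best = min(range(n_nests), key=lambda k: fitness[k])
    return nests[best], fitness[best]

gains, itae = cuckoo_search(surrogate_itae)
```

In the paper's setting, `surrogate_itae` would be replaced by a simulation of the two interacting conical tanks under the candidate MFO PID gains.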

Keywords: cuckoo algorithm, multiloop fractional order PID controller, two interacting conical tank process

Procedia PDF Downloads 498
15819 Social Networks in a Communication Strategy of a Large Company

Authors: Kherbache Mehdi

Abstract:

Within the framework of validating the Master in Business Administration (marketing and sales) at the INSIM international management institute in Blida, we had the opportunity to do a professional internship at the Sonelgaz Enterprise and to write a thesis. The thesis deals with integrating social networking into a company's communication strategy. The problem addressed is: how can communicating via social networks be a solution for companies? The challenge of this thesis was to suggest limits and recommendations to the Sonelgaz Enterprise concerning social networks. Social networks as a whole represent more than a billion people as a potential target for companies. Through research and a qualitative approach, we identified three valid hypotheses. The first hypothesis confirms that no company can ignore social networks in its communication strategy. The second hypothesis demonstrates that it is necessary to prepare a strategy that integrates social networks into the company's communication plan. The risk of this strategy is very limited: failure on social networks is not a restraint for the enterprise, social networking is not expensive, and a bad image resulting from it is not as important in the long term. On the other hand, the return on investment is difficult to evaluate. Finally, the last hypothesis shows that firms establish a new relation between consumers and brands thanks to the proximity allowed by social networks. After validating the hypotheses, we suggested some recommendations to the Sonelgaz Enterprise regarding communication through social networks. First, the company must use the interactivity of social networks in order to have fruitful exchanges with the community. We also recommended having a strategy for handling negative comments. The company should likewise deliver resources to the community through a community manager, in order to maintain a good relationship with it.
Furthermore, we advised using social networks for business intelligence. The Sonelgaz Enterprise can offer creative and interactive content, such as engaging applications on Facebook. Finally, we recommended that the company not be intrusive with its "fans" or "followers" and remain open to all platforms: Twitter, Facebook, and LinkedIn, for example.

Keywords: social network, buzz, communication, consumer, return on investment, internet users, web 2.0, Facebook, Twitter, interaction

Procedia PDF Downloads 422
15818 Using Mind Mapping and Morphological Analysis within a New Methodology for Teaching Students of Products’ Design

Authors: Kareem Saber

Abstract:

Many product design instructors look for ways to help students develop their designs simply, by reducing the number of design stages and extrapolating simple design process forms, in order to achieve design creativity. The researcher therefore extrapolated a new design process form called "hierarchical design", which reduces the design process to three stages, and tried that methodology on about two hundred students. The trial led to strong results, as students could develop designs characterized by creativity and innovation, which demonstrates the success and effectiveness of the proposed methodology.

Keywords: mind mapping, morphological analysis, product design, design process

Procedia PDF Downloads 178
15817 In Silico Exploration of Quinazoline Derivatives as EGFR Inhibitors for Lung Cancer: A Multi-Modal Approach Integrating QSAR-3D, ADMET, Molecular Docking, and Molecular Dynamics Analyses

Authors: Mohamed Moussaoui

Abstract:

A series of thirty-one potential inhibitors of the epidermal growth factor receptor (EGFR) kinase, derived from quinazoline, underwent 3D-QSAR analysis using the CoMFA and CoMSIA methodologies. Training and test sets of quinazoline derivatives were used to construct and validate the QSAR models, respectively, with dataset alignment performed on the lowest-energy conformer of the most active compound. The best-performing CoMFA and CoMSIA models demonstrated impressive determination coefficients, with R² values of 0.981 and 0.978, respectively, and Leave-One-Out cross-validation determination coefficients, Q², of 0.645 and 0.729, respectively. Furthermore, external validation using a test set of five compounds yielded predicted determination coefficients, R² test, of 0.929 and 0.909 for CoMFA and CoMSIA, respectively. Building upon these promising results, eighteen new compounds were designed and assessed for drug-likeness and ADMET properties through in silico methods. Additionally, molecular docking studies were conducted to elucidate the binding interactions between the selected compounds and the enzyme. Detailed molecular dynamics simulations were performed to analyze the stability, conformational changes, and binding interactions of the quinazoline derivatives with the EGFR kinase. These simulations provided deeper insight into the dynamic behavior of the compounds within the active site. This comprehensive analysis enhances the understanding of quinazoline derivatives as potential anti-cancer agents and provides valuable insights for lead optimization in the early stages of drug discovery, particularly for developing highly potent anticancer therapeutics.
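The Leave-One-Out Q² statistic reported for the CoMFA/CoMSIA models can be sketched in a few lines: Q² = 1 - PRESS/SS, where PRESS sums the squared errors of predictions made with each compound held out. A toy univariate descriptor/activity dataset stands in for the 3D field descriptors; the numbers are invented for illustration.

```python
# Hedged sketch of Leave-One-Out cross-validated Q^2, the internal
# validation statistic used for 3D-QSAR models. The univariate linear
# model and the data below are illustrative stand-ins.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def loo_q2(xs, ys):
    """Q^2 = 1 - PRESS / SS over leave-one-out refits."""
    mean_y = sum(ys) / len(ys)
    press = 0.0
    for i in range(len(xs)):
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        slope, intercept = fit_line(xs_i, ys_i)
        press += (ys[i] - (slope * xs[i] + intercept)) ** 2
    ss = sum((y - mean_y) ** 2 for y in ys)
    return 1.0 - press / ss

# Hypothetical descriptor values vs. pIC50-like activities:
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.1, 1.9, 3.2, 3.9, 5.1, 6.0]
q2 = loo_q2(xs, ys)
```

A Q² well below the fitted R², as seen in the abstract (0.645/0.729 vs. 0.981/0.978), is expected: held-out predictions are always harder than refitting the training set.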

Keywords: 3D-QSAR, CoMFA, CoMSIA, ADMET, molecular docking, quinazoline, molecular dynamics, EGFR inhibitors, lung cancer, anticancer

Procedia PDF Downloads 48
15816 Towards an Understanding of Breaking and Coalescence Process in Bitumen Emulsions

Authors: Abdullah Khan, Per Redelius, Nicole Kringos

Abstract:

The breaking and coalescence process in bitumen emulsions strongly influences the performance of cold mix asphalt (CMA), and this phase separation process is affected by the physicochemical changes happening at the bitumen/water interface. In this paper, coalescence experiments on two bitumen droplets in an emulsion environment were carried out using a newly developed test procedure. Different types of emulsifiers were selected to understand the coalescence process with respect to changes in the surface tension of the water phase caused by the addition of different surfactants and other additives such as salts. The research showed that the relaxation kinetics of bitumen droplets varied with the type of emulsifier and its concentration, as well as with the presence or absence of salt in the water phase. Moreover, the kinetics of the coalescence process was also investigated as a function of temperature.

Keywords: bitumen emulsions, breaking and coalescence, cold mix asphalt, emulsifiers, relaxation, salts

Procedia PDF Downloads 338
15815 Compare Hot Forming and Cold Forming in Rolling Process

Authors: Ali Moarrefzadeh

Abstract:

In metalworking, rolling is a metal forming process in which metal stock is passed through a pair of rolls. Rolling is classified according to the temperature of the metal being rolled. If the temperature of the metal is above its recrystallization temperature, the process is termed hot rolling; if it is below its recrystallization temperature, the process is termed cold rolling. In terms of usage, hot rolling processes more tonnage than any other manufacturing process, and cold rolling processes the most tonnage of all cold working processes. This article describes the use of advanced tubing inspection NDT methods for boiler and heat exchanger equipment in the petrochemical industry to supplement major turnaround inspections. The methods presented include remote field eddy current, magnetic flux leakage, internal rotary inspection system, and eddy current.
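The hot/cold classification stated above reduces to a single temperature comparison, sketched below. The recrystallization temperatures are illustrative round numbers, not values from this paper.

```python
# Hedged sketch of the rolling-regime classification: "hot" when the
# work temperature exceeds the metal's recrystallization temperature,
# otherwise "cold". Temperatures are illustrative approximations.

RECRYSTALLIZATION_C = {
    "low-carbon steel": 600,   # approximate, in degrees Celsius
    "aluminium": 300,
    "copper": 370,
}

def rolling_regime(metal, work_temp_c):
    t_rx = RECRYSTALLIZATION_C[metal]
    return "hot rolling" if work_temp_c > t_rx else "cold rolling"

regime_a = rolling_regime("low-carbon steel", 900)   # well above T_rx
regime_b = rolling_regime("aluminium", 25)           # room temperature
```

Note that the dividing line is a material property, not a fixed temperature: rolling lead at room temperature is hot rolling, because lead recrystallizes below 25 °C.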

Keywords: hot forming, cold forming, metal, rolling, simulation

Procedia PDF Downloads 529
15814 Constructing a Semi-Supervised Model for Network Intrusion Detection

Authors: Tigabu Dagne Akal

Abstract:

While advances in computer and communications technology have made networks ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks, or intrusions, start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. After collection, the data were pre-processed. The major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating data containing both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation tasks such as discretization. A total of 21,533 intrusion records were used for training the models. To validate the performance of the selected model, a separate set of 3,397 records was used for testing. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and its default parameter values showed the best classification accuracy: 96.11% on the training set and 93.2% on the test set for classifying new instances into the normal, DOS, U2R, R2L, and probe classes.
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research should aim to produce a system applicable in the area of study.
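The 10-fold cross-validation procedure used to evaluate the classifiers can be sketched as follows. A trivial majority-class model stands in for J48 / Naïve Bayes, and the labels are synthetic, not the MIT Lincoln Laboratory records.

```python
# Hedged sketch of 10-fold cross-validation over intrusion labels.
# The majority-class "model" is a stand-in for J48 / Naive Bayes, and
# the class mix below is invented for illustration.
import random

random.seed(0)

# Synthetic labels: mostly "normal", with some attack classes.
labels = (["normal"] * 700 + ["DOS"] * 200 + ["probe"] * 60 +
          ["R2L"] * 30 + ["U2R"] * 10)
random.shuffle(labels)

def ten_fold_accuracy(data, k=10):
    fold_size = len(data) // k
    accuracies = []
    for fold in range(k):
        test = data[fold * fold_size:(fold + 1) * fold_size]
        train = data[:fold * fold_size] + data[(fold + 1) * fold_size:]
        # "Train" the stand-in model: predict the majority training label.
        majority = max(set(train), key=train.count)
        correct = sum(1 for y in test if y == majority)
        accuracies.append(correct / len(test))
    return sum(accuracies) / k

acc = ten_fold_accuracy(labels)
```

With this class mix the majority-class baseline scores 70% accuracy, which shows why per-class results for the rare U2R and R2L classes matter beyond overall accuracy.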

Keywords: intrusion detection, data mining, computer science

Procedia PDF Downloads 296
15813 A Conceptual Design of Freeze Desalination Using Low Cost Refrigeration

Authors: Parul Sahu

Abstract:

In recent years, seawater desalination has emerged as a potential means to circumvent water scarcity, especially in coastal regions. Among the various methods, thermal evaporation (distillation) and membrane operations such as Reverse Osmosis (RO) have been exploited at commercial scale. However, the energy costs and maintenance expenses associated with these processes remain high. In this context, Freeze Desalination (FD), subject to the availability of low-cost refrigeration, offers an exciting alternative. Liquefied Natural Gas (LNG) regasification terminals provide an opportunity to utilize the refrigeration available from LNG regasification. This work presents the conceptualization and development of a process scheme integrating ice- and hydrate-based FD with the LNG regasification process. This integration overcomes the high energy demand associated with FD processes by utilizing the refrigeration released during LNG regasification. An optimal process scheme was obtained by performing process simulation using the ASPEN PLUS simulator. The results indicate that the proposed process requires only 1 kWh/m³ of energy when the refrigeration is fully utilized. In addition, a sensitivity analysis was performed to study the effect of various process parameters on water recovery and energy consumption. The results show that energy consumption decreases by 30% as water recovery increases from 30% to 60%. However, due to the operational limitations associated with handling ice and hydrates in seawater, water recovery cannot simply be maximized, only optimized. The proposed process can potentially be used to desalinate seawater in integration with an LNG regasification terminal.
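A back-of-envelope calculation shows why free LNG cold makes the reported ~1 kWh/m³ figure plausible: the latent heat that must be removed to freeze the product water is roughly two orders of magnitude larger. The constants below are standard physical values; the calculation is an illustrative ideal-case sketch, not the paper's simulation.

```python
# Hedged sketch: ideal latent-heat refrigeration load for freezing
# 1 m^3 of water, compared with the ~1 kWh/m^3 net electricity figure
# reported when LNG regasification supplies the cold.

LATENT_HEAT_FUSION_KJ_PER_KG = 334.0   # water -> ice, standard value
KJ_PER_KWH = 3600.0
DENSITY_KG_PER_M3 = 1000.0             # fresh water, approximate

def latent_load_kwh_per_m3():
    """Refrigeration duty to freeze one cubic metre of water,
    ignoring sensible cooling and losses (ideal case)."""
    return DENSITY_KG_PER_M3 * LATENT_HEAT_FUSION_KJ_PER_KG / KJ_PER_KWH

ideal_load = latent_load_kwh_per_m3()   # roughly 93 kWh per m^3
```

If that ~93 kWh/m³ of refrigeration had to come from a compression chiller, FD could not compete with RO; supplying it from LNG regasification is what leaves only about 1 kWh/m³ of purchased energy.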

Keywords: freeze desalination, liquefied natural gas regasification, process simulation, refrigeration

Procedia PDF Downloads 131
15812 Centralizing the Teaching Process in Intelligent Tutoring System Architectures

Authors: Nikolaj Troels Graf Von Malotky, Robin Nicolay, Alke Martens

Abstract:

A plethora of architectures exists for ITSs (Intelligent Tutoring Systems). A thorough analysis and comparison of these architectures revealed that, in most cases, the architecture extensions have grown evolutionarily, reflecting the state-of-the-art trends of each decade. However, from the perspective of software engineering, the main aspect of an ITS has not yet been reflected in any of these architectures. From the perspective of cognitive research, the construction of the teaching process is what makes an ITS 'intelligent' with regard to the spectrum of interaction with students. Thus, our approach focuses on a behavior-based architecture built around the main teaching processes. To create a new general architecture for ITSs, we first have to define the prerequisites. This paper analyzes the current state of existing architectures and derives rules for the behavior of an ITS. It presents a teaching process for ITSs to be used together with the architecture.

Keywords: intelligent tutoring, ITS, tutoring process, system architecture, interaction process

Procedia PDF Downloads 384
15811 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Knowledge automation through knowledge-based systems will therefore significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgments. Twelve parameters are used to determine the most appropriate welding process among six competing welding processes.

Keywords: welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank

Procedia PDF Downloads 167
15810 Value in Exchange: The Importance of Users Interaction as the Center of User Experiences

Authors: Ramlan Jantan, Norfadilah Kamaruddin, Shahriman Zainal Abidin

Abstract:

In this era of technology, the co-creation method has become a new development trend. In this light, most design businesses have transformed their development strategy from goods-dominant to service-dominant, giving more attention to end-users and their roles in the development process. As a result, the conventional development process has been replaced with a more cooperative one. Consequently, numerous studies have explored the extension of the co-creation method to the design development process, most of them focusing on issues found during the production process. This study, by contrast, investigates the potential values established during the pre-production process, also known as 'circumstances value creation'. User involvement at the entry level of the pre-production process, where value-in-exchange is jointly created and user experiences take place, is examined and critically debated. This paper therefore proposes a potential framework of the co-creation method for Malaysian interactive product development. The framework is formulated from both parties involved: the users and the designers. It clearly explains the value of the co-creation method and could assist relevant design industries and companies in developing a blueprint for the design process. This paper further contributes to the literature on the co-creation of value and on digital ecosystems.

Keywords: co-creation method, co-creation framework, co-creation, co-production

Procedia PDF Downloads 178