Search results for: Verification and Validation (V&V)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1836

1506 Social Information Seeking: Studying the Effect of Question Type on Responses in Social Q&A Sites

Authors: Arshia Ayoub, Zahid Ashraf Wani

Abstract:

With the introduction of online social Q&A sites, people are able to reach each other efficiently for information seeking while simultaneously creating social bonds. However, some questions posed on these sites receive few or no responses. This study therefore examines the effect of question type on responses in social Q&A sites. The study found that among the answered queries, the majority were answered within 24 hours of posting and, surprisingly, most replies were received within one hour. Questions of the general-information type were the most likely to be answered, followed by the verification type.

Keywords: community‐based services, information seeking, social search, social Q&A site

Procedia PDF Downloads 165
1505 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under two ANN structural configurations: (1) a single-hidden-layer and (2) a double-hidden-layer feedforward backpropagation network. The results revealed that the GDM algorithm, with its adaptive learning capability, used a relatively shorter time in both the training and validation phases than the LM and Br algorithms, although learning was not always consummated; this held in all instances, including the prediction of extreme flow conditions 1 day and 5 days ahead. In average statistical terms, model performance efficiency measured by the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96% for the training and validation phases, respectively. On the basis of relative error distribution statistics (MAE, MAPE, and MSRE), however, GDM performed better than the others overall. Based on the findings, real-time forecasting with ANNs should employ training algorithms that avoid the computational overhead of LM, which requires computation of the Hessian matrix, takes protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure, forecast quality, and the mitigation of network overfitting. On the whole, it is recommended that evaluations also consider the implications of (i) data quality and quantity and (ii) transfer functions for overall network forecast performance.
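
To make the comparison concrete, the minimal sketch below trains the same small feedforward network with two solvers; scikit-learn's sgd (with momentum) and lbfgs act only as stand-ins for GDM and LM, since scikit-learn implements neither Levenberg-Marquardt nor Bayesian regularisation, and the streamflow-like data are synthetic.

```python
# Sketch: comparing training schemes for a feedforward ANN on synthetic
# streamflow-like data. 'sgd' (momentum) and 'lbfgs' are stand-ins for GDM
# and LM; scikit-learn has no Levenberg-Marquardt or Bayesian regularisation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 4))                     # lagged flow/rainfall predictors
y = X @ np.array([0.5, 0.3, 0.1, 0.1]) + 0.05 * rng.standard_normal(500)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

def ce(obs, sim):
    """Coefficient of efficiency (Nash-Sutcliffe form)."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

for solver, extra in [("sgd", {"momentum": 0.9, "learning_rate": "adaptive"}),
                      ("lbfgs", {})]:
    ann = MLPRegressor(hidden_layer_sizes=(10,), solver=solver,
                       max_iter=2000, random_state=0, **extra)
    ann.fit(X_tr, y_tr)
    print(solver, "validation CE:", round(ce(y_va, ann.predict(X_va)), 3))
```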

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 144
1504 Development and Total Error Concept Validation of Common Analytical Method for Quantification of All Residual Solvents Present in Amino Acids by Gas Chromatography-Head Space

Authors: A. Ramachandra Reddy, V. Murugan, Prema Kumari

Abstract:

Residual solvents in pharmaceutical samples are monitored using gas chromatography with headspace sampling (GC-HS). Under current regulatory and compendial requirements, measuring residual solvents is mandatory for all release testing of active pharmaceutical ingredients (APIs). Generally, isopropyl alcohol is used as the residual solvent in proline and tryptophan; methanol in cysteine monohydrate hydrochloride, glycine, methionine, and serine; ethanol in glycine and lysine monohydrate; and acetic acid in methionine. In order to determine these residual solvents (isopropyl alcohol, ethanol, methanol, and acetic acid) in all seven amino acids with a single method, a sensitive and simple procedure was developed using the gas chromatography headspace technique with flame ionization detection. During development, poor reproducibility, retention-time variation, and bad peak shape were observed for the acetic acid peaks, owing to the reaction of acetic acid with the stationary phase of the column (cyanopropyl dimethyl polysiloxane) and its dissociation in water (when used as the diluent) while applying the temperature gradient. Dimethyl sulfoxide was therefore used as the diluent to avoid these issues, whereas most published methods for acetic acid quantification by GC-HS rely on derivatisation to protect the acetic acid. As per the compendia, a risk-based approach was selected to determine the degree and extent of the validation process needed to assure the fitness of the procedure; the total error concept was therefore chosen to validate the analytical procedure. An accuracy profile of ±40% was selected for the lowest level (the quantitation limit) and ±30% for the other levels, with a 95% confidence interval (5% risk profile). The method was developed on an Agilent DB-WAXetr column (internal diameter 530 µm, film thickness 2.0 µm, length 30 m), with helium carrier gas at a constant flow of 6.0 mL/min in constant make-up mode. The present method is simple, rapid, and accurate, and is suitable for the rapid analysis of isopropyl alcohol, ethanol, methanol, and acetic acid in amino acids. The method ranges are 50-200 ppm for isopropyl alcohol, 50-3000 ppm for ethanol, 50-400 ppm for methanol, and 100-400 ppm for acetic acid, which covers the specification limits given in the European Pharmacopoeia. The accuracy profile and risk profile generated as part of the validation were found to be satisfactory. This method can therefore be used for testing residual solvents in amino acid drug substances.
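
For readers unfamiliar with the total error concept, the sketch below shows one common way an accuracy profile is computed from spiked-recovery data; the recoveries, levels, and the Student-t approximation of the beta-expectation tolerance interval are illustrative assumptions, not the study's data.

```python
# Sketch: accuracy profile (total error concept) from hypothetical recoveries.
# At each level, a 95% tolerance interval on the relative error is compared
# with the acceptance limits (here +/-30% for non-quantitation-limit levels).
import numpy as np
from scipy import stats

acceptance = 30.0                                         # percent
levels = {100: [98.1, 101.5, 99.2, 100.8, 97.6, 102.0],   # % recovery, n = 6
          200: [99.0, 100.2, 98.7, 101.1, 99.5, 100.6]}

for level_ppm, recoveries in levels.items():
    err = np.asarray(recoveries) - 100.0                  # relative error, %
    n, mean, sd = len(err), err.mean(), err.std(ddof=1)
    # Student-t interval on individual results approximates the
    # beta-expectation tolerance interval used in accuracy profiles.
    half = stats.t.ppf(0.975, n - 1) * sd * np.sqrt(1 + 1 / n)
    lo, hi = mean - half, mean + half
    verdict = "within" if (lo > -acceptance and hi < acceptance) else "outside"
    print(f"{level_ppm} ppm: bias {mean:+.2f}%, interval [{lo:.2f}, {hi:.2f}] "
          f"is {verdict} +/-{acceptance}%")
```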

Keywords: amino acid, head space, gas chromatography, total error

Procedia PDF Downloads 135
1503 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods

Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo

Abstract:

The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, have driven the development of computational methods for the task. One approach used in the latter is the prediction of the three-dimensional structure from the residue chain; however, this has been shown to be an NP-hard problem, a complexity illustrated by the Levinthal paradox. An alternative is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial intelligence methods, such as Bayesian statistics, artificial neural networks (ANN), and support vector machines (SVM), among others, have been used to predict protein secondary structure. Owing to their good results, artificial neural networks have become a standard method for this task. Recently published methods that use this technique generally achieve a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for secondary structure prediction is 88%. To achieve better results, prediction methods based on support vector machines have also been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets, validation techniques, and other variables can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method. The chosen SVM protein secondary structure prediction method is the one proposed by Huang in 'Extracting Physicochemical Features to Predict Protein Secondary Structure' (2013). The developed ANN method follows the same training and testing process Huang used to validate his method (the CB513 protein data set with three-fold cross-validation), so that the statistical results of the two methods can be compared directly.
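
As a schematic of such a comparison, the sketch below runs an ANN and an SVM through the same three-fold cross-validation and reports Q3; the window features and three-state labels are random placeholders for the CB513 encodings.

```python
# Sketch: three-fold cross-validated ANN vs. SVM comparison for per-residue
# secondary-structure classes (H/E/C). Data are random stand-ins for CB513.
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((600, 20))            # e.g., physicochemical window features
y = rng.integers(0, 3, 600)          # 0 = helix, 1 = strand, 2 = coil

cv = KFold(n_splits=3, shuffle=True, random_state=0)
models = {"ANN": MLPClassifier(hidden_layer_sizes=(30,), max_iter=1000,
                               random_state=0),
          "SVM": SVC(kernel="rbf", C=1.0)}
for name, model in models.items():
    # Q3 is the fraction of residues given the correct 3-state label,
    # which for per-residue data is plain classification accuracy.
    q3 = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: Q3 = {q3.mean():.3f} +/- {q3.std():.3f}")
```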

Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines

Procedia PDF Downloads 605
1502 Experimental Verification of the Relationship between Physiological Indexes and the Presence or Absence of an Operation during E-learning

Authors: Masaki Omata, Shumma Hosokawa

Abstract:

An experiment to verify the relationships between physiological indexes of an e-learner and the presence or absence of an operation during e-learning is described. Electroencephalogram (EEG), hemoencephalography (HEG), skin conductance (SC), and blood volume pulse (BVP) values were measured while participants performed experimental learning tasks. The results show that there are significant differences between the SC values when reading with clicking on learning materials and the SC values when reading without clicking, and between the HEG ratio when reading (with and without clicking) and the HEG ratio when resting for four of five participants. We conclude that the SC signals can be used to estimate whether or not a learner is performing an active task and that the HEG ratios can be used to estimate whether a learner is learning.
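
A minimal sketch of the kind of test behind such a comparison is shown below: a paired t-test on per-participant SC values with and without clicking. The numbers are hypothetical, not the study's measurements.

```python
# Sketch: paired t-test on skin conductance (SC) for the same participants
# reading with vs. without clicking on learning materials. Values invented.
import numpy as np
from scipy import stats

sc_click = np.array([4.2, 5.1, 3.8, 4.9, 4.4])      # microsiemens, 5 subjects
sc_no_click = np.array([3.6, 4.3, 3.5, 4.1, 4.0])

t, p = stats.ttest_rel(sc_click, sc_no_click)
print(f"paired t = {t:.2f}, p = {p:.4f}")           # p < 0.05 -> significant
```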

Keywords: e-learning, physiological index, physiological signal, state of learning

Procedia PDF Downloads 370
1501 Bimodal Biometrics System Using Fusion of Iris and Fingerprint

Authors: Attallah Bilal, Hendel Fatiha

Abstract:

This paper proposes a bimodal biometric system for identity verification based on iris and fingerprint, fused at the matching-score level using a weighted sum-of-scores technique. Features are extracted from the preprocessed iris and fingerprint images, and the features of a query image are compared with those of a database image to obtain matching scores. The individual scores generated after matching are passed to the fusion module, which consists of three major steps: normalization, generation of similarity scores, and fusion of the weighted scores. The final score is then used to declare the person genuine or an impostor. The system was tested on the CASIA database and gives an overall accuracy of 91.04%, with an FAR of 2.58% and an FRR of 8.34%.
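
The sketch below illustrates this score-level pipeline with min-max normalization and a weighted sum rule; the scores, weights, and threshold are invented for illustration.

```python
# Sketch: matching-score-level fusion of two matchers with min-max
# normalization and a weighted sum rule. All numbers are hypothetical.
import numpy as np

def min_max(scores):
    """Map raw matcher scores onto [0, 1]."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

iris_scores = np.array([0.62, 0.91, 0.40, 0.75])     # one score per comparison
finger_scores = np.array([35.0, 80.0, 22.0, 61.0])   # different native scale

w_iris, w_finger = 0.6, 0.4                          # weights sum to 1
fused = w_iris * min_max(iris_scores) + w_finger * min_max(finger_scores)

threshold = 0.5
for score in fused:
    print(f"{score:.3f} -> {'genuine' if score >= threshold else 'impostor'}")
```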

Keywords: iris, fingerprint, sum rule, fusion

Procedia PDF Downloads 349
1500 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computation software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models that predict oil-well production in arbitrarily shaped, multiple-lease reservoirs. The limited validation of data for ensuring that a program meets the accuracy of the mathematical model is the research motivation of this paper. Accordingly, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified for the BEM discretization. In the second step, the 2D BEM discretization is implemented in the COMSOL Multiphysics and MATLAB programming environments. In the last step, the numerical performance indicators for both environments are analyzed against a validated Fortran implementation. The performance comparisons are investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on oil production of the multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is the alternative software for implementing an accurate numerical simulation of the BEM. In conclusion, the high-level-language numerical computation and numerical performance evaluation are satisfactory, showing that Fortran is well suited for capturing the visualization of oil-well production in arbitrarily shaped reservoirs.
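
The percentage-error indicator used in such comparisons is simple to state; the sketch below computes it for a hypothetical pressure field against a reference solution (e.g., the Fortran output).

```python
# Sketch: percentage error of a candidate solver's pressure field against a
# reference solution. All values are hypothetical.
import numpy as np

reference = np.array([101.2, 98.7, 95.4, 92.1])   # e.g., Fortran BEM pressures
candidate = np.array([101.5, 98.1, 95.9, 91.8])   # e.g., MATLAB BEM pressures

pct_error = 100.0 * np.abs(candidate - reference) / np.abs(reference)
print(f"max error {pct_error.max():.2f}%, mean error {pct_error.mean():.2f}%")
```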

Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 480
1499 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The growth of the world population combined with the third industrial revolution has led to a high demand for fuels. At the same time, the decline of global fossil fuel deposits and the air pollution caused by these fuels have compounded the challenges the world faces in meeting its energy needs. New forms of environmentally friendly and renewable fuels, such as biodiesel, are therefore needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding, costly, and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier transform infrared) spectroscopy was studied; the study was performed in an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module of the iC IR 7.0 software. Fifteen samples of known concentration, taken in duplicate, were used for model calibration and cross-validation; the data were pre-processed using mean centering and variance scaling, a square-root spectrum transform, and solvent subtraction. These pre-processing methods improved the performance indexes: RMSEC from 7.98 to 0.0096, RMSECV from 11.2 to 3.41, RMSEP from 6.32 to 2.72, and cumulative R² from 0.9416 to 0.9999. R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model built. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two agreed closely at concentrations above 18%. The software eliminated the complexity of partial least squares (PLS) chemometrics. It was concluded that the model obtained can be used to monitor the methanolysis of sunflower oil at industrial and laboratory scale.
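
A bare-bones version of such a multivariate calibration is sketched below with scikit-learn's PLS regression on synthetic spectra; it mirrors the mean centering and cross-validation steps but is not the iC Quant workflow itself.

```python
# Sketch: PLS multivariate calibration of spectra with cross-validation.
# Spectra and concentrations are synthetic stand-ins for the FTIR data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
conc = np.linspace(2, 30, 15)          # known concentrations, %
pure = rng.random(200)                 # synthetic pure-component spectrum
spectra = np.outer(conc, pure) + 0.01 * rng.standard_normal((15, 200))
spectra -= spectra.mean(axis=0)        # mean centering

pls = PLSRegression(n_components=3).fit(spectra, conc)
rmsec = mean_squared_error(conc, pls.predict(spectra)) ** 0.5
pred_cv = cross_val_predict(pls, spectra, conc, cv=5)
rmsecv = mean_squared_error(conc, pred_cv) ** 0.5
print(f"RMSEC = {rmsec:.4f}, RMSECV = {rmsecv:.4f}")
```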

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 136
1498 3D Printed Multi-Modal Phantom Using Computed Tomography and 3D X-Ray Images

Authors: Sung-Suk Oh, Bong-Keun Kang, Sang-Wook Park, Hui-Jin Joo, Jong-Ryul Choi, Seong-Jun Lee, Jeong-Woo Sohn

Abstract:

The imaging phantom is utilized for the verification, evaluation, and tuning of medical imaging devices and systems. Although it can be costly, 3D printing is an ideal technique for rapid, customized, multi-modal phantom making. In this article, we propose a multi-modal phantom made by 3D printing. First, DICOM images were acquired with CT (computed tomography) and 3D X-ray systems (the PET/CT and Angio X-ray systems of Siemens) and then analyzed. A 3D model was then built from the DICOM images and printed. Finally, the 3D printed phantom was scanned by PET/CT and MRI systems and evaluated.
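
As a starting point for such 3D modelling, a CT DICOM series can be stacked into a voxel volume; the sketch below does this with pydicom, using a hypothetical directory path.

```python
# Sketch: stacking a CT DICOM series into a voxel volume prior to 3D
# modelling. The path is hypothetical; requires pydicom and numpy.
import glob
import numpy as np
import pydicom

slices = [pydicom.dcmread(f) for f in glob.glob("ct_series/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))   # sort by z

volume = np.stack([s.pixel_array for s in slices]).astype(np.float32)
# Convert raw pixel values to Hounsfield units before thresholding for a mesh
volume = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)
print("volume shape (z, y, x):", volume.shape)
```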

Keywords: imaging phantom, MRI (magnetic resonance imaging), PET/CT (positron emission tomography/computed tomography), 3D printing

Procedia PDF Downloads 570
1497 Social Networks in a Communication Strategy of a Large Company

Authors: Kherbache Mehdi

Abstract:

Within the framework of validating the Master in Business Administration (Marketing and Sales) at the INSIM international institute in management, Blida, we had the opportunity to complete a professional internship at the Sonelgaz Enterprise together with a thesis. The thesis deals with the integration of social networking into the communication strategy of a company. The research question is: how can communicating through social networks be a solution for companies? The challenge taken up by this thesis was to suggest limits and recommendations to the Sonelgaz Enterprise concerning social networks. Taken together, social networks represent more than a billion people as a potential target for companies. Through research and a qualitative approach, we identified three valid hypotheses. The first hypothesis confirms that no company can ignore social networks in its communication strategy. The second hypothesis demonstrates that it is nonetheless necessary to prepare a strategy that integrates social networks into the company's communication plan. The risk of this strategy is very limited, because failure on social networks is not crippling for the enterprise, social networking is not expensive, and any bad image that could result from it matters little in the long term; the return on investment, however, is difficult to evaluate. Finally, the last hypothesis shows that firms establish a new relationship between consumers and brands thanks to the proximity that social networks allow. After validating the hypotheses, we offered the Sonelgaz Enterprise several recommendations regarding communication through social networks. First, the company must use the interactivity of social networks to have fruitful exchanges with the community, and it should have a strategy for handling negative comments. The company should also deliver resources to the community through a community manager in order to maintain a good relationship with it. Furthermore, we advised using social networks for business intelligence. The Sonelgaz Enterprise can publish creative and interactive content, for example through engaging applications on Facebook. Finally, we recommended that the company avoid being intrusive with 'fans' or 'followers' and remain open to all platforms, such as Twitter, Facebook, and LinkedIn.

Keywords: social network, buzz, communication, consumer, return on investment, internet users, web 2.0, Facebook, Twitter, interaction

Procedia PDF Downloads 404
1496 Estimating Water Balance at Beterou Watershed, Benin Using Soil and Water Assessment Tool (SWAT) Model

Authors: Ella Sèdé Maforikan

Abstract:

Sustained water management requires quantitative information and knowledge of the spatiotemporal dynamics of the hydrological system within the basin, which can be achieved through research. Several studies have investigated both surface water and groundwater in the Beterou catchment; however, few published papers apply SWAT modeling there. The objective of this study was to evaluate the performance of SWAT in simulating the water balance within the watershed. The input data consist of a digital elevation model, land use maps, a soil map, climatic data, and discharge records. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI-2) approach. Calibration covered 1989 to 2006, with a four-year warm-up period (1985-1988); validation covered 2007 to 2020. The goodness of the model was assessed using five indices: Nash-Sutcliffe efficiency (NSE), the ratio of the root mean square error to the standard deviation of measured data (RSR), percent bias (PBIAS), the coefficient of determination (R²), and Kling-Gupta efficiency (KGE). Results showed that SWAT successfully simulated river flow in the Beterou catchment, with NSE = 0.79, R² = 0.80, and KGE = 0.83 in calibration, versus NSE = 0.78, R² = 0.78, and KGE = 0.85 in validation, using site-based streamflow data. The relative error (PBIAS) ranged from -12.2% to 3.1%. The runoff curve number (CN2), moist bulk density (SOL_BD), baseflow alpha factor (ALPHA_BF), and available water capacity of the soil layer (SOL_AWC) were the most sensitive parameters. The study supports further research with an uncertainty analysis and offers recommendations for model improvement and for more efficient rainfall and discharge measurement.
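
The five goodness-of-fit indices are standard formulas and can be computed directly from observed and simulated discharge series, as in the sketch below (the arrays are hypothetical).

```python
# Sketch: NSE, RSR, PBIAS, and KGE from observed vs. simulated discharge.
# R^2 is the squared Pearson correlation. Example arrays are hypothetical.
import numpy as np

def nse(obs, sim):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rsr(obs, sim):
    return np.sqrt(np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

def pbias(obs, sim):
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def kge(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]          # correlation term
    alpha = sim.std() / obs.std()            # variability ratio
    beta = sim.mean() / obs.mean()           # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([12.0, 30.5, 22.1, 8.4, 15.9])
sim = np.array([11.2, 28.9, 24.0, 9.1, 14.7])
print(f"NSE={nse(obs, sim):.2f}  RSR={rsr(obs, sim):.2f}  "
      f"PBIAS={pbias(obs, sim):.1f}%  KGE={kge(obs, sim):.2f}")
```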

Keywords: watershed, water balance, SWAT modeling, Beterou

Procedia PDF Downloads 44
1495 In Silico Exploration of Quinazoline Derivatives as EGFR Inhibitors for Lung Cancer: A Multi-Modal Approach Integrating QSAR-3D, ADMET, Molecular Docking, and Molecular Dynamics Analyses

Authors: Mohamed Moussaoui

Abstract:

A series of thirty-one potential inhibitors targeting the epidermal growth factor receptor (EGFR) kinase, derived from quinazoline, underwent 3D-QSAR analysis using the CoMFA and CoMSIA methodologies. Training and test sets of quinazoline derivatives were used to construct and validate the QSAR models, respectively, with dataset alignment performed on the lowest-energy conformer of the most active compound. The best-performing CoMFA and CoMSIA models demonstrated impressive determination coefficients, with R² values of 0.981 and 0.978, respectively, and leave-one-out cross-validation determination coefficients (Q²) of 0.645 and 0.729, respectively. Furthermore, external validation on a test set of five compounds yielded predicted determination coefficients (R² test) of 0.929 and 0.909 for CoMFA and CoMSIA, respectively. Building on these promising results, eighteen new compounds were designed and assessed for drug-likeness and ADMET properties through in silico methods. Additionally, molecular docking studies were conducted to elucidate the binding interactions between the selected compounds and the enzyme. Detailed molecular dynamics simulations were performed to analyze the stability, conformational changes, and binding interactions of the quinazoline derivatives with the EGFR kinase, providing deeper insights into the dynamic behavior of the compounds within the active site. This comprehensive analysis enhances the understanding of quinazoline derivatives as potential anti-cancer agents and provides valuable insights for lead optimization in the early stages of drug discovery, particularly for developing highly potent anticancer therapeutics.
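
The leave-one-out Q² reported for such models has a simple definition, sketched below with a PLS regressor on synthetic descriptors standing in for the CoMFA/CoMSIA fields.

```python
# Sketch: leave-one-out Q^2 for a QSAR regression model. Descriptors and
# activities are synthetic stand-ins for CoMFA/CoMSIA field values.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
X = rng.random((26, 50))                                    # training-set descriptors
y = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(26)    # pIC50-like values

model = PLSRegression(n_components=3)
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - y_loo.ravel()) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"Q^2 (LOO) = {q2:.3f}")
```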

Keywords: 3D-QSAR, CoMFA, CoMSIA, ADMET, molecular docking, quinazoline, molecular dynamics, EGFR inhibitors, lung cancer, anticancer

Procedia PDF Downloads 32
1494 Validation of the Arabic Version of the InterSePT Scale for Suicidal Thinking (ISST) among the Arab Population in Qatar

Authors: S. Hammoudeh, S. Ghuloum, A. Abdelhakam, A. AlMujalli, M. Opler, Y. Hani, A. Yehya, S. Mari, R. Elsherbiny, Z. Mahfoud, H. Al-Amin

Abstract:

Introduction: Suicidal ideation and attempts are very common in patients with schizophrenia and still contribute to the high mortality in this population. The InterSePT Scale for Suicidal Thinking (ISST) is a validated tool used to assess suicidal ideation in patients with schizophrenia. This research aims to validate the Arabic version of the ISST among Arabs residing in Qatar. Methods: Patients diagnosed with schizophrenia were recruited from the Department of Psychiatry, Rumailah Hospital, Doha, Qatar. Healthy controls were recruited from primary health care centers in Doha, Qatar. The validation procedures, including professional and expert translation, a pilot survey, and back-translation of the ISST, were implemented. The diagnosis of schizophrenia was confirmed using the validated Arabic version of the Mini International Neuropsychiatric Interview (MINI 6, module K for schizophrenia). The gold standard was module B (suicidality) of the MINI 6, administered by a rater who was blinded to the results of the ISST. Results: Our sample (n = 199) comprised 98 patients diagnosed with schizophrenia (age 36.03 ± 9.88 years; M/F ratio 2:1) and 101 healthy participants (age 35.01 ± 8.23 years; M/F ratio 1:2). Among the patients with schizophrenia, 26.5% were married, 17.3% had a college degree, 28.6% were employed, 9% had attempted suicide once, and 4.4% had more than 4 suicide attempts. Among the control group, 77.2% were married, 57.4% had a college degree, and 99% were employed. The mean score on the ISST was 2.36 ± 3.97 vs. 0.47 ± 1.44 for the schizophrenia and control groups, respectively. The overall Cronbach's alpha was 0.91. Conclusions: This is the first study in the Arab world to validate the ISST in an Arabic-speaking population. The psychometric properties indicate that the Arabic version of the ISST is a valid tool to assess the severity of suicidal ideation in Arabic-speaking patients diagnosed with schizophrenia.
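
The reported internal consistency (Cronbach's alpha) follows a standard formula; the sketch below computes it from an items-by-respondents matrix of invented responses.

```python
# Sketch: Cronbach's alpha for scale reliability. Responses are invented;
# rows are respondents, columns are scale items.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                            # number of items
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.integers(0, 4, (199, 1))                # shared severity signal
responses = latent + rng.integers(0, 2, (199, 12))   # 12 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```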

Keywords: mental health, Qatar, schizophrenia, suicide

Procedia PDF Downloads 544
1493 Secure Content Centric Network

Authors: Syed Umair Aziz, Muhammad Faheem, Sameer Hussain, Faraz Idris

Abstract:

A content-centric network is a network in which data are sent and received on the basis of interests: a data request is routed to the specified node that has the data cached. In this network, security is bound to the content rather than to the host, making it host-independent and secure. Security is applied by taking the content's MAC (message authentication code) and encrypting it with the public key of the receiver. On the receiving end, the message is first verified; after verification, the message is saved and decrypted using the receiver's private key.
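
The sketch below illustrates the scheme as described, with an HMAC over the content and RSA-OAEP encryption of the tag under the receiver's public key; key handling and parameters are illustrative assumptions, not the paper's exact protocol.

```python
# Sketch: MAC the content, encrypt the MAC with the receiver's public key,
# then decrypt and verify on the receiver side. Illustration only; requires
# the 'cryptography' package. Keys here are generated on the fly.
import hashlib
import hmac
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

content = b"named-data chunk: /videos/demo/segment0"
mac_key = b"publisher-mac-key"                       # assumed shared key
tag = hmac.new(mac_key, content, hashlib.sha256).digest()

receiver_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
encrypted_tag = receiver_priv.public_key().encrypt(tag, oaep)

# Receiver: decrypt the MAC, recompute it over the content, and compare.
recovered_tag = receiver_priv.decrypt(encrypted_tag, oaep)
expected_tag = hmac.new(mac_key, content, hashlib.sha256).digest()
print("content verified:", hmac.compare_digest(recovered_tag, expected_tag))
```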

Keywords: content centric network, client-server, host security threats, message authentication code, named data network, network caching, peer-to-peer

Procedia PDF Downloads 632
1492 Effects of Changes in LULC on Hydrological Response in Upper Indus Basin

Authors: Ahmad Ammar, Umar Khan Khattak, Muhammad Majid

Abstract:

Empirically based lumped hydrologic models have an extensive track record in watershed management and flood-related studies. This study focuses on the impact of LULC change over a 10-year period on watershed discharge, using the lumped model HEC-HMS. The Indus above Tarbela region is a source of the main flood events in the middle and lower portions of the Indus because of the rainfall amounts and topographic setting of the region, and its discharge pattern is influenced by the associated LULC. In this study, Landsat TM images were used for the LULC analysis of the watershed, and TRMM satellite daily precipitation data were used as input rainfall. The input variables for model building in HEC-HMS were then calculated from the GIS data collected and pre-processed in HEC-GeoHMS. The SCS curve number (SCS-CN) method was used as the loss model, the SCS unit hydrograph as the transform model, and Muskingum as the routing model. The years 2000 and 2010 were taken for the discharge simulation: HEC-HMS was calibrated for the year 2000 and then validated for 2010. The performance of the model, assessed through the calibration and validation process, gave R² = 0.92 in both calibration and validation, with a relative bias of -9% for 2000 and -14% for 2010. The results show that over these 10 years the impact of LULC change on discharge has been negligible in the study area overall. One reason is that the proportion of built-up area in the watershed, the main causative factor of discharge change, is less than 1% of the total area. Locally, however, the impact of development was significant in the built-up area of Mansehra city. An analysis of the Mansehra city sub-watershed, with an area of about 16 km², more than 13% of which was built up in 2010, showed that a 40% increase in built-up area from 2000 to 2010 raised discharge values by about 33 percent, indicating the impact of LULC change on discharge.
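
The curve number relation at the heart of the loss model makes the LULC link concrete: a more built-up basin has a higher CN and thus more direct runoff for the same storm. A minimal sketch:

```python
# Sketch: SCS curve number runoff, Q = (P - Ia)^2 / (P - Ia + S), with
# Ia = 0.2*S and S = 25400/CN - 254 (depths in mm). CN values illustrative.
def scs_runoff(p_mm, cn):
    """Direct runoff depth (mm) for rainfall p_mm under curve number cn."""
    s = 25400.0 / cn - 254.0        # potential maximum retention, mm
    ia = 0.2 * s                    # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Higher CN (more impervious, built-up surface) -> more runoff per storm
for cn in (70, 85):
    print(f"CN={cn}: {scs_runoff(50.0, cn):.1f} mm runoff from a 50 mm storm")
```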

Keywords: LULC change, HEC-HMS, Indus Above Tarbela, SCS-CN

Procedia PDF Downloads 498
1491 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game

Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin

Abstract:

Despite the frequently criticized disadvantages of traditional paper-and-pencil assessment, it remains the most frequently used method in our schools. Although such assessments measure acceptably, they are not capable of capturing all the aspects and the richness of learning and knowledge. Many assessments used in schools also decontextualize assessment from learning: they focus on learners' standing on a particular topic but not on how student learning changes over time. For these reasons, many scholars argue that simulations and games (S&G) used as assessment tools have significant potential to overcome the problems of traditional methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching; furthermore, they can serve as an instructional tool rather than a method to test students' learning at a particular time point. To investigate the potential of educational games as an assessment and teaching tool, this study presents the implementation and validation of an automated embedded assessment (AEA) that can constantly monitor student learning in the game and assess performance without intervening in learning. The experiment was conducted in an undergraduate engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, three aspects of the AEA were inspected for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment measures the targeted latent constructs. Finally, the scores of the assessment were compared with an external measure (a validated test of student learning on digital circuit design) to evaluate the convergent validity of the assessment. The confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390), and all observed variables loaded significantly on the latent factors. In the second analysis, a multiple regression analysis tested whether the external measure significantly predicts students' performance in the game. The two predictors explained 36.3% of the variance (R² = .36, F(2, 96) = 27.42, p < .001), and students' posttest scores significantly predicted game performance (β = .60, p < .001). These statistical results show that the AEA can distinctly measure three major components of the digital circuit design course. This study is intended to help researchers understand how to design an AEA and showcases an implementation, providing an example methodology for validating this type of assessment.
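
The convergent-validity step has a simple computational core: regress in-game performance on the external measures and inspect R² and the coefficients. The sketch below uses invented data.

```python
# Sketch: convergent validity via multiple regression of game performance on
# an external posttest (plus a pretest covariate). Data are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
posttest = rng.normal(70, 10, 99)                 # external validated test
pretest = rng.normal(60, 10, 99)
game_perf = 0.6 * posttest + 0.1 * pretest + rng.normal(0, 8, 99)

X = np.column_stack([posttest, pretest])
reg = LinearRegression().fit(X, game_perf)
print("R^2 =", round(r2_score(game_perf, reg.predict(X)), 3),
      "| coefficients:", np.round(reg.coef_, 2))
```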

Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design

Procedia PDF Downloads 411
1490 Evaluating Generative Neural Attention Weights-Based Chatbot on Customer Support Twitter Dataset

Authors: Sinarwati Mohamad Suhaili, Naomie Salim, Mohamad Nazim Jambli

Abstract:

Sequence-to-sequence (seq2seq) models augmented with attention mechanisms are playing an increasingly important role in automated customer service. These models, which can recognize complex relationships between input and output sequences, are crucial for optimizing chatbot responses. Central to these mechanisms are the neural attention weights that determine the focus of the model during sequence generation. Despite their widespread use, there remains a gap in the comparative analysis of different attention weighting functions within seq2seq models, particularly in the domain of chatbots using the Customer Support Twitter (CST) dataset. This study addresses this gap by evaluating four distinct attention-scoring functions (dot, multiplicative/general, additive, and an extended multiplicative function with a tanh activation parameter) in neural generative seq2seq models. Utilizing the CST dataset, these models were trained and evaluated over 10 epochs with the AdamW optimizer. Evaluation criteria included validation loss and BLEU scores under both greedy and beam search strategies with a beam size of k = 3. Results indicate that the model with the tanh-augmented multiplicative function significantly outperforms its counterparts, achieving the lowest validation loss (1.136484) and the highest BLEU scores (0.438926 under greedy search, 0.443000 under beam search with k = 3). These results emphasize the crucial influence of the attention-scoring function on the performance of seq2seq models for chatbots. In particular, the model that integrates tanh activation proves to be a promising approach to improving the quality of chatbots in the customer support context.
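
The first three scoring functions have standard forms, sketched below in NumPy for one decoder state against a set of encoder states; the tanh-extended multiplicative variant is written as one plausible reading of the description, not as the paper's exact parameterization.

```python
# Sketch: attention-scoring functions for decoder state s and encoder states H.
# Weight matrices are random; the 'tanh-general' form is an assumption about
# the paper's extended multiplicative function, not its exact definition.
import numpy as np

rng = np.random.default_rng(0)
d, T = 8, 5
s = rng.standard_normal(d)             # current decoder hidden state
H = rng.standard_normal((T, d))        # encoder hidden states
W = rng.standard_normal((d, d))        # general (multiplicative) weights
Wa, Ua = rng.standard_normal((d, d)), rng.standard_normal((d, d))
v = rng.standard_normal(d)             # additive-attention projection vector

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

scores = {
    "dot": H @ s,                                  # Luong dot
    "general": H @ (W @ s),                        # multiplicative/general
    "additive": np.tanh(H @ Ua.T + Wa @ s) @ v,    # Bahdanau additive
    "tanh-general": np.tanh(H @ (W @ s)),          # assumed tanh extension
}
for name, sc in scores.items():
    print(f"{name:12s} weights: {np.round(softmax(sc), 3)}")
```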

Keywords: attention weight, chatbot, encoder-decoder, neural generative attention, score function, sequence-to-sequence

Procedia PDF Downloads 70
1489 Agile Software Effort Estimation Using Regression Techniques

Authors: Mikiyas Adugna

Abstract:

Effort estimation is among the activities carried out in software development processes, and an accurate estimation model contributes to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still studying it to enhance prediction accuracy. For these reasons, we investigated and propose a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has four major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed, with 80% of the dataset for training and 20% for testing. Following the split, the two regression algorithms (Elastic Net and LASSO) are trained in two phases. In the first phase, the algorithms are trained with their default parameters and evaluated on the testing data. In the second phase, grid search (used to tune and select the optimum hyperparameters) and 5-fold cross-validation produce the final trained model, which is then evaluated on the testing set. The experimental work uses the agile story-point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared methods. Of the two, LASSO regression achieved the better predictive performance, with PRED(8%) and PRED(25%) of 100.0, MMRE of 0.0491, MMER of 0.0551, MdMRE of 0.0593, MdMER of 0.063, and MSE of 0.0007. The results imply that the LASSO-trained model is the most acceptable and performs better than the estimation results reported in the literature.
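
The pipeline described above maps directly onto scikit-learn, as in the sketch below; the feature matrix, parameter grids, and MMRE helper are illustrative assumptions rather than the study's setup.

```python
# Sketch: normalize, 80/20 split, default-parameter training, then grid search
# with 5-fold CV, for LASSO and Elastic Net. Story-point data are synthetic.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(0)
X = rng.random((210, 6))                         # project features (invented)
y = X @ np.array([3, 2, 1, 0.5, 0.5, 1]) + 0.1 * rng.standard_normal(210)

X = MinMaxScaler().fit_transform(X)              # normalize the whole dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

def mmre(actual, pred):
    """Mean magnitude of relative error."""
    return np.mean(np.abs(actual - pred) / np.abs(actual))

grids = [("LASSO", Lasso(), {"alpha": [0.001, 0.01, 0.1, 1.0]}),
         ("ElasticNet", ElasticNet(),
          {"alpha": [0.001, 0.01, 0.1], "l1_ratio": [0.2, 0.5, 0.8]})]
for name, estimator, grid in grids:
    default_r2 = estimator.fit(X_tr, y_tr).score(X_te, y_te)     # phase 1
    tuned = GridSearchCV(estimator, grid, cv=5).fit(X_tr, y_tr)  # phase 2
    print(f"{name}: default R^2={default_r2:.3f}, tuned R^2="
          f"{tuned.score(X_te, y_te):.3f}, MMRE={mmre(y_te, tuned.predict(X_te)):.4f}")
```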

Keywords: agile software development, effort estimation, elastic net regression, LASSO

Procedia PDF Downloads 47
1488 Evaluation and Selection of Construction Contractors by Polish Public Clients

Authors: Kozik Renata, Leśniak Agnieszka, Plebankiewicz Edyta

Abstract:

Contracting authorities in the public sector are obligated to apply the principles provided for in Polish law when evaluating and selecting contractors. To analyze the contractor evaluation methods applied in practice by public clients, notices of contract award results for construction works were examined. The analysis shows that the procedures selected increasingly often are open competitive bidding, in which the assessment of contractors' competence is not very precise, and non-competitive bidding, i.e., single-source procurement. The share of procurement procedures in which price is the only criterion is also increasing. A solution to these problems might be the introduction of some form of contractor pre-selection. The article also briefly discusses the verification systems used in EU countries for companies applying for public contracts.

Keywords: certification, contractors selection, open tendering, public investors

Procedia PDF Downloads 273
1487 Improved Soil and Snow Treatment with the Rapid Update Cycle Land-Surface Model for Regional and Global Weather Predictions

Authors: Tatiana G. Smirnova, Stan G. Benjamin

Abstract:

The Rapid Update Cycle (RUC) land surface model (LSM) has been the land-surface component in several generations of operational weather prediction models at the National Centers for Environmental Prediction (NCEP) of the National Oceanic and Atmospheric Administration (NOAA). It was designed for short-range weather prediction with an emphasis on severe weather and was originally kept intentionally simple to avoid uncertainties from poorly known parameters. Nevertheless, the RUC LSM, when coupled with an hourly-assimilating atmospheric model, can produce a realistic evolution of time-varying soil moisture and temperature, as well as of snow cover on the ground surface. This is possible only if the soil/vegetation/snow component of the coupled weather prediction model has sufficient skill to avoid long-term drift. The RUC LSM was first implemented in the operational NCEP Rapid Update Cycle (RUC) weather model in 1998 and later in the Weather Research and Forecasting model (WRF)-based Rapid Refresh (RAP) and High-Resolution Rapid Refresh (HRRR). Being available to the international WRF community, it has also been implemented in operational weather models in Austria, New Zealand, and Switzerland. Based on feedback from US weather service offices and the international WRF community, and on our own validation, the RUC LSM has matured over the years: a sea-ice module was added for surface predictions over Arctic sea ice, the snow model was refined, and albedo, roughness length, and other surface properties are now specified more accurately. At present, the RUC LSM is being tested in the regional application of the Unified Forecast System (UFS); the next-generation UFS-based regional Rapid Refresh FV3 Standalone (RRFS) model will replace the operational RAP and HRRR at NCEP. Over time, the RUC LSM has participated in several international model intercomparison projects that verify skill using observed atmospheric forcing. ESM-SnowMIP, the most recent of these experiments, focused on the verification of snow models for open and forested regions; simulations were performed for ten sites located in different climatic zones of the world, forced with observed atmospheric conditions. While most of the 26 participating models have more sophisticated snow parameterizations than RUC, the RUC LSM ranked highly in simulations of both snow water equivalent and surface temperature. However, the ESM-SnowMIP experiment also revealed some issues in the RUC snow model, which are addressed in this paper. One of them is the treatment of grid cells partially covered with snow. The RUC snow module computes energy and moisture budgets of snow-covered and snow-free areas separately, aggregating the solutions at the end of each time step; such treatment elevates the importance of the model's snow-cover-fraction computation. Improvements to the original simplistic threshold-based approach have been implemented and tested both offline and in the coupled weather model. The changes to the snow cover fraction and other modifications to the RUC soil and snow parameterizations are described in detail in this paper.
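
The aggregation step can be sketched in a few lines: fluxes computed separately for the snow-covered and snow-free portions of a cell are combined with the snow cover fraction as the weight. The fraction curve below is a simple illustrative parameterization, not RUC's actual formulation.

```python
# Sketch: blending separate snow-covered and snow-free surface solutions with
# the snow cover fraction. The fraction curve is illustrative, not RUC's.
import numpy as np

def snow_cover_fraction(swe_mm, swe_full_cover_mm=30.0):
    """Illustrative: fraction grows with snow water equivalent, capped at 1."""
    return np.minimum(1.0, swe_mm / swe_full_cover_mm)

def aggregate_flux(flux_snow, flux_snow_free, swe_mm):
    f = snow_cover_fraction(swe_mm)
    return f * flux_snow + (1.0 - f) * flux_snow_free

# Sensible heat flux (W/m^2) for a cell that is partially snow-covered
print(aggregate_flux(flux_snow=-15.0, flux_snow_free=40.0, swe_mm=12.0))
```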

Keywords: land-surface models, weather prediction, hydrology, boundary-layer processes

Procedia PDF Downloads 77
1486 Determination of Unknown Radionuclides Using High Purity Germanium Detectors

Authors: O. G. Onuk, L. S. Taura, C. M. Eze, S. M. Ngaram

Abstract:

The decay chain of radioactive elements in the laboratory and the natural radioactivity of the human body were investigated using a high-purity germanium (HPGe) detector, and properties of HPGe detectors were also examined. The efficiency and energy resolution of the HPGe detector used in the laboratory were found to be excellent. The detector was calibrated three times so as to cover a wider energy range, and the peak centroid C was found to have a linear relationship with the energies of the known gamma rays. Using the three calibrations, the unknown radionuclide was found to follow the decay chain of thorium-232 (232Th), and it was also found that an average adult has about 2.5 g of potassium-40 (40K) in the body.
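
The linear relation between centroid channel and energy is the basis of the calibration; a minimal sketch with hypothetical channel values:

```python
# Sketch: linear energy calibration E = a*C + b from known gamma lines, then
# an energy lookup for an unknown peak. Channel values are hypothetical.
import numpy as np

channels = np.array([661.0, 1172.0, 1331.0, 2612.0])   # peak centroids (ch)
energies = np.array([661.7, 1173.2, 1332.5, 2614.5])   # known lines (keV)

a, b = np.polyfit(channels, energies, 1)               # least-squares line
unknown_centroid = 910.0
print(f"E = {a:.4f}*C + {b:.2f} keV; unknown peak at "
      f"{a * unknown_centroid + b:.1f} keV")
```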

Keywords: detector, efficiency, energy, radionuclides, resolution

Procedia PDF Downloads 240
1485 An Empirical Dynamic Fuel Cell Model Used for Power System Verification in Aerospace

Authors: Giuliano Raimondo, Jörg Wangemann, Peer Drechsel

Abstract:

In systems development involving fuel cell generators, it is important to have, from an early stage of the project, a dynamic model of the stack's electrical behavior that can be shared between the development parties involved. Such a model allows independent and early design and testing of fuel-cell-related power electronics. This paper presents an empirical fuel cell system model derived from characterization tests on a real system. Moreover, it illustrates how the obtained model is used to build and validate a real-time fuel cell system emulator used for aerospace electrical integration testing.
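
One widely used empirical form for a stack's static electrical behavior is the polarization curve V = E0 - b*log10(i) - R*i - m*exp(n*i), capturing activation, ohmic, and mass-transport losses; the coefficients below are illustrative, not the paper's identified values.

```python
# Sketch: a generic empirical fuel cell polarization model. Coefficients are
# illustrative placeholders, not the characterized values from the paper.
import numpy as np

def cell_voltage(i, e0=0.95, b=0.05, r=0.12, m=2.1e-5, n=8.0):
    """Cell voltage (V) vs. current density i (A/cm^2)."""
    activation = b * np.log10(np.maximum(i, 1e-6))   # activation loss
    ohmic = r * i                                    # ohmic loss
    transport = m * np.exp(n * i)                    # mass-transport loss
    return e0 - activation - ohmic - transport

for i in (0.1, 0.5, 1.0):
    print(f"i = {i:.1f} A/cm^2 -> V = {cell_voltage(i):.3f} V")
```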

Keywords: fuel cell, modelling, real time emulation, testing

Procedia PDF Downloads 327
1484 Analysis of iPSC-Derived Dopaminergic Neuron Susceptibility to Influenza and Excitotoxicity in Non-Affective Psychosis

Authors: Jamileh Ahmed, Helena Hernandez, Gabriel De Erausquin

Abstract:

The H1N1 virus susceptibility of iPSC-derived DA neurons from schizophrenia patients and controls will be compared. C57/BL-6 fibroblasts were reprogrammed into iPSCs using a lentiviral vector containing the SOKM genes. Pluripotency was verified with the AP assay and immunocytochemistry. The experimental outcome of differentiating the iPSCs into DA neurons will be discussed in the Results section. Fibroblasts from patients and controls will be reprogrammed into iPSCs using a Sendai-virus vector containing SOKM, and the iPSCs will be characterized using the AP assay, immunocytochemistry, and RT-PCR. The iPSCs will then be differentiated into DA neurons, and gene methylation will be compared between the two groups with custom-designed microarrays.

Keywords: schizophrenia, iPSCs, stem cells, neuroscience

Procedia PDF Downloads 417
1483 Real Time Adaptive Obstacle Avoidance in Dynamic Environments with Different D-S

Authors: Mohammad Javad Mollakazemi, Farhad Asadi

Abstract:

In this paper, a real-time obstacle-avoidance approach for both autonomous and non-autonomous dynamical systems (DS) is presented. In this approach, the original dynamics of the controller, which allow us to determine a safety margin, can be modulated. Different common types of DS increase the robot's reactiveness in the face of uncertainty in the localization of the obstacle, especially when the robot moves very fast in changing, complex environments. The method is validated by simulation, and the influence of different autonomous and non-autonomous DS, including important characteristics such as limit cycles and unstable DS, is examined. Furthermore, the placement of different obstacles in a complex environment is discussed. Finally, the verification of the avoidance trajectories is described through parameters such as the safety factor.
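
A toy version of such dynamic modulation is sketched below: a nominal attractor DS has its velocity component toward the obstacle attenuated as the robot nears a safety boundary. This is a generic illustration in the spirit of modulation-based avoidance, not the authors' formulation.

```python
# Sketch: modulating a nominal DS so its velocity into an obstacle fades as
# the robot approaches the safety boundary. Generic illustration only.
import numpy as np

x_goal = np.array([3.0, 0.0])
x_obs = np.array([1.5, 0.1])
r_safe = 0.7                                  # obstacle radius + safety margin

def nominal_ds(x):
    return -(x - x_goal)                      # linear attractor to the goal

def modulated_ds(x):
    d = x - x_obs
    dist = np.linalg.norm(d)
    gamma = max(dist / r_safe, 1.0 + 1e-6)    # > 1 outside the boundary
    n = d / dist                              # outward normal of the obstacle
    v = nominal_ds(x)
    v_n = np.dot(v, n)
    if v_n < 0:                               # moving toward the obstacle
        v = v - (1.0 / gamma) * v_n * n       # cancel inward part near boundary
    return v

x = np.array([0.0, 0.05])
for _ in range(400):                          # simple Euler integration
    x = x + 0.01 * modulated_ds(x)
print("final position:", np.round(x, 3))      # near goal, path bent around obstacle
```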

Keywords: limit cycles, nonlinear dynamical system, real time obstacle avoidance, safety margin

Procedia PDF Downloads 432
1482 Wheel Diameter and Width Influence in Variability of Brake Data Measurement at Ministry of Transport Facilities

Authors: Carolina Senabre, Sergio Valero, Emilio Velasco

Abstract:

The brake systems of vehicles are tested periodically with a 'brake tester' at Ministry of Transport (MOT) stations. This tester measures the braking effectiveness of the vehicle, a parameter established by the International Committee of Vehicle Inspection (CITA). In this paper, we investigate the influence of tire size on brake-force measurements across three MOT brake testers, analyzing how varying wheel diameter and width affect the measured braking at MOT stations. The MOT brake tester is thereby evaluated as a verification system for a vehicle.

Keywords: brake tester, ministry of transport facilities, wheel diameter, efficiency

Procedia PDF Downloads 363
1481 Cross Cultural Adaptation and Content Validation of the Assessment Instrument Preschooler Awareness of Stuttering Survey

Authors: Catarina Belchior, Catarina Martins, Sara Mendes, Ana Rita S. Valente, Elsa Marta Soares

Abstract:

Introduction: The negative feelings and attitudes that a person who stutters can develop are extremely relevant to assessment and intervention in speech and language therapy, since a person who stutters can experience shame, fear, and negative beliefs when communicating. Considering the complexity and the importance of integrating diverse aspects in stuttering intervention, it is essential to identify those emotions as early as possible. This research therefore aimed to translate and adapt to European Portuguese the Preschooler Awareness of Stuttering Survey (Abbiati, Guitar & Hutchins, 2015), an instrument that assesses the impact of stuttering on preschool children who stutter in terms of feelings and attitudes, and to analyze its content validation. Methodology: Cross-sectional descriptive qualitative research. The following methodological procedures were followed: translation, back-translation, a panel of experts, and a pilot study; this abstract describes the results of the first three phases. The translation was accomplished by two speech and language therapists (SLTs), both with more than five years of experience and users of the English language, one of them with broad experience in the field of stuttering. Back-translation was conducted by two bilingual individuals without experience in health or any knowledge of the instrument. The panel of experts was composed of three SLTs who are experts in the field of stuttering. Results and Discussion: In the translation and back-translation process, it was possible to identify differences in the semantic and idiomatic equivalences of several concepts and expressions, as well as the need to include new information to enhance the understanding of the application of the instrument. A meeting between the two translators and the researchers produced a consensus version that was used in the back-translation. Regarding adaptation and content validation, the main change made by the experts was the conceptual equivalence of the questions and answers of the instrument's sheets. Because in the translated consensus version the questions began in various ways, with openings such as 'is' or 'the cow', and the answers did not contain the adverb 'much' as in the original instrument, the panel agreed that it would be more appropriate for all questions to start with 'how' and for all answers to include the adverb 'much'. This decision was made to keep the translated instrument similar to the original so that results obtained with the two could be compared. One semantic equivalence between concepts was also elaborated. The panel of experts found all other items and specificities of the instrument adequate, concluding that the instrument is adequate for its objectives and intended target population. Conclusion: This research aspires to diversify the existing validated resources in this scope by adding a new instrument for the assessment of preschool children who stutter. It is hoped that this instrument will provide a real and reliable assessment that can lead to an appropriate therapeutic intervention according to the characteristics and needs of each child.

Keywords: stuttering, assessment, feelings and attitudes, speech language therapy

Procedia PDF Downloads 138
1480 BAN Logic Proof of E-passport Authentication Protocol

Authors: Safa Saoudi, Souheib Yousfi, Riadh Robbana

Abstract:

The e-passport is a relatively new electronic document that maintains the features of the passport while providing better security. It deploys new technologies such as biometrics and radio frequency identification (RFID). The International Civil Aviation Organization (ICAO) and the European Union define mechanisms and protocols to provide security, but their solutions present many threats. In this paper, a new mechanism is presented to strengthen the e-passport security and authentication process. We propose a new protocol based on elliptic curves, identity-based encryption, and a shared secret between entities. Authentication in our contribution is formally proved with the BAN logic verification language. This proposal aims to provide secure data storage and authentication.
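
One building block of such a protocol, establishing a shared secret over an elliptic curve, can be sketched as an ECDH exchange; this illustrates the primitive rather than the paper's protocol, and the curve and key-derivation parameters are assumptions.

```python
# Sketch: elliptic-curve Diffie-Hellman shared secret between two entities,
# then key derivation. Requires the 'cryptography' package; illustrative only.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

passport_key = ec.generate_private_key(ec.SECP256R1())
reader_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the peer's public key
secret_a = passport_key.exchange(ec.ECDH(), reader_key.public_key())
secret_b = reader_key.exchange(ec.ECDH(), passport_key.public_key())
assert secret_a == secret_b                     # both sides share the secret

session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"e-passport session").derive(secret_a)
print("derived session key:", session_key.hex()[:16], "...")
```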

Keywords: e-passport, elliptic curve cryptography, identity based encryption, shared secret, BAN Logic

Procedia PDF Downloads 421
1479 Methodology for the Determination of Triterpenic Compounds in Apple Extracts

Authors: Mindaugas Liaudanskas, Darius Kviklys, Kristina Zymonė, Raimondas Raudonis, Jonas Viškelis, Norbertas Uselis, Pranas Viškelis, Valdimaras Janulis

Abstract:

Apples are among the most commonly consumed fruits in the world; based on data from 2014, approximately 84.63 million tons of apples are grown per annum. Apples are widely used in the food industry to produce various products and drinks (juice, wine, and cider) and are also consumed unprocessed. In the human diet, apples are an important source of different groups of biologically active compounds that can contribute positively to the prevention of various diseases, including vitamins, organic acids, micro- and macroelements, pectins, and phenolic, triterpenic, and other compounds. Triterpenic compounds, which are characterized by versatile biological activity, are among the most promising and most significant of these for human health. A specific analytical procedure, including sample preparation and high-performance liquid chromatography (HPLC) analysis, was developed, optimized, and validated for the detection of triterpenic compounds in samples of whole apples, apple peel, and apple flesh from the widespread cultivars 'Aldas', 'Auksis', 'Connel Red', 'Ligol', 'Lodel', and 'Rajka' grown in Lithuanian climatic conditions. The conditions for triterpenic compound extraction were optimized: the extraction solvent was 100% (v/v) acetone, and the extraction was performed in an ultrasound bath for 10 min. Isocratic elution (88% solvent A, 12% solvent B) was applied for a rapid separation of the triterpenic compounds. The validation of the methodology was performed on the basis of the ICH recommendations, evaluating the following characteristics: selectivity (specificity), precision, the detection and quantitation limits of the analytes, and linearity. The obtained parameter values confirm the suitability of the methodology for the analysis of triterpenic compounds. Using the optimized and validated HPLC technique, four triterpenic compounds were separated and identified with confirmed specificity: corosolic acid, betulinic acid, oleanolic acid, and ursolic acid. Ursolic acid was the dominant compound in all the tested apple samples, while betulinic acid was detected in the lowest amounts of all the identified triterpenic compounds. The greatest amounts of triterpenic compounds were detected in the whole-apple and peel samples of the 'Lodel' cultivar; apples and apple extracts of this cultivar are thus potentially valuable for use in medical practice, for the prevention of various diseases, for adjunct therapy, for the isolation of individual compounds with a specific biological effect, and for the development and production of dietary supplements and functional food enriched in biologically active compounds. Acknowledgements: This work was supported by a grant from the Research Council of Lithuania, project No. MIP-17-8.
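
Two of the ICH validation characteristics, linearity and the detection/quantitation limits, reduce to simple calculations on a calibration series (LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with sigma the residual standard deviation and S the slope); the sketch below uses invented calibration data.

```python
# Sketch: calibration linearity plus ICH-style LOD/LOQ from the residual
# standard deviation and slope. Calibration data are hypothetical.
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])        # standards, ug/mL
peak_area = np.array([51.0, 103.0, 252.0, 499.0, 1003.0])

slope, intercept = np.polyfit(conc, peak_area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((peak_area - pred) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)

sigma = np.std(peak_area - pred, ddof=2)               # residual SD of the fit
lod, loq = 3.3 * sigma / slope, 10.0 * sigma / slope
print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```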

Keywords: apples, HPLC, triterpenic compounds, validation

Procedia PDF Downloads 162
1478 Facial Recognition of University Entrance Exam Candidates using FaceMatch Software in Iran

Authors: Mahshid Arabi

Abstract:

In recent years, remarkable advancements in the fields of artificial intelligence and machine learning have led to the development of facial recognition technologies. These technologies are now employed in a wide range of applications, including security, surveillance, healthcare, and education. In the field of education, the identification of university entrance exam candidates has been one of the fundamental challenges: traditional methods such as ID cards and handwritten signatures are not only inefficient and prone to fraud but also susceptible to errors. In this context, advanced technologies like facial recognition can be an effective and efficient way to increase the accuracy and reliability of identity verification in entrance exams. This article examines the use of FaceMatch software for recognizing the faces of university entrance exam candidates in Iran. The main objective of this research is to evaluate the efficiency and accuracy of FaceMatch software in identifying university entrance exam candidates, in order to prevent fraud and ensure the authenticity of individuals' identities; the research also investigates the advantages and challenges of using this technology in Iran's educational systems. The research was conducted using an experimental method and random sampling: 1000 university entrance exam candidates in Iran were selected as samples, and their facial images were processed and analyzed with FaceMatch software. The software's accuracy and efficiency were evaluated using various metrics, including accuracy rate, error rate, and processing time. The results indicated that FaceMatch could identify candidates with a precision of 98.5%, and its error rate was less than 1.5%, demonstrating high efficiency in facial recognition. Additionally, the average processing time for each candidate's image was less than 2 seconds, indicating the software's speed. Statistical evaluation of the results using precise statistical tests, including analysis of variance (ANOVA) and the t-test, showed that the observed differences were significant and that the software's accuracy in identity verification is high. The findings suggest that FaceMatch software can be used effectively to identify university entrance exam candidates in Iran; the technology not only enhances security and prevents fraud but also simplifies and streamlines the exam administration process. However, challenges such as preserving candidates' privacy and the costs of implementation must also be considered. Given the promising results of this research, it is recommended that this technology be implemented and utilized more widely in the country's educational systems.

Keywords: facial recognition, FaceMatch software, Iran, university entrance exam

Procedia PDF Downloads 30
1477 The Analysis of TRACE/PARCS in the Simulation of Ultimate Response Guideline for Lungmen ABWR

Authors: J. R. Wang, W. Y. Li, H. T. Lin, B. H. Lee, C. Shih, S. W. Chen

Abstract:

In this research, a TRACE/PARCS model of the Lungmen ABWR has been developed to verify the efficiency of the ultimate response guideline (URG). This ultimate measure was named the DIVing plan, an abbreviation of system Depressurization, water Injection, and containment Venting. The simulation initial condition is 100% rated power and 100% rated core flow. This research first uses TRACE/PARCS to estimate the time at which the fuel might be damaged if no water is injected. The effects of the reactor core isolation cooling system (RCIC), controlled depressurization, and the AC-independent water addition system (ACIWA), which can provide injection at 950 gpm, are then estimated for the station blackout (SBO) transient.

Keywords: ABWR, TRACE, safety analysis, PARCS

Procedia PDF Downloads 447