Search results for: evaluation accuracy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9832

9352 Fast and Accurate Finite-Difference Method Solving Multicomponent Smoluchowski Coagulation Equation

Authors: Alexander P. Smirnov, Sergey A. Matveev, Dmitry A. Zheltkov, Eugene E. Tyrtyshnikov

Abstract:

We propose a new computational technique for the multidimensional (multicomponent) Smoluchowski coagulation equation. Using low-rank approximations in Tensor Train format of both the solution and the coagulation kernel, we accelerate the classical finite-difference Runge-Kutta scheme while keeping its level of accuracy. The complexity of the finite-difference scheme is reduced from O(N^{2d}) to O(d^2 N log N), where N is the number of grid nodes and d is the dimensionality of the problem. The efficiency and accuracy of the new method are demonstrated on a concrete problem with a known analytical solution.
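
For readers unfamiliar with the equation, the multicomponent (d-component) Smoluchowski coagulation equation takes the following standard continuous form (a textbook statement, not quoted from the paper); note that the gain term is a d-dimensional convolution, which is what an FFT-accelerated, TT-compressed scheme can exploit:

```latex
\frac{\partial n(\mathbf{u},t)}{\partial t}
  = \frac{1}{2}\int_{\mathbf{0}}^{\mathbf{u}}
      K(\mathbf{u}-\mathbf{v},\mathbf{v})\,
      n(\mathbf{u}-\mathbf{v},t)\,n(\mathbf{v},t)\,d\mathbf{v}
  - n(\mathbf{u},t)\int_{\mathbf{0}}^{\boldsymbol{\infty}}
      K(\mathbf{u},\mathbf{v})\,n(\mathbf{v},t)\,d\mathbf{v},
\qquad \mathbf{u}\in\mathbb{R}_{+}^{d}
```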

Keywords: tensor train decomposition, multicomponent Smoluchowski equation, Runge-Kutta scheme, convolution

Procedia PDF Downloads 436
9351 User Satisfaction Survey Based Facility Performance Evaluation

Authors: Gopikrishnan Seshadhri, V. M. Topkar

Abstract:

Post-occupancy facility management is a facet that has gained tremendous ground in recent times. While the efficiency of expenditure and the utilization of all types of resources are monitored during the construction phase to ensure timely completion with minimum cost and acceptable quality, value for money is realized only when the facility performs satisfactorily post occupation, meeting the aspirations and expectations of its users. This is even more so for public facilities. Owing to the paradigm shift toward outcome-based performance evaluation, user satisfaction, obtained mainly through questionnaires, has become the single most important criterion in performance evaluation. Since the questionnaires presently used to gauge user satisfaction are subjective, the feedback obtained does not necessarily reflect actual performance. Hence, there is a need to develop a survey instrument that gauges user satisfaction as objectively as possible and truly reflects the ground reality. A near-correct picture of the actual performance of the built facility from the user's point of view will enable facility managers to address pertinent issues. This paper brings out the need for an effective survey instrument that will elicit more objective user responses, and it lists the steps involved in formulating such an instrument.

Keywords: facility performance evaluation, attributes, attribute descriptors, user satisfaction surveys, statistical methods, performance indicators

Procedia PDF Downloads 296
9350 A Model Based Metaheuristic for Hybrid Hierarchical Community Structure in Social Networks

Authors: Radhia Toujani, Jalel Akaichi

Abstract:

In recent years, the study of community detection in social networks has received great attention. The hierarchical structure of a network often causes detection algorithms to converge to a locally optimal community structure. In this paper, we aim to avoid this local optimum with the introduced hybrid hierarchical method. To achieve this purpose, we present an objective function that incorporates a modularity measure based on structural and semantic similarity, and we apply a metaheuristic, namely the bee colony algorithm, to optimize this objective function at both the divisive and the agglomerative hierarchical levels. To assess the efficiency and accuracy of the introduced hybrid bee colony model, we perform an extensive experimental evaluation on both synthetic and real networks.
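
As background for the objective function, here is a minimal pure-Python sketch of the classical structural-only Newman modularity that such structural-plus-semantic measures extend; the graph encoding and function name are ours, and the semantic term and the bee colony search are deliberately omitted:

```python
def modularity(adj, communities):
    """Newman modularity Q of a partition of an undirected graph.

    adj: dict mapping node -> set of neighbour nodes
    communities: dict mapping node -> community label
    """
    m2 = sum(len(nbrs) for nbrs in adj.values())  # 2m: total degree
    q = 0.0
    for i in adj:
        for j in adj:
            if communities[i] != communities[j]:
                continue
            a_ij = 1.0 if j in adj[i] else 0.0
            # observed edge minus the null-model expectation k_i * k_j / 2m
            q += a_ij - len(adj[i]) * len(adj[j]) / m2
    return q / m2

# Two triangles joined by one bridge edge; the natural split scores 5/14.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
part = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
```

A metaheuristic such as the bee colony algorithm then searches the space of partitions for one that maximizes this kind of score.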

Keywords: social network, community detection, agglomerative hierarchical clustering, divisive hierarchical clustering, similarity, modularity, metaheuristic, bee colony

Procedia PDF Downloads 385
9349 The System for Root Canal Length Measurement Based on Multifrequency Impedance Method

Authors: Zheng Zhang, Xin Chen, Guoqing Ding

Abstract:

Electronic apex locators (EALs) have been widely used clinically to measure root canal working length with high accuracy, which is crucial for successful endodontic treatment. To maintain high accuracy across different measurement environments, this study presents a system for root canal length measurement based on a multifrequency impedance method. The system generates a sweep current with frequencies from 100 Hz to 1 MHz through a direct digital synthesizer. Multiple impedance ratios with different combinations of frequencies are acquired through an analog-to-digital converter, and several representative ratios are selected after data processing. By measuring a large number of teeth, the system statistically models the functional relationship between these impedance ratios and the distance between the file and the apex. The position of the apical foramen can then be determined from the statistical model using these impedance ratios. The experimental results revealed that the system based on the multifrequency impedance ratio method determines the position of the apical foramen more accurately than the dual-frequency impedance ratio method, and that its performance is more stable in complex measurement environments.
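
To illustrate why an impedance ratio at two frequencies carries distance information at all, here is a hedged sketch using a parallel-RC electrode model; the circuit, component values, and frequencies below are illustrative assumptions, not the hardware described in the abstract:

```python
import math

def impedance_ratio(r_ohm, c_farad, f_low, f_high):
    """|Z(f_high)| / |Z(f_low)| for a parallel-RC electrode model.

    The parallel-RC model and all component values are illustrative
    assumptions, not the measurement circuit used in the study.
    """
    def z_mag(f):
        omega = 2.0 * math.pi * f
        return r_ohm / math.sqrt(1.0 + (omega * r_ohm * c_farad) ** 2)

    return z_mag(f_high) / z_mag(f_low)

# As capacitance grows (e.g. the file approaching the apical foramen),
# the high/low-frequency magnitude ratio drops monotonically.
ratios = [impedance_ratio(10e3, c, 1e3, 100e3) for c in (1e-9, 10e-9, 100e-9)]
```

Because the ratio varies monotonically with the model capacitance, it can serve as the regressor in the statistical distance model.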

Keywords: root canal length, apex locator, multifrequency impedance, sweep frequency

Procedia PDF Downloads 159
9348 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and can lead to system compromise, data leakage, or denial of service. C and C++ open-source code is now available for building a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove pointless components and shorten dependencies. Moreover, we keep the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning models such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we propose a neural network model that can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure performance. We made a comparative analysis between results derived from features containing a minimal text representation and those containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when semantic and syntactic information is used as the feature set, but they require longer execution time, as the word embedding algorithms add complexity to the overall system.
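
The evaluation metrics named above are standard; a small self-contained sketch (with invented toy labels) shows how precision, recall, F1, and accuracy are computed for a binary vulnerable/benign classifier:

```python
def classification_metrics(y_true, y_pred):
    """Precision, recall, F1, and accuracy for binary labels (1 = vulnerable)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    return precision, recall, f1, accuracy

# Toy labels: two true positives, one false positive, one false negative.
p, r, f1, acc = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```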

Keywords: cyber security, vulnerability detection, neural networks, feature extraction

Procedia PDF Downloads 94
9347 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluating goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction which penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive CV variant, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV; they utilise existing MCMC results and avoid expensive re-fitting. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior predictive densities. In IS-LOO, the raw weights are used directly; in contrast, the larger weights are replaced by their modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest.
However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modelled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of the exact LOO-CV, the study observed some drastic deviations in their results. However, there are some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for models with equal posterior variances in their lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, together with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
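
As a concrete illustration of the importance-sampling approximation described above, here is a minimal pure-Python sketch of IS-LOO computed from pointwise log-likelihoods over posterior draws; the function name and data layout are our own, and the weight truncation/smoothing of TIS and PSIS is deliberately omitted:

```python
import math

def is_loo(log_lik):
    """Importance-sampling LOO estimate of elpd from pointwise log-likelihoods.

    log_lik[s][i] = log p(y_i | theta_s) over S posterior draws.  The raw
    importance weight of draw s for observation i is 1 / p(y_i | theta_s),
    so the IS-LOO predictive density for observation i reduces to the
    harmonic mean of the pointwise likelihoods.  (TIS/PSIS would first
    modify the largest weights; that step is omitted here.)
    """
    n_draws, n_obs = len(log_lik), len(log_lik[0])
    elpd = 0.0
    for i in range(n_obs):
        # log harmonic mean = log S - logsumexp(-log_lik[:, i])
        neg = [-log_lik[s][i] for s in range(n_draws)]
        m = max(neg)
        lse = m + math.log(sum(math.exp(v - m) for v in neg))
        elpd += math.log(n_draws) - lse
    return elpd
```

When the posterior draws are identical, the estimate collapses to the pointwise log-likelihood itself, which is a quick sanity check on the implementation.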

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 395
9346 Learning at Workplace: Competences and Contexts in Sensory Evaluation

Authors: Ulriikka Savela-Huovinen, Hanni Muukkonen, Auli Toom

Abstract:

The development of the workplace as a learning environment has been emphasized in research on workplace learning. The prior literature on sensory performance has emphasized the individual's competences as an assessor, while the competences involved in collaborative interactional and knowledge-creation practices as workplace learning methods are rarely mentioned. The present study aims to find out what kinds of competences and contexts are central when an assessor conducts food sensory evaluation in an authentic professional context. The aim was to answer the following questions: first, what kinds of competences does sensory evaluation require according to assessors? And second, what kinds of contexts for sensory evaluation do assessors report? Altogether, thirteen assessors from three Finnish food companies were interviewed using semi-structured thematic interviews to map practices and development intentions as well as to explicate already established practices. The qualitative data were analyzed following the principles of abductive and inductive content analysis. The analysis phases were combined and their results considered together as a cross-analysis. For independent evaluation, the required competences were perception, knowledge of specific domains and methods, and cognitive skills, e.g., memory. Altogether, 42% of the analysis units described individual evaluation contexts, 53% described collaborative interactional contexts, and 5% described collaborative knowledge-creation contexts. Related to collaboration, the analysis revealed learning through sharing and reviewing both external and in-house consumer feedback, developing methods to moderate small-panel evaluation, and developing product vocabulary collectively among the assessors. Knowledge-creation contexts emerged from daily practices, especially in cases where product defects were sought and discussed.
The study findings contribute to the explanation that sensory assessors learn extensively from one another in collaborative interactional and knowledge-creation contexts. Assessors' learning and their ability to work collaboratively in these contexts need to be ensured in the development of their expertise.

Keywords: assessor, collaboration, competences, contexts, learning and practices, sensory evaluation

Procedia PDF Downloads 240
9345 MB-SLAM: A SLAM Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To use SLAM effectively for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM to BIM can provide essential insights for construction managers to identify construction deficiencies in real time and ultimately reduce rework. Registration can also boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with the BIM in real time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real time. The framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM calculates a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views from the keyframes' views. The calculated poses are later refined by a real-time gradient descent-based iterative method. Two case studies were presented to validate MB-SLAM. The validation process demonstrated promising results: SLAM was accurately registered to BIM, and the SLAM localization accuracy improved significantly.
Moreover, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate the processes of past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework, for both research and commercial use, that monitors construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of SLAM by providing a rough 3D model of the environment, which further extends SLAM toward practical use.
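
The gradient descent-based refinement can be illustrated with a deliberately tiny 1-D analogue: recover the translation that best aligns model (BIM) feature coordinates with observed ones. Everything here (names, data, learning rate) is a toy assumption, not the authors' 6-DOF implementation:

```python
def refine_offset(observed, model, lr=0.1, steps=200):
    """Recover the 1-D offset aligning model (BIM) features with observations.

    Gradient descent on the mean squared alignment error; a toy analogue of
    the pose-refinement iteration, not the authors' full pose estimation.
    """
    offset = 0.0
    n = len(observed)
    for _ in range(steps):
        # d/d_offset of sum((model + offset - observed)^2) = 2 * sum(residual)
        grad = 2.0 * sum(m + offset - o for m, o in zip(model, observed))
        offset -= lr * grad / n
    return offset

# Observed features are the model features shifted by 0.75 units.
est = refine_offset(observed=[1.75, 2.75, 3.75], model=[1.0, 2.0, 3.0])
```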

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 232
9344 Floor Response Spectra of RC Frames: Influence of the Infills on the Seismic Demand on Non-Structural Components

Authors: Gianni Blasi, Daniele Perrone, Maria Antonietta Aiello

Abstract:

The seismic vulnerability of non-structural components is nowadays recognized as a key issue in performance-based earthquake engineering. Recent loss estimation studies, as well as the damage observed during past earthquakes, have shown that non-structural damage represents the highest share of economic loss in a building and can in many cases be crucial to life safety during the post-earthquake emergency. The procedures developed to evaluate the seismic demand on non-structural components have been constantly improved, and recent studies have demonstrated that the existing formulations provided by the main standards generally ignore features that have a significant influence on the seismic accelerations/displacements to which non-structural components are subjected. Since the influence of infills on the dynamic behaviour of RC structures has already been evidenced by many authors, the evaluation of the seismic demand on non-structural components should consider the presence of the infills as well as their mechanical properties. This study focuses on the evaluation of time-history floor accelerations in RC buildings, a useful means of performing seismic vulnerability analyses of non-structural components through the well-known cascade method. Dynamic analyses are performed on an 8-storey RC frame, taking into account the presence of the infills; the influence of the elastic modulus of the panels on the results is investigated, as is the presence of openings. The floor accelerations obtained from the analyses are used to evaluate floor response spectra, in order to define the demand on non-structural components depending on the properties of the infills. Finally, the results are compared with formulations provided by the main international standards, in order to assess their accuracy and eventually define the improvements required according to the results of the present work.
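
As a back-of-the-envelope companion to floor response spectra, the textbook steady-state amplification of a damped SDOF component under harmonic floor motion can be sketched as follows; this closed-form transmissibility is a stand-in for the paper's time-history analyses, not a reproduction of them:

```python
import math

def transmissibility(r, zeta):
    """Peak component acceleration / peak floor acceleration, harmonic input.

    r = forcing frequency over component natural frequency; zeta = damping
    ratio.  Textbook steady-state formula for a damped SDOF oscillator,
    used here only to illustrate the shape of a floor spectrum.
    """
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

# Sweeping r traces the familiar spectral shape: a sharp peak near
# resonance (r = 1) and decay for very stiff or very flexible components.
spectrum = [(k / 10.0, transmissibility(k / 10.0, 0.05)) for k in range(1, 31)]
```

At 5% damping the peak near resonance approaches 1/(2*zeta) = 10, which is why components tuned to a dominant floor frequency are so heavily penalized.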

Keywords: floor spectra, infilled RC frames, non-structural components, seismic demand

Procedia PDF Downloads 329
9343 Quality of Life Measurements: Evaluation of Intervention Program of Persons with Addiction

Authors: Julie Wittmannová, Petr Šeda

Abstract:

Quality of life (QOL) measurements help to evaluate intervention programs for different groups of persons with special needs. Our presentation deals with the QOL of persons with addiction in relation to physical activity (PA), type of addiction, age, gender, and other variables. The aim of the presentation is to summarize the basic findings and offer thoughts on the questions raised. Methods: SQUALA (Subjective Quality of Life Analysis); SEIQoL (Schedule for the Evaluation of Individual Quality of Life); a questionnaire of our own construction. The results are evaluated by the Mann-Whitney U test and the Kruskal-Wallis ANOVA test (p ≤ 0.05). The sample comprised 64 participants, clients of an aftercare center, aged 18 plus. Findings: The application of the SQUALA and SEIQoL methods in the chosen population seems appropriate; the information obtained regarding QOL relates to the intervention program topics, and the need for an active lifestyle and health-related topics among persons with addiction is visible. Conclusions or Implications: The subjective evaluation of the quality of life of aftercare clients is an important part of the evaluation process, especially when used to evaluate satisfaction with the offered services and programs. The SQUALA and SEIQoL techniques gave us the desired outcomes.
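
For reference, the Mann-Whitney U statistic used here can be computed with a direct pairwise count; this pure-Python sketch with toy data is ours, not the authors' analysis code, and the p-value lookup against the null distribution is omitted:

```python
def mann_whitney_u(sample_a, sample_b):
    """U statistics for both samples via the direct pairwise count.

    Each pair (a, b) with a > b adds 1 to U_a; ties add one half.
    """
    u_a = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u_a += 1.0
            elif a == b:
                u_a += 0.5
    return u_a, len(sample_a) * len(sample_b) - u_a

# Hypothetical satisfaction scores for two small groups of clients.
u1, u2 = mann_whitney_u([3, 4, 2, 6], [1, 2, 5])
```

The two statistics always sum to the number of pairs, which is a handy check when computing by hand.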

Keywords: adapted physical activity, addiction, quality of life, physical activity, aftercare

Procedia PDF Downloads 335
9342 Questionnaire for the Evaluation of Entrepreneurship Project Psychopedagogical Practices: Construction Proceedings and Validation

Authors: Cristina Costa-Lobo, Sandra Fernandes, Miguel Magalhães, José Dinis-Carvalho, Alfredo Regueiro, Ana Carvalho

Abstract:

This paper reports the findings of the construction and validation of a questionnaire applied in a Portuguese higher education context with undergraduate students. The Questionnaire for the Evaluation of Entrepreneurship Project Psychopedagogical Practices consists of six scales: Critical Appraisal of the Project, Developed Learning and Skills, Teamwork, Teacher and Tutor Roles, Evaluation of Student Performance, and Project Effectiveness as a Teaching-Learning Methodology. The proceedings of its construction are analyzed, and the validity and internal consistency analyses are described. Findings indicate good indicators of validity, good fidelity, and an interpretable factorial structure.
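
Internal consistency of scales like these is typically estimated with Cronbach's alpha; here is a minimal sketch with invented toy scores (the paper does not state which coefficient it used, so treat the choice as an assumption):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for one scale.

    items: list of per-item score lists, all over the same respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    computed with population variances.  Illustrative only.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Two perfectly parallel items -> alpha = 1; one constant item -> alpha = 0.
alpha_hi = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
alpha_lo = cronbach_alpha([[1, 2, 3], [2, 2, 2]])
```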

Keywords: entrepreneurship project, higher education, psychopedagogical practices, teacher and tutor roles

Procedia PDF Downloads 385
9341 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept, the closed-loop design model, is presented. The closed-loop design model is developed by integrating forward design and reverse design. Based on this concept, a closed-loop design model for sustainable manufacturing is developed through the integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process. In the design stage of a product, a given product requirement and objective can be met by different detailed component designs and specifications; thus, different design cases can achieve the same requirement and objective, and in the design evaluation stage these cases must be analyzed and evaluated. The purpose of this research is to develop a model for evaluating the design cases through integrated evaluation of the forward design, reverse design, and green manufacturing models. A fuzzy analytic network process model is presented for the integrated evaluation of the criteria in the three models. The comparison matrices for evaluating the criteria in the three groups are established, and the total relational values among the three groups represent the total relational effects. In application, a supermatrix can be created, and the total relational values can be used to evaluate the design cases and select the final design case. An example product is demonstrated in this presentation, showing that the model is useful for the integrated evaluation of forward design, reverse design, and green manufacturing to achieve the objective of closed-loop design for sustainable manufacturing.
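
The supermatrix step can be sketched in miniature: in an analytic network process, a column-stochastic supermatrix is raised to a high power until its columns converge to the limit priorities. The 2x2 matrix below is a toy example, and the arithmetic is crisp rather than fuzzy:

```python
def limit_supermatrix(w, iterations=200):
    """Raise a column-stochastic supermatrix to a high power.

    In ANP, the columns of the limit matrix hold the final priorities.
    Crisp (non-fuzzy) arithmetic on a toy matrix; illustrative only.
    """
    n = len(w)

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    result = w
    for _ in range(iterations):
        result = matmul(result, w)
    return result

# Toy supermatrix; every column of the limit converges to (1/3, 2/3).
w = [[0.50, 0.25],
     [0.50, 0.75]]
limit = limit_supermatrix(w)
```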

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 680
9340 Facial Emotion Recognition Using Deep Learning

Authors: Ashutosh Mishra, Nikhil Goyal

Abstract:

A 3D facial emotion recognition model based on deep learning is proposed in this paper. Two convolution layers and a pooling layer are employed in the deep learning architecture, with pooling performed after the convolution process. The probabilities for the various classes of human faces are calculated using the sigmoid activation function. The Kaggle facial expression dataset is used to verify the accuracy of the deep learning-based face recognition model. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite the significant gains in representation precision afforded by the nonlinearity of deep image representations.
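
The convolution-pooling-sigmoid pipeline described above can be sketched in pure Python; the layer sizes and data below are illustrative, not the paper's architecture:

```python
import math

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution (cross-correlation, as in most DL layers)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h, out_w = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)] for i in range(out_h)]

def max_pool(feature, size=2):
    """Non-overlapping max pooling over size x size windows."""
    return [[max(feature[i + a][j + b]
                 for a in range(size) for b in range(size))
             for j in range(0, len(feature[0]) - size + 1, size)]
            for i in range(0, len(feature) - size + 1, size)]

def sigmoid(x):
    """Squash a logit into a (0, 1) class probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Toy 3x3 'image' through one conv layer, then pooling, then sigmoid.
feature = conv2d_valid([[1, 2, 3], [4, 5, 6], [7, 8, 9]], [[1, 0], [0, 1]])
prob = sigmoid(max_pool(feature)[0][0])
```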

Keywords: facial recognition, computational intelligence, convolutional neural network, depth map

Procedia PDF Downloads 235
9339 Morphological Evaluation of Mesenchymal Stem Cells Derived from Adipose Tissue of Dog Treated with Different Concentrations of Nano-Hydroxy Apatite

Authors: K. Barbaro, F. Di Egidio, A. Amaddeo, G. Lupoli, S. Eramo, G. Barraco, D. Amaddeo, C. Gallottini

Abstract:

In this study, we wanted to evaluate the effects of nano-hydroxyapatite (NHA) on mesenchymal stem cells extracted from the subcutaneous adipose tissue of the dog. The stem cells were divided into six experimental groups with different concentrations of NHA, and the comparison was made with a control group of stem cells grown in standard conditions without NHA. After one week, the cells were fixed with 10% buffered formalin for one hour at room temperature, stained with Giemsa, and examined under an inverted optical microscope. The morphological evaluation of the control and treated samples showed that the stem cells adhere to the substrate and proliferate in the presence of nano-hydroxyapatite at the different concentrations, showing no detectable toxic effects.

Keywords: nano-hydroxyapatite, adipose mesenchymal stem cells, dog, morphological evaluation

Procedia PDF Downloads 478
9338 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network models, were developed. The data were divided into training and testing sets: the training sets were used to build the predictive models, and the testing sets were used to assess prediction accuracy. Key risk factors were identified, and the models were compared to arrive at the best prediction model. Among these, the random forest model performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models shared the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately enables early treatment of these patients.
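
The "at least 4 of 5 models agree" figure can be reproduced mechanically; here is a sketch with hypothetical model outputs (the helper name and data are ours, not the study's):

```python
from collections import Counter

def agreement_rate(predictions, threshold=4):
    """Share of test inputs where >= threshold models return the same label.

    predictions: one list of labels per model, aligned over test inputs.
    Mirrors the 4-of-5 agreement figure; the data below are hypothetical.
    """
    n_inputs = len(predictions[0])
    agree = 0
    for i in range(n_inputs):
        votes = Counter(model[i] for model in predictions)
        if votes.most_common(1)[0][1] >= threshold:
            agree += 1
    return agree / n_inputs

# Five hypothetical models over four test inputs (1 = dementia, 0 = healthy).
preds = [
    [1, 0, 1, 0],
    [1, 0, 1, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
]
rate = agreement_rate(preds)
```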

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 146
9337 On Phase Based Stereo Matching and Its Related Issues

Authors: András Rövid, Takeshi Hashimoto

Abstract:

The paper focuses on the problem of point correspondence matching in stereo images. The proposed matching algorithm combines simpler methods, such as the normalized sum of squared differences (NSSD), with a more complex phase correlation-based approach, while also considering noise and other factors. The speed of NSSD and the precision of phase correlation together yield an efficient approach for finding the best candidate point with sub-pixel accuracy in stereo image pairs. The task of NSSD in this case is to locate the candidate pixel roughly; afterwards, the location of the candidate is refined by an enhanced phase correlation-based method which, in contrast to NSSD, has to run only once for each selected pixel.
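
One common way to normalise the SSD (the abstract does not spell out its exact variant, so this is an assumption) is to mean-centre each patch and scale it to unit norm, making the score invariant to brightness and contrast changes:

```python
import math

def nssd(patch_a, patch_b):
    """Normalised SSD between two equal-size patches (flattened to lists).

    Each patch is mean-centred and scaled to unit norm first; 0 means a
    perfect match, and larger values mean worse matches.
    """
    def normalise(p):
        mean = sum(p) / len(p)
        centred = [x - mean for x in p]
        norm = math.sqrt(sum(x * x for x in centred)) or 1.0
        return [x / norm for x in centred]

    a, b = normalise(patch_a), normalise(patch_b)
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Identical patches up to brightness or contrast changes score zero.
shift = nssd([10, 20, 30, 40], [110, 120, 130, 140])  # brightness offset
gain = nssd([10, 20, 30, 40], [20, 40, 60, 80])       # contrast scaling
```

For unit-norm patches this score equals 2 - 2*NCC, so minimising it agrees with maximising the normalised cross-correlation.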

Keywords: stereo matching, sub-pixel accuracy, phase correlation, SVD, NSSD

Procedia PDF Downloads 471
9336 A Unified Fitting Method for the Set of Unified Constitutive Equations for Modelling Microstructure Evolution in Hot Deformation

Authors: Chi Zhang, Jun Jiang

Abstract:

Constitutive equations are very important in finite element (FE) modelling, and the accuracy of the material constants in the equations has significant effects on the accuracy of the FE models. A wide range of constitutive equations is available; however, fitting the material constants can be complex and time-consuming due to the strong non-linearity of, and the relationships between, the constants. This work focuses on the development of a set of unified MATLAB programs for fitting the material constants in the constitutive equations efficiently. Users only need to supply experimental data in the required format and run the program; they can then fit the material constants efficiently without modifying functions, precisely guessing initial values, or looking up the parameters in previous works.
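
As a toy illustration of constant fitting, a single Arrhenius-like term y = A*exp(B*x) can be fitted by linearising with logarithms; real unified constitutive equations are strongly non-linear and need the full optimisation the MATLAB programs provide, so treat this only as a sketch of the idea:

```python
import math

def fit_exponential(xs, ys):
    """Fit y = A * exp(B * x) by linear least squares on log(y).

    A stand-in for fitting one Arrhenius-like term; function name and
    data are ours, not the toolbox described in the abstract.
    """
    n = len(xs)
    ls = [math.log(y) for y in ys]
    mean_x = sum(xs) / n
    mean_l = sum(ls) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxl = sum((x - mean_x) * (l - mean_l) for x, l in zip(xs, ls))
    b = sxl / sxx
    a = math.exp(mean_l - b * mean_x)
    return a, b

# Noise-free synthetic data generated with A = 2, B = 1.
a, b = fit_exponential([0.0, 1.0, 2.0], [2.0, 2.0 * math.e, 2.0 * math.e ** 2])
```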

Keywords: constitutive equations, FE modelling, MATLAB program, non-linear curve fitting

Procedia PDF Downloads 101
9335 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system, reflecting the system response to the combination of hydrogeological structure, groundwater injection, and extraction. This study applies analytical tools to the observation database to develop a methodology for identifying confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater recession curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis uses the Fourier transform to process the time series of groundwater-level observations and analyzes the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater recession curve, the average rate of groundwater recession, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the abovementioned analytical tools and optimizes the best estimate of the hydrogeological structure.
The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is a useful tool for identifying hydrogeological structures.
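
The cross-correlation step can be sketched as follows: slide the rainfall series against the groundwater-level series and keep the lag with the highest covariance. The names and daily data below are synthetic, not from the observation network:

```python
def best_lag(rainfall, gw_level, max_lag):
    """Lag (in samples) at which rainfall best co-varies with groundwater level.

    A minimal version of the replenishment-time analysis: plain covariance
    at each candidate lag over aligned, equal-length series.
    """
    def cov_at(lag):
        pairs = [(rainfall[t], gw_level[t + lag])
                 for t in range(len(rainfall) - lag)]
        mr = sum(r for r, _ in pairs) / len(pairs)
        mg = sum(g for _, g in pairs) / len(pairs)
        return sum((r - mr) * (g - mg) for r, g in pairs) / len(pairs)

    return max(range(max_lag + 1), key=cov_at)

# Synthetic example: the water table echoes each rain pulse two days later.
rain = [0, 5, 0, 0, 0, 4, 0, 0, 0, 0]
level = [10, 10, 10, 12, 10, 10, 10, 12, 10, 10]
lag = best_lag(rain, level, max_lag=4)
```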

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 160
9334 Evaluation of Competency Training Effectiveness in Chosen Sales Departments

Authors: L. Pigon, S. Kot, J. K. Grabara

Abstract:

Nowadays, with organizations facing the challenges of increasing competitiveness, the human capital accumulated by an organization is one of the elements that strongly differentiates companies. Competing efficiently requires managing employees' competencies so that they remain suited to market fluctuations. The aim of the paper was to determine how employee training intended to improve competencies is verified. The survey was conducted among 37 respondents involved in the selection of training providers and training programs in their enterprises. The results showed that all organizations use a post-training survey as the basic method for evaluating training effectiveness. Depending on the training content and the organization, the questionnaires contain various questions. Most of these surveys are composed of three basic blocks: the trainer's assessment, the evaluation of the training content, and the assessment of the materials and the venue. None of the organizations conducted regular job-related observations or examined the attitudes of the training participants.

Keywords: human capital, competencies, training effectiveness, sale department

Procedia PDF Downloads 181
9333 Impact of Digitized Monitoring & Evaluation System in Technical Vocational Education and Training

Authors: Abdul Ghani Rajput

Abstract:

The monitoring and evaluation (M&E) concept is adopted by Technical Vocational Education and Training (TVET) organizations to track progress over continuous intervals of time against planned interventions and, subsequently, to evaluate impact, quality assurance, and sustainability. In a digital world, TVET providers prefer real-time information for monitoring training activities. The benefits and challenges of a digitized, real-time M&E information system have not been sufficiently examined to date. This research paper looks at the impact of digitized M&E in the TVET sector by analyzing two case studies and describes the benefits and challenges of using a digitized M&E system. Finally, digitized M&E systems are identified as carrying high potential for the TVET sector.

Keywords: digitized M&E, innovation, quality assurance, TVET

Procedia PDF Downloads 235
9332 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining

Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser

Abstract:

Coronary artery disease (CAD) is a major cause of disability in adults and a main cause of death in developed countries. In this study, data mining techniques, including decision trees, artificial neural networks (ANNs), and support vector machines (SVMs), were used to analyze CAD data. Data from 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input or predictor variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for distinguishing CAD patients from non-CAD ones. Applying data mining techniques to coronary artery disease data is a good method for investigating the existing relationships between variables.
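
The sensitivity/specificity/accuracy comparison the study relies on reduces to simple confusion-matrix arithmetic. The counts below are hypothetical, chosen only to illustrate the computation, and are not taken from the 4948-patient dataset.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from a binary confusion matrix."""
    sensitivity = tp / (tp + fn)            # CAD cases correctly detected
    specificity = tn / (tn + fp)            # non-CAD cases correctly ruled out
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for one classifier screening 1000 patients,
# 400 of whom actually have CAD (not the study's figures).
sens, spec, acc = diagnostic_metrics(tp=360, fp=60, tn=540, fn=40)
```

Reporting all three numbers matters clinically: a screening model can reach high accuracy while missing many true CAD cases if the classes are imbalanced, which sensitivity exposes.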

Keywords: classification, coronary artery disease, data mining, knowledge discovery, rule extraction

Procedia PDF Downloads 663
9331 Teaching University Students Lateral Reading to Detect Disinformation and Misinformation

Authors: Diane Prorak, Perri Moreno

Abstract:

University students may have been born in the digital age, but they need to be taught the critical thinking skills to detect misinformation and social media manipulation online. In recent years, librarians have been active in designing instructional methods to help students learn information evaluation skills. At the University of Idaho Library (USA), librarians have developed new teaching methods for these skills. Last academic year, when classes were taught via Zoom, librarians taught these skills in an online session of each first-year rhetoric and composition course. In the Zoom sessions, students were placed in breakout groups where they practiced an evaluation method known as lateral reading. Online collaborative software was used to give each group an evaluative task and break the task into steps. Groups reported back to the full class. Students learned to look at an information source, then search outside the source for information about the organization, publisher, or author before evaluating the source itself. Class-level pre- and post-test comparisons showed that students learned better evaluation techniques than they had known before instruction.

Keywords: critical thinking, information evaluation, information literacy instruction, lateral reading

Procedia PDF Downloads 188
9330 Influence of High-Resolution Satellites Attitude Parameters on Image Quality

Authors: Walid Wahballah, Taher Bazan, Fawzy Eltohamy

Abstract:

One of the important functions of the satellite attitude control system is to provide the pointing accuracy and attitude stability required for optical remote sensing satellites to achieve good image quality. Although they offer noise reduction and increased sensitivity, the time delay and integration (TDI) charge-coupled devices (CCDs) utilized in high-resolution satellites (HRS) are prone to introduce large amounts of pixel smear due to instability of the line of sight. During on-orbit imaging, as a result of the Earth's rotation and satellite platform instability, the moving direction of the TDI-CCD linear array and the imaging direction of the camera diverge. The speed of the image moving on the image plane (focal plane) is the image motion velocity, whereas the angle between the two directions is known as the drift angle (β). The drift angle arises from the rotation of the Earth around its axis during satellite imaging; it affects the geometric accuracy and, consequently, degrades image quality. Therefore, the image motion velocity vector and the drift angle are two important factors in assessing the image quality of TDI-CCD-based optical remote sensing satellites. A model for estimating the image motion velocity and the drift angle in HRS is derived. The six satellite attitude control parameters represented in the derived model are the roll angle φ, pitch angle θ, yaw angle ψ, roll angular velocity φ̇, pitch angular velocity θ̇, and yaw angular velocity ψ̇. The influence of these attitude parameters on image quality is analyzed by establishing a relationship between the image motion velocity vector, the drift angle, and the six satellite attitude parameters. The influence of the satellite attitude parameters on image quality is assessed by the presented model in terms of the modulation transfer function (MTF) in both the cross- and along-track directions.
Three different cases representing the effect of pointing accuracy (φ, θ, ψ) bias are considered using four different sets of typical pointing-accuracy values, while the satellite attitude stability parameters are ideal. In the same manner, the influence of satellite attitude stability (φ̇, θ̇, ψ̇) on image quality is analyzed for ideal pointing-accuracy parameters. The results reveal that cross-track image quality is seriously influenced by the yaw angle bias and the roll angular velocity bias, while along-track image quality is influenced only by the pitch angular velocity bias.
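
As a rough illustration of how an uncompensated drift angle translates into cross-track smear and MTF loss, the sketch below uses the standard sinc-shaped MTF of uniform linear image motion. The 96 TDI stages and the 0.1° drift angle are assumed example values, not parameters from the paper.

```python
import math

def smear_mtf(f_cyc_per_pixel, smear_pixels):
    """MTF of uniform linear image motion (smear) of extent d pixels:
    MTF(f) = |sin(pi * d * f) / (pi * d * f)|."""
    x = math.pi * smear_pixels * f_cyc_per_pixel
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

def cross_track_smear(tdi_stages, drift_angle_rad):
    """Cross-track smear (pixels) accumulated over the TDI integration when
    the image motion direction drifts by angle beta from the CCD columns."""
    return tdi_stages * math.tan(drift_angle_rad)

# Assumed example values: 96 TDI stages, 0.1 deg uncompensated drift angle.
d = cross_track_smear(tdi_stages=96, drift_angle_rad=math.radians(0.1))
mtf_at_nyquist = smear_mtf(0.5, d)   # Nyquist frequency = 0.5 cycles/pixel
```

The smear grows linearly with the number of TDI stages, which is why high-resolution TDI imagers are so sensitive to small attitude biases: doubling the stages doubles the smear for the same drift angle.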

Keywords: high-resolution satellites, pointing accuracy, attitude stability, TDI-CCD, smear, MTF

Procedia PDF Downloads 404
9329 Blood Glucose Level Measurement from Breath Analysis

Authors: Tayyab Hassan, Talha Rehman, Qasim Abdul Aziz, Ahmad Salman

Abstract:

Constant monitoring of the blood glucose level is necessary for maintaining the health of patients and for alerting medical specialists to take preemptive measures before the onset of any complication of diabetes. Current clinical monitoring of blood glucose relies on repeated invasive measurements, which are uncomfortable and may cause infections in diabetic patients. Several attempts have been made to develop non-invasive techniques for blood glucose measurement, but the existing methods are unreliable and less accurate, and other approaches claiming high accuracy have not been tested on extended datasets, so their results are not statistically significant. It is a well-known fact that the acetone concentration in breath has a direct relation with the blood glucose level. In this paper, we have developed a first-of-its-kind, reliable, high-accuracy breath analyzer for non-invasive blood glucose measurement. The acetone concentration in breath was measured using an MQ 138 sensor in samples collected from local hospitals in Pakistan, involving one hundred patients. The blood glucose levels of these patients were determined using the conventional invasive clinical method. We propose a linear regression model trained to map the breath acetone level to the measured blood glucose level, achieving high accuracy.
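
The acetone-to-glucose mapping can be sketched as an ordinary least-squares line. The calibration pairs below are invented for illustration and deliberately lie on an exact line (glucose = 50·acetone + 65); they are not measured values from the hundred-patient dataset.

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y ≈ a*x + b for 1-D data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical calibration pairs: breath acetone sensor reading vs.
# clinically measured blood glucose (mg/dL) -- illustrative values only.
acetone = [0.5, 0.8, 1.1, 1.5, 2.0]
glucose = [90.0, 105.0, 120.0, 140.0, 165.0]
a, b = fit_linear(acetone, glucose)
predicted = a * 1.3 + b        # glucose estimate for a new breath sample
```

In practice the fit would be trained on the invasively measured reference values and validated on held-out patients before any clinical use.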

Keywords: blood glucose level, breath acetone concentration, diabetes, linear regression

Procedia PDF Downloads 174
9328 Enhancing Higher Education Teaching and Learning Processes: Examining How Lecturer Evaluation Make a Difference

Authors: Daniel Asiamah Ameyaw

Abstract:

This research investigates how lecturer evaluation makes a difference in enhancing higher education teaching and learning processes. The research questions guiding this work are, first, “What are the perspectives on the difference made by evaluating academic teachers in order to enhance higher education teaching and learning processes?” and, second, “What are the implications of the findings for policy and practice?” Data for this research were collected mainly through interviews and partly through document review. Data analysis was conducted within the framework of grounded theory. The findings showed that at the individual lecturer level, lecturer evaluation provides continuous improvement of teaching strategies and serves as a source of data for research on teaching. At the individual student level, it enhances the student learning process, serves as a source of information for course selection by students, and makes students feel recognised in the educational process. At the institutional level, lecturer evaluation is useful in personnel and management decision-making; it assures stakeholders of quality teaching and learning by setting up standards for lecturers; and it enables institutions to identify skill requirements and needs as a basis for organising workshops. Lecturer evaluation is useful at the national level in terms of guaranteeing the competencies of graduates, who then provide the needed manpower of the nation. Besides, resource allocation to higher educational institutions is based largely on the quality of the programmes being run by the institution. The researcher concluded that the findings have implications for policy and practice; higher education managers are therefore expected to ensure that policy is implemented as planned by policy-makers so that the objectives can be successfully achieved.

Keywords: academic quality, higher education, lecturer evaluation, teaching and learning processes

Procedia PDF Downloads 147
9327 An MrPPG Method for Face Anti-Spoofing

Authors: Lan Zhang, Cailing Zhang

Abstract:

In recent years, many face anti-spoofing algorithms have achieved high detection accuracy on 2D presentation attacks or 3D mask attacks alone, but their detection performance drops greatly in multidimensional and cross-dataset tests. The rPPG (remote photoplethysmography) method used for face anti-spoofing exploits the vital information unique to real faces to distinguish genuine faces from presentation attacks, so it has strong stability compared with other methods, but its detection rate on 2D presentation attacks needs to be improved. Therefore, in this paper, we improve an rPPG method (MrPPG) for face anti-spoofing through color space fusion, using the correlation of pulse signals between real face regions and background regions, and introducing a recurrent neural network (LSTM) to improve accuracy on 2D presentation attacks. Meanwhile, MrPPG also has high accuracy and good stability in multidimensional and cross-dataset face anti-spoofing. The improved method was validated on the Replay-Attack, CASIA-FASD, SiW, and HKBU_MARs_V2 datasets; the experimental results show that the performance and stability of the proposed algorithm are superior to those of many advanced algorithms.
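
The face/background pulse-correlation cue can be illustrated with a plain Pearson correlation: for a genuine face, the pulse signal in the face region is absent from the background, while a replayed or printed face and its background share the same illumination signal. The synthetic 30 fps traces below are illustrative assumptions, not the MrPPG pipeline itself.

```python
import math

def pearson_corr(a, b):
    """Pearson correlation coefficient of two equal-length signal traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

t = [k / 30.0 for k in range(150)]                         # 5 s of 30 fps frames
face = [math.sin(2 * math.pi * 1.2 * s) for s in t]        # ~72 bpm pulse in face ROI
background = [math.sin(2 * math.pi * 0.4 * s) for s in t]  # slow illumination drift
spoof_face = [0.8 * b + 0.1 for b in background]           # replay mirrors background

live_corr = pearson_corr(face, background)        # near zero for a genuine face
spoof_corr = pearson_corr(spoof_face, background) # near one for a spoof
```

A detector can threshold this correlation (or feed it, with other features, to the LSTM) to separate live faces from replays.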

Keywords: face anti-spoofing, face presentation attack detection, remote photoplethysmography, MrPPG

Procedia PDF Downloads 185
9326 Morphology Operation and Discrete Wavelet Transform for Blood Vessels Segmentation in Retina Fundus

Authors: Rita Magdalena, N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Sofia Saidah, Bima Sakti

Abstract:

Segmentation of the blood vessels in retinal fundus images is important in biomedical science for diagnosing ailments related to the eye, as it simplifies the task of medical experts in assessing the state of the retinal fundus. Therefore, in this study, we designed software in MATLAB that segments the retinal blood vessels in fundus images. There are two main steps in the segmentation process. The first step is image preprocessing, which aims to improve the quality of the image so that it can be optimally segmented. The second step is image segmentation, which extracts the retinal blood vessels from the eye fundus image. The image segmentation methods analyzed in this study are the morphology operation, the discrete wavelet transform, and a combination of both. The dataset used in this project consists of 40 retinal images and 40 manually segmented reference images. After several testing scenarios, the average accuracy for the morphology operation method is 88.46%, while for the discrete wavelet transform it is 89.28%. By combining the two methods, the average accuracy increased to 89.53%. The result of this study is an image processing system that can segment the blood vessels in the retinal fundus with high accuracy and low computation time.
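
A one-dimensional sketch of the morphology operation: a bottom-hat filter (closing minus the signal) responds to narrow dark structures such as a vessel cross-section against a bright retinal background. The toy intensity profile and the threshold below are illustrative assumptions, not the study's MATLAB pipeline.

```python
def dilate(sig, k):
    """Grayscale dilation of a 1-D profile with a flat element of width 2k+1."""
    n = len(sig)
    return [max(sig[max(0, i - k):min(n, i + k + 1)]) for i in range(n)]

def erode(sig, k):
    """Grayscale erosion, the dual of dilation."""
    n = len(sig)
    return [min(sig[max(0, i - k):min(n, i + k + 1)]) for i in range(n)]

def bottom_hat(sig, k):
    """Closing (dilation then erosion) minus the signal: responds to dark
    structures narrower than the structuring element."""
    closed = erode(dilate(sig, k), k)
    return [c - s for c, s in zip(closed, sig)]

# Toy intensity profile across one fundus row: bright background (200) with
# a 3-pixel-wide dark vessel (120) at positions 5-7.
row = [200] * 5 + [120] * 3 + [200] * 5
response = bottom_hat(row, k=2)
vessel_mask = [r > 40 for r in response]   # assumed threshold
```

The same idea extends to 2-D with a disk-shaped structuring element; the element must be wider than the vessels to be detected, which is why the structuring-element size is a key tuning parameter.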

Keywords: discrete wavelet transform, fundus retina, morphology operation, segmentation, vessel

Procedia PDF Downloads 198
9325 TomoTherapy® System Repositioning Accuracy According to Treatment Localization

Authors: Veronica Sorgato, Jeremy Belhassen, Philippe Chartier, Roddy Sihanath, Nicolas Docquiere, Jean-Yves Giraud

Abstract:

We analyzed the image-guided radiotherapy method used by the TomoTherapy® System (Accuray Corp.) for patient repositioning in clinical routine. The TomoTherapy® System computes X, Y, Z, and roll displacements to match the reference CT, on which the dosimetry has been performed, with the pre-treatment MV CT. The accuracy of the repositioning method has been studied according to the treatment localization. For this, a database of 18774 treatment sessions performed during two consecutive years (2016–2017) was used. The database includes the X, Y, Z, and roll displacements proposed by the TomoTherapy® System as well as the manual correction of these proposals applied by the radiation therapist. This manual correction aims to further improve the repositioning based on the clinical situation and depends on the structures surrounding the target tumor tissue. The statistical analysis performed on the database aims to define repositioning limits to be used as a safety and guidance tool for the manual adjustment implemented by the radiation therapist. This tool will not only help flag potential repositioning errors but also further improve patient positioning for optimal treatment.
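
A minimal sketch of such repositioning limits, assuming a simple mean ± 2σ band per axis and localization; the correction values below are invented, not drawn from the 18774-session database.

```python
import math

def tolerance_limits(values, n_sigma=2.0):
    """Mean +/- n_sigma sample standard deviations: a simple alert band
    for manual corrections of the automatic registration result."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean - n_sigma * sd, mean + n_sigma * sd

# Invented manual couch corrections (mm) for one localization; in practice
# the band would be derived per localization from the session database.
corrections_mm = [0.0, 0.5, -0.5, 1.0, -1.0, 0.5, 0.0, -0.5, 0.0, 0.0]
low, high = tolerance_limits(corrections_mm)
# A new 3.5 mm manual shift falls outside the band and would raise an alert.
outliers = [c for c in corrections_mm + [3.5] if not (low <= c <= high)]
```

Deriving the band separately per treatment localization matters because head-and-neck and pelvic setups, for instance, show very different correction spreads.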

Keywords: accuracy, image-guided radiotherapy (IGRT), megavoltage computed tomography (MVCT), statistical analysis, tomotherapy, localization

Procedia PDF Downloads 231
9324 Aberrant Consumer Behavior in Seller’s and Consumer’s Eyes: Newly Developed Classification

Authors: Amal Abdelhadi

Abstract:

Consumer misbehavior evaluation can differ markedly based on a number of variables and from one environment to another. Using three aberrant consumer behavior (ACB) scenarios (shoplifting, stealing from hotel rooms, and software piracy), this study aimed to explore Libyan sellers' and consumers' evaluations of ACB. Data were collected using a multi-method approach (qualitative and quantitative) in two fieldwork phases. In the first phase, qualitative data were collected from 26 Libyan sellers through face-to-face interviews. In the second phase, a consumer survey was used to collect quantitative data from 679 Libyan consumers. This study found that consumers' and sellers' evaluations of ACB are not always consistent. Further, ACB evaluations differed based on the form of ACB. Furthermore, the study found that not all consumer behaviors considered bad in other countries receive the same evaluation in Libya; for example, software piracy. Therefore, this study suggests a newly developed classification of ACB based on marketers' and consumers' views. This classification provides nine ACB types within two dimensions (marketers' and consumers' views) and three degrees of behavior evaluation (good, acceptable, and misbehavior).

Keywords: aberrant consumer behavior, Libya, multi-method approach, planned behavior theory

Procedia PDF Downloads 580
9323 Radar-Based Classification of Pedestrian and Dog Using High-Resolution Raw Range-Doppler Signatures

Authors: C. Mayr, J. Periya, A. Kariminezhad

Abstract:

In this paper, we developed a learning framework for the classification of vulnerable road users (VRUs) by their range-Doppler signatures. The frequency-modulated continuous-wave (FMCW) radar raw data is first pre-processed to obtain robust object range-Doppler maps per coherent time interval. The complex-valued range-Doppler maps captured from our outdoor measurements are then fed into a convolutional neural network (CNN) to learn the classification. This CNN has gone through a hyperparameter optimization process for improved learning. By learning VRU range-Doppler signatures, the three classes 'pedestrian', 'dog', and 'noise' are classified with an average accuracy of almost 95%. Interestingly, this classification accuracy holds for combined longitudinal and lateral object trajectories.
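
The pre-processing step — a range DFT over fast time followed by a Doppler DFT over slow time — can be sketched with a naive DFT on a toy 8×8 frame containing a single synthetic target. The matrix sizes and bin placement below are illustrative, not the measurement parameters used in the paper.

```python
import cmath

def dft(x):
    """Naive 1-D DFT (adequate for the toy sizes used here)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def range_doppler_map(chirps):
    """Magnitude range-Doppler map from an M-chirp x N-sample beat matrix:
    a range DFT per chirp (fast time), then a Doppler DFT per range bin
    across chirps (slow time). Returns map[range_bin][doppler_bin]."""
    m = len(chirps)
    range_fft = [dft(chirp) for chirp in chirps]              # M x N
    n = len(range_fft[0])
    return [[abs(v) for v in dft([range_fft[i][r] for i in range(m)])]
            for r in range(n)]

# Toy beat matrix with one point target at range bin 3, Doppler bin 2.
M, N = 8, 8
chirps = [[cmath.exp(2j * cmath.pi * (3 * n / N + 2 * m / M)) for n in range(N)]
          for m in range(M)]
rd_map = range_doppler_map(chirps)
peak_val, peak_r, peak_d = max((rd_map[r][d], r, d)
                               for r in range(N) for d in range(M))
```

In a real pipeline the DFTs would be windowed FFTs and one such complex-valued map per coherent interval would be fed to the CNN.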

Keywords: machine learning, radar, signal processing, autonomous driving

Procedia PDF Downloads 250