Search results for: evaluation accuracy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9450

9000 Violence Detection and Tracking on Moving Surveillance Video Using Machine Learning Approach

Authors: Abe Degale D., Cheng Jian

Abstract:

Violent action recognition is crucial when building automated video surveillance systems. In recent years, hand-crafted feature detectors have been the primary method for violence detection, such as recognizing fighting activity, and researchers have also investigated learning-based representational models. These methods achieve good accuracy on benchmark datasets created specifically for detecting violent sequences in sports and movies. However, the camera motion in the surveillance videos of the Hockey dataset makes it difficult for such algorithms to learn discriminating features. Deep representation-based methods have shown success in image recognition and human activity detection. This research therefore proposes a deep representation-based model that uses transfer learning to detect violent scenes and identify aggressive human behaviours. The results show that the proposed approach learns highly discriminating features and outperforms state-of-the-art accuracy, attaining 99.34% and 99.98% on the Hockey and Movies datasets, respectively.
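
As a rough illustration of the transfer-learning idea (the paper uses Faster R-CNN; the ResNet-18 backbone, binary head, and dummy batch below are assumptions of this sketch, not the authors' code), a pretrained network is frozen and only a new classification head is trained:

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone with frozen weights; only the new head is trained.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # violent / non-violent

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

frames = torch.randn(8, 3, 224, 224)   # dummy batch of video frames
labels = torch.randint(0, 2, (8,))
loss = criterion(model(frames), labels)
loss.backward()
optimizer.step()
```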

Keywords: violence detection, faster RCNN, transfer learning, surveillance video

Procedia PDF Downloads 76
8999 Some Accuracy Related Aspects in Two-Fluid Hydrodynamic Sub-Grid Modeling of Gas-Solid Riser Flows

Authors: Joseph Mouallem, Seyed Reza Amini Niaki, Norman Chavez-Cussy, Christian Costa Milioli, Fernando Eduardo Milioli

Abstract:

Sub-grid closures for filtered two-fluid models (fTFM), useful in large-scale simulations (LSS) of riser flows, can be derived from highly resolved simulations (HRS) with microscopic two-fluid modeling (mTFM). Accurate sub-grid closures require accurate mTFM formulations as well as accurate correlation of the relevant filtered parameters to suitable independent variables. This article deals with both of these issues. The accuracy of mTFM is addressed by assessing the impact of gas sub-grid turbulence on HRS filtered predictions. A gas-turbulence-like effect is artificially introduced by means of a stochastic forcing procedure implemented in physical space over the momentum conservation equation of the gas phase. The correlation issue is addressed by introducing a three-filtered-variable correlation analysis (three-marker analysis) performed under a variety of macro-scale conditions typical of risers. While the more elaborate correlation procedure clearly improved accuracy, accounting for gas sub-grid turbulence had no significant impact on the predictions.

Keywords: fluidization, gas-particle flow, two-fluid model, sub-grid models, filtered closures

Procedia PDF Downloads 101
8998 Predictors of Non-Alcoholic Fatty Liver Disease in Egyptian Obese Adolescents

Authors: Moushira Zaki, Wafaa Ezzat, Yasser Elhosary, Omnia Saleh

Abstract:

Nonalcoholic fatty liver disease (NAFLD) has increased in conjunction with obesity. The accuracy of risk factors for detecting NAFLD in obese adolescents has not undergone a formal evaluation. The aim of this study was to evaluate predictors of NAFLD among Egyptian female obese adolescents. The study included 162 obese female adolescents. All were subjected to anthropometry, biochemical analysis, and abdominal ultrasonographic assessment. Metabolic syndrome (MS) was diagnosed according to the IDF criteria. A significant association between the presence of MS and NAFLD was observed. Obese adolescents with NAFLD had significantly higher levels of ALT, triglycerides, fasting glucose, insulin, blood pressure, and HOMA-IR, and lower HDL-C levels, compared with obese cases without NAFLD. Receiver operating characteristic (ROC) curve analysis showed that ALT is a sensitive predictor for NAFLD, confirming that ALT can be used as a marker of NAFLD.
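
A hedged sketch of the ROC analysis described above, using made-up ALT values in place of the study data; the cut-off is chosen with Youden's J statistic:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
# Hypothetical ALT levels (U/L), higher on average in the NAFLD group.
alt = np.r_[rng.normal(55, 15, 80), rng.normal(30, 10, 82)]
y = np.r_[np.ones(80), np.zeros(82)]

fpr, tpr, thresholds = roc_curve(y, alt)
print("AUC:", auc(fpr, tpr))
best = np.argmax(tpr - fpr)             # Youden's J picks the cut-off
print("cut-off:", thresholds[best], "sens:", tpr[best], "spec:", 1 - fpr[best])
```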

Keywords: obesity, NAFLD, predictors, adolescents, Egyptians, risk factors, prevalence

Procedia PDF Downloads 361
8997 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator

Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira

Abstract:

True Boiling Point (TBP) distillation is one of the most common experimental techniques for the determination of petroleum properties. This curve provides information about the performance of petroleum in terms of its cuts, but the experiment takes several days to perform. Faster techniques determine the properties with software that calculates the distillation curve when only limited information about the crude oil is known. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (software vs. ASTM) and 0.2%-5.1% (software vs. Spaltrohr).
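
The extrapolation step can be illustrated outside HYSYS with a monotone interpolant through hypothetical TBP points (the volumes, temperatures, and choice of PCHIP below are assumptions of this sketch):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

vol_pct = np.array([5, 10, 20, 30, 40, 50, 60, 70])          # % distilled
temp_K = np.array([310, 340, 380, 415, 450, 485, 515, 545])  # measured TBP, K

# Monotone interpolant through the eight points, extrapolated past the data;
# comparing the extended values with ASTM/Spaltrohr data gives the % errors.
tbp = PchipInterpolator(vol_pct, temp_K, extrapolate=True)
for v in (80, 90, 95):
    print(v, "% ->", round(float(tbp(v)), 1), "K")
```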

Keywords: distillation curve, petroleum distillation, simulation, true boiling point curve

Procedia PDF Downloads 419
8996 The Synergistic Effects of Blockchain and AI on Enhancing Data Integrity and Decision-Making Accuracy in Smart Contracts

Authors: Sayor Ajfar Aaron, Sajjat Hossain Abir, Ashif Newaz, Mushfiqur Rahman

Abstract:

Investigating the convergence of blockchain technology and artificial intelligence, this paper examines their synergistic effects on data integrity and decision-making within smart contracts. By implementing AI-driven analytics on blockchain-based platforms, the research identifies improvements in automated contract enforcement and decision accuracy. The paper presents a framework that leverages AI to enhance transparency and trust while blockchain ensures immutable record-keeping, culminating in significantly optimized operational efficiencies in various industries.

Keywords: artificial intelligence, blockchain, data integrity, smart contracts

Procedia PDF Downloads 19
8995 Sea-Land Segmentation Method Based on the Transformer with Enhanced Edge Supervision

Authors: Lianzhong Zhang, Chao Huang

Abstract:

Sea-land segmentation is a basic step in many tasks, such as sea surface monitoring and ship detection. Existing sea-land segmentation algorithms suffer from poor segmentation accuracy, and their parameter tuning is cumbersome and often fails to meet practical needs. Moreover, current approaches rely on traditional deep learning models based on Convolutional Neural Networks (CNNs). The transformer architecture has achieved great success in the field of natural images, but its application to radar images has received little study. This paper therefore proposes a sea-land segmentation method based on the transformer architecture with strengthened edge supervision. It uses a self-attention mechanism with a gating strategy to better learn relative position bias, and introduces an additional edge supervision branch; in the decoder stage, the feature information of the two branches interacts, thereby improving the edge precision of the sea-land segmentation. Experimental results on the Gaofen-3 satellite image dataset show that the proposed method effectively improves the accuracy of sea-land segmentation, especially along sea-land edges. The mean IoU (Intersection over Union), edge precision, overall precision, and F1 score reach 96.36%, 84.54%, 99.74%, and 98.05%, respectively, surpassing mainstream segmentation models and demonstrating high practical value.
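
The reported metrics can be reproduced for any predicted mask; a minimal sketch, assuming binary masks with 1 = land and 0 = sea (edge precision would be computed the same way on extracted edge bands):

```python
import numpy as np

def iou(pred, gt, cls):
    p, g = pred == cls, gt == cls
    return (p & g).sum() / (p | g).sum()

def metrics(pred, gt):
    miou = np.mean([iou(pred, gt, c) for c in (0, 1)])
    tp = ((pred == 1) & (gt == 1)).sum()
    precision = tp / (pred == 1).sum()
    recall = tp / (gt == 1).sum()
    f1 = 2 * precision * recall / (precision + recall)
    return miou, (pred == gt).mean(), precision, f1   # mIoU, OA, P, F1

pred = np.random.randint(0, 2, (64, 64))  # stand-in for the model output
gt = np.random.randint(0, 2, (64, 64))    # stand-in for the ground truth
print(metrics(pred, gt))
```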

Keywords: SAR, sea-land segmentation, deep learning, transformer

Procedia PDF Downloads 144
8994 Development of Scale in Evaluation of Effectiveness of Motivation of Divine Leadership

Authors: Parviz Abadi

Abstract:

Leadership is a key driver of organizational achievement. The research presented herein provides tools for assessing Divine Leadership, which are imperative for quantitative evaluations of this leadership, whose effectiveness has never been examined. Various tests can be applied to this leadership, such as evaluating it against follower motivation or against the impact it has on organizational success. A common means of evaluating a phenomenon is to conduct a quantitative study of hypotheses related to the subject. The dimensions of this leadership are Humility, Integrity, Empowerment, Altruism, and Vision. However, these elements of the leadership construct are latent and cannot easily be assessed, so it is necessary to develop tangible items that relate to the construct. The study presented herein was conducted to develop tangible scales that could be applied in a quantitative study to assess this leadership. The study led to a detailed questionnaire, consisting of 40 questions, that can be presented to survey participants.

Keywords: leadership, management, scale development, organizations

Procedia PDF Downloads 34
8993 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games entertain their players while also introducing stress to the human brain. The brain-computer interface (BCI) plays an important role in recognizing this mental stress, offering various neuroimaging approaches for analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals induced by mental stress. Two healthy volunteers played a game in which the aim was to find hidden words in a grid, with levels chosen randomly. The EEG signals recorded during gameplay were used to investigate the impact of stress as the level changed from easy to medium to hard. A total of 16 EEG features were analyzed in this experiment, including power band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, achieving an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game of a similar nature was played by the volunteers, and a suitable regression model was designed for prediction, with the feature sets of the first and second games used for testing and training, respectively; an accuracy of 73% was found.
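
A minimal sketch of the classification stage, assuming 16 precomputed band-power and statistical features per epoch (the synthetic data below stand in for the recorded EEG):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
y = rng.integers(0, 3, 120)                        # easy / medium / hard
X = rng.normal(size=(120, 16)) + 0.8 * y[:, None]  # 16 features per epoch

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```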

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 166
8992 Efficient Schemes of Classifiers for Remote Sensing Satellite Imageries of Land Use Pattern Classifications

Authors: S. S. Patil, Sachidanand Kini

Abstract:

Classification of land use patterns is challenging due to the complexity and variability of remote sensing imagery. This research mines significant spatially variable factors, such as land cover and land use, from satellite images of remote arid areas in Karnataka State, India. Diverse unsupervised and supervised classification techniques, consisting of maximum likelihood, Mahalanobis distance, and minimum distance, are applied to the raw satellite images of Bellary District, Karnataka State, India. The accuracy of the results is evaluated by visual comparison with standard ground-truth maps. The maximum likelihood technique gave the finest results, while both the minimum distance and Mahalanobis distance methods overvalued agricultural land areas. Despite the loss of a few features due to the low resolution of the satellite images, high-quality agreement was found between the parameters extracted automatically from the developed maps and field observations.
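
A compact sketch of the three supervised decision rules compared above, applied to synthetic per-pixel feature vectors (the class names, dimensions, and data are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
train = {c: rng.normal(loc=mu, scale=1.0, size=(50, 4))
         for c, mu in {"agriculture": 2.0, "barren": -1.0}.items()}
means = {c: X.mean(0) for c, X in train.items()}
covs = {c: np.cov(X.T) for c, X in train.items()}

def minimum_distance(x):                  # Euclidean distance to class mean
    return min(means, key=lambda c: np.linalg.norm(x - means[c]))

def mahalanobis(x):                       # covariance-weighted distance
    return min(means, key=lambda c: (x - means[c])
               @ np.linalg.inv(covs[c]) @ (x - means[c]))

def maximum_likelihood(x):                # Gaussian ML adds the log-det term
    return min(means, key=lambda c: (x - means[c])
               @ np.linalg.inv(covs[c]) @ (x - means[c])
               + np.log(np.linalg.det(covs[c])))

pixel = rng.normal(2.0, 1.0, 4)
print(minimum_distance(pixel), mahalanobis(pixel), maximum_likelihood(pixel))
```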

Keywords: Mahalanobis distance, minimum distance, supervised, unsupervised, user's classification accuracy, producer's classification accuracy, maximum likelihood, kappa coefficient

Procedia PDF Downloads 160
8991 Continuous Improvement of Teaching Quality through Course Evaluation by the Students

Authors: Valerie Follonier, Henrike Hamelmann, Jean-Michel Jullien

Abstract:

The Distance Learning University in Switzerland (UniDistance) offers bachelor and master courses as well as further education programs. The professors and their assistants work at traditional Swiss universities and give their courses at UniDistance following a blended learning and flipped classroom approach. A standardized course evaluation by the students has been established as a component of a quality improvement process. The students' feedback enables the stakeholders to identify areas of improvement, initiate professional development for the teaching teams, and thus continuously raise the quality of instruction. This paper describes the evaluation process, the tools involved, and how this approach, involving all stakeholders, helps form a culture of quality in teaching. It also presents the first evaluation results following the new process. Two software tools have been developed to support all stakeholders in the semi-annual formative evaluation. The first tool allows the survey to be created and assigned to the relevant courses and students. The second presents the results of the evaluation to the stakeholders, providing specific features for the teaching teams, the dean, the directorate, and EDUDL+ (Educational development unit distance learning). The survey items were selected in accordance with the e-learning strategy of the institution and are formulated to support the professional development of the teaching teams. By reviewing the results, the teaching teams become aware of the students' opinions and are asked to write feedback for the attention of their dean. The dean reviews the results of the faculty and writes a general report about the situation of the faculty and the improvements intended. Finally, EDUDL+ writes a final report summarising the evaluation results. A mechanism of adjustable warnings generates quality indicators for each module, which are summarised for each faculty and globally for the whole institution in order to increase the vigilance of those responsible. The quality process involves changing the indicators regularly to focus on different areas each semester, facilitating the professional development of the teaching teams and progressively raising the overall teaching quality of the institution.

Keywords: continuous improvement process, course evaluation, distance learning, software tools, teaching quality

Procedia PDF Downloads 241
8990 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, i.e., the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is used to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, which can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the onset of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
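
The two statistical steps can be sketched with hypothetical failure times: the Laplace trend test indicates whether the process is improving or deteriorating, and a Weibull fit yields the shape parameter beta that separates infant mortality, random failure, and wear-out:

```python
import numpy as np
from scipy.stats import weibull_min

failures = np.array([30., 120., 260., 420., 600., 810., 1050., 1330.])  # days
T = 1500.0                                # observation window

# Laplace trend test: U < 0 suggests reliability growth (infant mortality
# fading out), U > 0 suggests deterioration (wear-out setting in).
n = len(failures)
U = (failures.mean() - T / 2) / (T / np.sqrt(12 * n))
print("Laplace U:", U)

# Weibull fit with location fixed at zero: shape beta < 1 indicates an
# infant stage, beta ~ 1 a constant failure rate, beta > 1 wear-out.
beta, _, eta = weibull_min.fit(failures, floc=0)
print("shape beta:", beta, "scale eta:", eta)
```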

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 55
8989 Estimation of Train Operation Using an Exponential Smoothing Method

Authors: Taiyo Matsumura, Kuninori Takahashi, Takashi Ono

Abstract:

The purpose of this research is to improve the convenience of waiting for trains at level crossings and stations, and to prevent accidents resulting from forcible entry into level crossings, by providing level crossing users and passengers with information about when the next train will pass through or arrive. We propose methods for estimating train operation by means of an average value method, a variable response smoothing method, and an exponential smoothing method, on the basis of open data, which has low accuracy but for which schedules are distributed in real time. We then examined the accuracy of the estimations. The results showed that the application of an exponential smoothing method is valid.
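
Single exponential smoothing itself is a one-line recurrence, s_t = a*x_t + (1 - a)*s_{t-1}; a minimal sketch with invented delay observations:

```python
def exp_smooth(xs, alpha=0.3):
    s = xs[0]
    out = [s]
    for x in xs[1:]:
        s = alpha * x + (1 - alpha) * s   # s_t = a*x_t + (1 - a)*s_{t-1}
        out.append(s)
    return out

delays = [120, 95, 140, 110, 180, 160, 150]   # seconds behind schedule
print(exp_smooth(delays))
```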

Keywords: exponential smoothing method, open data, operation estimation, train schedule

Procedia PDF Downloads 366
8988 User Satisfaction Survey Based Facility Performance Evaluation

Authors: Gopikrishnan Seshadhri, V. M. Topkar

Abstract:

Facility management post-occupation is a facet that has gained tremendous ground in recent times. While the efficiency of expenditure and the utilization of all types of resources are monitored during the construction phase to ensure timely completion with minimum cost and acceptable quality, value for money emerges only when the facility performs satisfactorily post-occupation, meeting the aspirations and expectations of its users; this is even more so for public facilities. Due to the paradigm shift toward outcome-based performance evaluation, user satisfaction, obtained mainly through questionnaires, has become the single most important criterion in performance evaluation. Since the questionnaires presently used to gauge user satisfaction are subjective, the feedback obtained does not necessarily reflect actual performance. Hence, there is a need to develop a survey instrument that gauges user satisfaction as objectively as possible and truly reflects the ground reality. A near-correct picture of the actual performance of the built facility from the user's point of view will enable facility managers to address pertinent issues. This paper brings out the need for an effective survey instrument that will elicit more objective user responses, and lists the steps involved in formulating such an instrument.

Keywords: facility performance evaluation, attributes, attribute descriptors, user satisfaction surveys, statistical methods, performance indicators

Procedia PDF Downloads 264
8987 Fast and Accurate Finite-Difference Method Solving Multicomponent Smoluchowski Coagulation Equation

Authors: Alexander P. Smirnov, Sergey A. Matveev, Dmitry A. Zheltkov, Eugene E. Tyrtyshnikov

Abstract:

We propose a new computational technique for the multidimensional (multicomponent) Smoluchowski coagulation equation. Using low-rank approximations in Tensor Train format of both the solution and the coagulation kernel, we accelerate the classical finite-difference Runge-Kutta scheme while keeping its level of accuracy. The complexity of the finite-difference scheme is reduced from O(N^{2d}) to O(d^2 N log N), where N is the number of grid nodes and d is the dimensionality of the problem. The efficiency and accuracy of the new method are demonstrated on a concrete problem with a known analytical solution.
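
The quadratic cost of the coagulation operator, and the FFT route to O(N log N), can be seen in the one-component constant-kernel special case below (the paper's tensor-train machinery is beyond a snippet; the grid size, time step, and initial condition are assumptions of this sketch):

```python
import numpy as np
from scipy.signal import fftconvolve

N, dt = 256, 1e-3
n = np.zeros(N)
n[0] = 1.0                                 # monodisperse initial condition
for _ in range(1000):                      # explicit Euler time stepping
    conv = fftconvolve(n, n)               # O(N log N) instead of O(N^2)
    gain = np.zeros(N)
    gain[1:] = 0.5 * conv[:N - 1]          # 0.5 * sum_{i+j=k} n_i n_j
    loss = n * n.sum()                     # n_k * sum_j n_j
    n = n + dt * (gain - loss)
mass = ((np.arange(N) + 1) * n).sum()
print("total mass:", mass)                 # ~1, up to truncation error
```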

Keywords: tensor train decomposition, multicomponent Smoluchowski equation, Runge-Kutta scheme, convolution

Procedia PDF Downloads 411
8986 A Model Based Metaheuristic for Hybrid Hierarchical Community Structure in Social Networks

Authors: Radhia Toujani, Jalel Akaichi

Abstract:

In recent years, the study of community detection in social networks has received great attention. The hierarchical structure of the network can lead to convergence to a locally optimal community structure. In this paper, we aim to avoid this local optimum in the introduced hybrid hierarchical method. To achieve this purpose, we present an objective function that incorporates a modularity value based on structural and semantic similarity, and a metaheuristic, namely the bee colony algorithm, to optimize our objective function at both the divisive and agglomerative hierarchical levels. In order to assess the efficiency and accuracy of the introduced hybrid bee colony model, we perform an extensive experimental evaluation on both synthetic and real networks.
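
The modularity objective that the bee colony metaheuristic optimizes can be evaluated with standard tooling; a sketch scoring a greedy baseline partition on a toy graph (the semantic-similarity term of the paper is omitted):

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

G = nx.karate_club_graph()
communities = greedy_modularity_communities(G)   # baseline partition
print("modularity Q:", modularity(G, communities))
```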

Keywords: social network, community detection, agglomerative hierarchical clustering, divisive hierarchical clustering, similarity, modularity, metaheuristic, bee colony

Procedia PDF Downloads 360
8985 The System for Root Canal Length Measurement Based on Multifrequency Impedance Method

Authors: Zheng Zhang, Xin Chen, Guoqing Ding

Abstract:

Electronic apex locators (EALs) have been widely used clinically for measuring root canal working length with high accuracy, which is crucial for successful endodontic treatment. In order to maintain high accuracy in different measurement environments, this study presents a system for root canal length measurement based on a multifrequency impedance method. The measuring system generates a sweep current with frequencies from 100 Hz to 1 MHz through a direct digital synthesizer. Multiple impedance ratios with different combinations of frequencies are obtained and transmitted by an analog-to-digital converter, and representative ratios are selected after data processing. By measuring a large number of teeth, the system statistically analyzes the functional relationship between these impedance ratios and the distance between the file and the apex; the position of the apical foramen can then be determined by the statistical model using these impedance ratios. The experimental results revealed that the accuracy of the system based on the multifrequency impedance ratio method in determining the position of the apical foramen was higher than that of the dual-frequency impedance ratio method. Moreover, in more complex measurement environments, the performance of the system was more stable.

Keywords: root canal length, apex locator, multifrequency impedance, sweep frequency

Procedia PDF Downloads 138
8984 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and can lead to system compromise, data leakage, or denial of service. Large bodies of open-source C and C++ code are now available, making it possible to create a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove pointless components and shorten dependencies. Moreover, we keep the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning networks such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we propose a neural network model that can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure performance. We made a comparative analysis between results derived from features containing a minimal text representation and features containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as the features, but require higher execution time, as the word embedding algorithm adds complexity to the overall system.
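
A hedged sketch of the embedding-plus-recurrent-network stage, assuming token-id sequences for C functions are already prepared (tokenization, the GloVe/fastText embeddings, and the dataset are elided; shapes and data below are invented):

```python
import numpy as np
from tensorflow.keras import layers, models

vocab_size, maxlen = 5000, 200
X = np.random.randint(1, vocab_size, size=(256, maxlen))  # token ids
y = np.random.randint(0, 2, size=(256,))                  # vulnerable / safe

model = models.Sequential([
    layers.Input(shape=(maxlen,)),
    layers.Embedding(vocab_size, 64),          # learned token embeddings
    layers.Bidirectional(layers.LSTM(32)),     # BiLSTM sequence encoder
    layers.Dense(1, activation="sigmoid"),     # binary vulnerability score
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
```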

Keywords: cyber security, vulnerability detection, neural networks, feature extraction

Procedia PDF Downloads 64
8983 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluating goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, the data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction penalising model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive CV variant, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to exact LOO-CV; they reuse the existing MCMC results and avoid the expensive computation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights, which are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly; in contrast, the larger weights are replaced by their modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV cannot reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest. However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for the models conditional on equal posterior variances in lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, together with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
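
The weight constructions discussed above reduce to a few array operations once pointwise predictive densities over posterior draws are available; a numerical sketch with stand-in densities (the truncation rule shown for TIS is the usual sqrt(S) times the mean weight):

```python
import numpy as np

rng = np.random.default_rng(3)
S, n = 4000, 25
p = np.exp(rng.normal(-1.0, 0.3, size=(S, n)))   # stand-ins for p(y_i | theta_s)

lppd = np.log(p.mean(axis=0)).sum()              # overall lppd (WAIC side)
# IS-LOO: raw weights 1/p make the weighted average collapse to a harmonic mean.
elpd_is = np.log(S / (1.0 / p).sum(axis=0)).sum()
# TIS-LOO: truncate the largest raw weights at sqrt(S) times the mean weight.
w = 1.0 / p
w = np.minimum(w, np.sqrt(S) * w.mean(axis=0))
elpd_tis = np.log((w * p).sum(axis=0) / w.sum(axis=0)).sum()
print("lppd:", lppd, "IS-LOO:", elpd_is, "TIS-LOO:", elpd_tis)
```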

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 371
8982 Learning at Workplace: Competences and Contexts in Sensory Evaluation

Authors: Ulriikka Savela-Huovinen, Hanni Muukkonen, Auli Toom

Abstract:

The development of the workplace as a learning environment has been emphasized in research on workplace learning. The prior literature on sensory performance has emphasized the individual's competences as an assessor, while competences in collaborative interactional and knowledge creation practices, as workplace learning methods, are seldom mentioned. The present study aims to find out which competences and contexts are central when an assessor conducts food sensory evaluation in an authentic professional context. The aim was to answer the following questions: first, what kinds of competences does sensory evaluation require according to assessors? And second, what kinds of contexts for sensory evaluation do assessors report? Altogether, thirteen assessors from three Finnish food companies were interviewed using semi-structured thematic interviews to map practices and development intentions as well as to explicate already established practices. The qualitative data were analyzed following the principles of abductive and inductive content analysis; the analysis phases were combined and their results considered together as a cross-analysis. For independent evaluation, the required competences were perception, knowledge of specific domains and methods, and cognitive skills, e.g., memory. Altogether, 42% of the analysis units described individual evaluation contexts, 53% described collaborative interactional contexts, and 5% described collaborative knowledge creation contexts. Regarding collaboration, the analysis revealed learning, sharing and reviewing both external and in-house consumer feedback, developing methods to moderate small-panel evaluation, and developing product vocabulary collectively among the assessors. Knowledge creation contexts arose from daily practices, especially in cases where product defects were sought and discussed. The study findings support the explanation that sensory assessors learn extensively from one another in collaborative interactional and knowledge creation contexts. Assessors' learning and their ability to work collaboratively in interactional and knowledge creation contexts need to be ensured as their expertise develops.

Keywords: assessor, collaboration, competences, contexts, learning and practices, sensory evaluation

Procedia PDF Downloads 219
8981 Floor Response Spectra of RC Frames: Influence of the Infills on the Seismic Demand on Non-Structural Components

Authors: Gianni Blasi, Daniele Perrone, Maria Antonietta Aiello

Abstract:

The seismic vulnerability of non-structural components is nowadays recognized to be a key issue in performance-based earthquake engineering. Recent loss estimation studies, as well as the damage observed during past earthquakes, have shown that non-structural damage represents the highest share of economic loss in a building and can in many cases be crucial from a life-safety perspective during the post-earthquake emergency. The procedures developed to evaluate the seismic demand on non-structural components have been constantly improved, and recent studies have demonstrated that the existing formulations provided by the main Standards generally ignore features which have a significant influence on the definition of the seismic accelerations/displacements acting on non-structural components. Since the influence of the infills on the dynamic behaviour of RC structures has already been demonstrated by many authors, it is worth noting that the evaluation of the seismic demand on non-structural components should consider the presence of the infills as well as their mechanical properties. This study focuses on the evaluation of time-history floor accelerations in RC buildings, a useful means of performing seismic vulnerability analyses of non-structural components through the well-known cascade method. Dynamic analyses are performed on an 8-storey RC frame, taking into account the presence of the infills; the influence of the elastic modulus of the panel on the results is investigated, as well as the presence of openings. The floor accelerations obtained from the analyses are used to evaluate the floor response spectra, in order to define the demand on non-structural components depending on the properties of the infills. Finally, the results are compared with the formulations provided by the main International Standards, in order to assess their accuracy and define the improvements required according to the results of the present research work.
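
The cascade step from a floor acceleration history to a floor response spectrum can be sketched by driving a family of damped SDOF oscillators and recording each one's peak absolute acceleration (the input record, damping ratio, and period grid below are placeholders):

```python
import numpy as np
from scipy.signal import lsim

dt = 0.01
t = np.arange(0, 20, dt)
floor_acc = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.1 * t)  # placeholder record

def sdof_peak(acc, t, T, zeta=0.05):
    wn = 2 * np.pi / T
    A = [[0, 1], [-wn**2, -2 * zeta * wn]]   # state: [rel. disp., rel. vel.]
    B = [[0], [-1]]                          # forcing: -1 * base acceleration
    C = [[-wn**2, -2 * zeta * wn]]           # output: absolute acceleration
    _, y, _ = lsim((A, B, C, [[0]]), acc, t)
    return np.abs(y).max()

periods = np.linspace(0.05, 2.0, 40)
spectrum = [sdof_peak(floor_acc, t, T) for T in periods]
print(max(spectrum))                          # spectral peak over the periods
```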

Keywords: floor spectra, infilled RC frames, non-structural components, seismic demand

Procedia PDF Downloads 310
8980 Quality of Life Measurements: Evaluation of Intervention Program of Persons with Addiction

Authors: Julie Wittmannová, Petr Šeda

Abstract:

Quality of life measurements (QLF) help to evaluate intervention programs in different groups of persons with special needs. Our presentation deals with the QLF of persons with addiction in relation to physical activity (PA), type of addiction, age, gender, and other variables. The aim of the presentation is to summarize the basic findings and offer thoughts on the questions that arose. Methods: SQUALA (Subjective Quality of Life Analysis); SEIQoL (Schedule for the Evaluation of Individual Quality of Life); a questionnaire of our own construction. The results are evaluated by the Mann-Whitney U test and the Kruskal-Wallis ANOVA test (p ≤ 0.05). The sample comprised 64 participants, clients of an aftercare center, aged 18 plus. Findings: The application of the SQUALA and SEIQoL methods in the chosen population seems appropriate; the information obtained regarding QLF corresponds to the intervention program topics, and the need for an active lifestyle and health-related topics in persons with addiction is visible. Conclusions or Implications: The subjective evaluation of quality of life of aftercare clients is an important part of the evaluation process, especially when used to evaluate satisfaction with the offered services and programs. The SQUALA and SEIQoL techniques gave us the desired outcomes.
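
The two nonparametric tests named above are available in scipy; a sketch with invented scale scores for illustrative groups:

```python
from scipy.stats import mannwhitneyu, kruskal

active = [72, 80, 65, 90, 77, 84, 69]        # invented SQUALA-style scores
inactive = [60, 55, 71, 58, 62, 66, 49]
u, p = mannwhitneyu(active, inactive, alternative="two-sided")
print("Mann-Whitney U:", u, "p:", p)          # significant if p <= 0.05

# Kruskal-Wallis generalizes to 3+ groups (e.g., by type of addiction).
h, p = kruskal([70, 75, 68], [61, 66, 59], [80, 82, 78])
print("Kruskal-Wallis H:", h, "p:", p)
```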

Keywords: adapted physical activity, addiction, quality of life, physical activity, aftercare

Procedia PDF Downloads 306
8979 MB-Slam: A Slam Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To use SLAM effectively for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM to BIM can provide essential insights for construction managers to identify construction deficiencies in real-time and ultimately reduce rework. It can also boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with the BIM in real-time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only automates the registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real-time. This framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM calculates a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real-time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in the images. This process generates the associated BIM views from the keyframes' views. The calculated poses are later improved by a real-time gradient-descent-based iteration method. Two case studies were presented to validate MB-SLAM. The validation demonstrated promising results: SLAM was accurately registered to BIM, and the SLAM's localization accuracy improved significantly. Moreover, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework, for both research and commercial usage, that aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of SLAM by providing a rough 3D model of the environment. MB-SLAM thus advances SLAM toward practical use.
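
One geometric ingredient named above, estimating a vanishing point from a family of straight edge segments, has a compact linear-algebra form: each segment gives a homogeneous line l = p1 x p2, and the vanishing point is the smallest right singular vector of the stacked line matrix (the segments below are hypothetical):

```python
import numpy as np

segments = [((100, 400), (300, 310)),   # hypothetical straight-edge segments
            ((120, 500), (340, 390)),   # assumed to share one 3D direction
            ((90, 450), (310, 350))]

L = []
for (x1, y1), (x2, y2) in segments:
    line = np.cross([x1, y1, 1.0], [x2, y2, 1.0])  # homogeneous line p1 x p2
    L.append(line / np.linalg.norm(line))
_, _, Vt = np.linalg.svd(np.asarray(L))
v = Vt[-1]                                # minimizes |L v| in least squares
print("vanishing point:", v[:2] / v[2])   # back to pixel coordinates
```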

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 197
8978 Questionnaire for the Evaluation of Entrepreneurship Project Psychopedagogical Practices: Construction Proceedings and Validation

Authors: Cristina Costa-Lobo, Sandra Fernandes, Miguel Magalhães, José Dinis-Carvalho, Alfredo Regueiro, Ana Carvalho

Abstract:

This paper reports the findings of the construction and validation of a questionnaire administered in a Portuguese higher education context with undergraduate students. The Questionnaire for the Evaluation of Entrepreneurship Project Psychopedagogical Practices consists of six scales: Critical Appraisal of the Project, Developed Learning and Skills, Teamwork, Teacher and Tutor Roles, Evaluation of Student Performance, and Project Effectiveness as a Teaching-Learning Methodology. The construction procedure is analyzed, and the validity and internal consistency analyses are described. Findings indicate good indicators of validity, good reliability, and an interpretable factorial structure.
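
The internal-consistency side of such a validation is commonly summarized by Cronbach's alpha; a minimal sketch on a hypothetical item-response matrix (an illustration, not the authors' analysis):

```python
import numpy as np

def cronbach_alpha(items):                # rows = respondents, cols = items
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(4)
base = rng.integers(1, 6, size=(50, 1))   # shared trait on a 1-5 Likert scale
responses = np.clip(base + rng.integers(-1, 2, size=(50, 6)), 1, 5)
print("alpha:", cronbach_alpha(responses))
```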

Keywords: entrepreneurship project, higher education, psychopedagogical practices, teacher and tutor roles

Procedia PDF Downloads 360
8977 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept of a closed-loop design model, developed by integrating forward design and reverse design, is presented. Based on this concept, a closed-loop design model for sustainable manufacturing is developed that integrates the evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process. In the design stage of a product, for a given product requirement and objective, there can be different ways to design the detailed components and specifications; therefore, different design cases can achieve the same requirement and objective, and the design evaluation stage must analyze and evaluate these different cases. The purpose of this research is to develop a model for evaluating the design cases through the integrated evaluation of the forward design, reverse design, and green manufacturing models. A fuzzy analytic network process model is presented for the integrated evaluation of the criteria in the three models. The comparison matrices for evaluating the criteria in the three groups are established, and the total relational values among the three groups represent the total relational effects. In application, a supermatrix can be created, and the total relational values can be used to evaluate the design cases for decision-making to select the final design case. An example product is demonstrated in this presentation, showing that the model is useful for the integrated evaluation of forward design, reverse design, and green manufacturing toward a closed-loop design for the sustainable manufacturing objective.
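
The supermatrix step of the ANP can be sketched numerically (the fuzzy weighting is omitted and the relational weights below are invented): a column-stochastic supermatrix raised to successive powers converges to the limit priorities used to score the design cases:

```python
import numpy as np

W = np.array([[0.2, 0.5, 0.3],    # invented relational weights among the
              [0.5, 0.2, 0.4],    # forward-design, reverse-design, and
              [0.3, 0.3, 0.3]])   # green-manufacturing criteria groups
assert np.allclose(W.sum(axis=0), 1.0)    # columns must sum to one

limit = np.linalg.matrix_power(W, 100)    # limit supermatrix
print("limit priorities:", limit[:, 0])   # used to score the design cases
```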

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 652
8976 Facial Emotion Recognition Using Deep Learning

Authors: Ashutosh Mishra, Nikhil Goyal

Abstract:

A 3D facial emotion recognition model based on deep learning is proposed in this paper. Two convolution layers and a pooling layer are employed in the deep learning architecture, with pooling performed after the convolutions. The probabilities for the various classes of human faces are calculated using the sigmoid activation function. To verify the accuracy of the deep learning-based face recognition model, a set of faces from the Kaggle dataset is used. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite the significant gains in representational precision afforded by deep, nonlinear image representations.
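
A minimal PyTorch sketch matching the described architecture, with assumed 48x48 grayscale inputs and seven emotion classes; the sigmoid output follows the abstract, though softmax would be the more usual choice for mutually exclusive classes:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),   # first convolution layer
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),  # second convolution layer
    nn.MaxPool2d(2),                             # pooling: 48x48 -> 24x24
    nn.Flatten(),
    nn.Linear(32 * 24 * 24, 7),                  # seven emotion classes
    nn.Sigmoid(),                                # per-class probabilities
)
faces = torch.randn(4, 1, 48, 48)                # batch of grayscale crops
print(model(faces).shape)                        # torch.Size([4, 7])
```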

Keywords: facial recognition, computational intelligence, convolutional neural network, depth map

Procedia PDF Downloads 203
8975 Morphological Evaluation of Mesenchymal Stem Cells Derived from Adipose Tissue of Dog Treated with Different Concentrations of Nano-Hydroxy Apatite

Authors: K. Barbaro, F. Di Egidio, A. Amaddeo, G. Lupoli, S. Eramo, G. Barraco, D. Amaddeo, C. Gallottini

Abstract:

In this study, we evaluated the effects of nano-hydroxyapatite (NHA) on mesenchymal stem cells extracted from the subcutaneous adipose tissue of the dog. The stem cells were divided into 6 experimental groups with different concentrations of NHA, compared against a control group of stem cells grown in standard conditions without NHA. After 1 week, the cells were fixed with 10% buffered formalin for 1 hour at room temperature, stained with Giemsa, and examined under an inverted optical microscope. The morphological evaluation of the control and treated samples showed that the stem cells adhere to the substrate and proliferate in the presence of nano-hydroxyapatite at different concentrations, showing no detectable toxic effects.

Keywords: nano-hydroxy apatite, adipose mesenchymal stem cells, dog, morphological evaluation

Procedia PDF Downloads 453
8974 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network, were developed. The data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess prediction accuracy. Key risk factors were identified, and the models were compared to arrive at the best prediction model. Among these, the random forest model performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer's. For 90.42% of the testing inputs, at least four of the five models agreed on the diagnosis. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to early treatment of these patients.
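
A hedged sketch of the winning configuration, a random forest over demographic and MRI-derived features with a train/test split; the feature matrix below is synthetic, and the column names are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(5)
y = rng.integers(0, 2, 373)                  # demented / nondemented
X = rng.normal(size=(373, 6))                # e.g., MMSE, nWBV, gender, ...
X[:, 0] += 1.5 * y                           # inject some signal for the demo

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, rf.predict(X_te)))
print("importances:", rf.feature_importances_)
```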

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 122
8973 On Phase Based Stereo Matching and Its Related Issues

Authors: András Rövid, Takeshi Hashimoto

Abstract:

The paper focuses on the problem of point correspondence matching in stereo images. The proposed matching algorithm is based on the combination of simpler methods, such as the normalized sum of squared differences (NSSD), with a more complex phase correlation based approach, while also taking noise and other factors into account. The speed of NSSD and the preciseness of phase correlation together yield an efficient approach for finding the best candidate point with sub-pixel accuracy in stereo image pairs. The task of the NSSD in this case is to locate the candidate pixel roughly; afterwards, the location of the candidate is refined by an enhanced phase correlation based method which, in contrast to the NSSD, has to run only once for each selected pixel.
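
Both building blocks are short in practice; a sketch with synthetic patches, where NSSD does the coarse ranking and FFT-based phase correlation recovers the shift of the chosen candidate (the sub-pixel refinement of the correlation peak is elided):

```python
import numpy as np

def nssd(a, b):                           # SSD of zero-mean, unit-std patches
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return np.sum((a - b) ** 2)

def phase_correlation(a, b):              # signed circular shift between patches
    R = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    r = np.fft.ifft2(R / (np.abs(R) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(r), r.shape)
    N, M = r.shape
    return (dy - N if dy > N // 2 else dy, dx - M if dx > M // 2 else dx)

rng = np.random.default_rng(6)
left = rng.normal(size=(32, 32))
right = np.roll(left, 3, axis=1)          # simulate a 3-pixel disparity
print("NSSD:", nssd(left, right), "shift:", phase_correlation(left, right))
```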

Keywords: stereo matching, sub-pixel accuracy, phase correlation, SVD, NSSD

Procedia PDF Downloads 445
8972 A Unified Fitting Method for the Set of Unified Constitutive Equations for Modelling Microstructure Evolution in Hot Deformation

Authors: Chi Zhang, Jun Jiang

Abstract:

Constitutive equations are very important in finite element (FE) modeling, and the accuracy of the material constants in the equations has significant effects on the accuracy of the FE models. A wide range of constitutive equations is available; however, fitting the material constants could be complex and time-consuming due to the strong non-linearity and the interdependence of the constants. This work focuses on the development of a set of unified MATLAB programs for fitting the material constants in the constitutive equations efficiently. Users only need to supply experimental data in the required format and run the program; without modifying functions, precisely guessing initial values, or looking up parameters from previous works, they will be able to fit the material constants efficiently.
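The same workflow can be sketched in Python with scipy's curve_fit, here on an assumed power-law constitutive form sigma = K * eps^n * rate^m with synthetic data; the paper's unified equations and MATLAB programs are more elaborate, but the supply-data-and-run idea is the same:

```python
import numpy as np
from scipy.optimize import curve_fit

def flow_stress(X, K, n, m):              # sigma = K * eps^n * rate^m
    strain, rate = X
    return K * strain**n * rate**m

strain = np.tile(np.linspace(0.05, 0.5, 10), 3)
rate = np.repeat([0.01, 0.1, 1.0], 10)
sigma = flow_stress((strain, rate), 300.0, 0.2, 0.1)
sigma += np.random.default_rng(7).normal(0, 2, sigma.size)  # synthetic data

(K, n, m), _ = curve_fit(flow_stress, (strain, rate), sigma, p0=(100, 0.1, 0.05))
print("K, n, m:", K, n, m)
```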

Keywords: constitutive equations, FE modelling, MATLAB program, non-linear curve fitting

Procedia PDF Downloads 75
8971 Evaluation of Competency Training Effectiveness in Chosen Sales Departments

Authors: L. Pigon, S. Kot, J. K. Grabara

Abstract:

Nowadays, with organizations facing the challenges of increasing competitiveness, the human capital accumulated by an organization is one of the elements that strongly differentiates companies. Efficient management in a competitive environment requires managing employees' competencies so that they suit market fluctuations. The aim of the paper was to determine how training aimed at improving employee competencies is verified. The survey was conducted among 37 respondents involved in the selection of training providers and training programs in their enterprises. The results showed that all organizations use a training survey as the basic method for evaluating training effectiveness. Depending on the training content and organization, the questionnaires contain various questions. Most of these surveys are composed of three basic blocks: the trainer's assessment, the evaluation of the training content, and the assessment of the materials and the venue. None of the organizations conducted regular job-related observations or examined the attitudes of the training participants.

Keywords: human capital, competencies, training effectiveness, sales department

Procedia PDF Downloads 153