Search results for: accuracy improvement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7872

7812 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

Industrial application to classify gamma rays and neutron events is investigated in this study using deep machine learning. Identification using a convolutional neural network and a recursive neural network showed a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles aim to find patterns and relationships that represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. The nonlinear feature extraction performed by a neural network involves a variety of transformations and mathematical optimizations, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4. The readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, by combining the votes of many models, managed to improve the classification accuracy of the neural networks. Discriminating gamma and neutron events in a single prediction approach has shown high accuracy using deep learning. The findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models by hyperparameter optimization enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
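
To make the preprocessing stage concrete, the sketch below converts a simulated detector pulse into a windowed time-frequency map with the short-time Fourier transform, of the kind that could serve as one CNN training sample. The Hann window, sampling rate, and pulse shape are illustrative assumptions, not the authors' optimized choices.

```python
import numpy as np
from scipy import signal

def spectrogram_features(waveform, fs=1e6, nperseg=256, overlap=0.5):
    """Convert a simulated detector pulse into a time-frequency map.

    A Hann window is used here as one reasonable windowing choice;
    the paper optimizes the window function, which this sketch does not.
    """
    f, t, Sxx = signal.spectrogram(
        waveform, fs=fs,
        window="hann",
        nperseg=nperseg,
        noverlap=int(nperseg * overlap),
    )
    # Log-scale power and normalize so the CNN sees comparable ranges.
    Sxx = np.log1p(Sxx)
    return (Sxx - Sxx.mean()) / (Sxx.std() + 1e-9)

# Hypothetical example: a noisy pulse standing in for a Geant4-simulated
# event; the normal noise mirrors the simulated readout electronics.
rng = np.random.default_rng(0)
pulse = np.exp(-np.linspace(0, 5, 4096)) + rng.normal(0.0, 0.05, 4096)
features = spectrogram_features(pulse)
print(features.shape)  # one training sample for the classifier
```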

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 151
7811 Systematic Evaluation of Convolutional Neural Network on Land Cover Classification from Remotely Sensed Images

Authors: Eiman Kattan, Hong Wei

Abstract:

In using a Convolutional Neural Network (CNN) for classification, there is a set of hyperparameters available for configuration purposes. This study aims to evaluate the impact of a range of parameters in a CNN architecture, i.e., AlexNet, on land cover classification based on four remotely sensed datasets. The evaluation tests the influence of a set of hyperparameters on classification performance. The parameters concerned are epoch values, batch size, and convolutional filter size against input image size. Thus, a set of experiments was conducted to specify the effectiveness of the selected parameters using two implementation approaches, namely pre-trained and fine-tuned. We first explore the number of epochs under several selected batch size values (32, 64, 128 and 200). The impact of the kernel size of the convolutional filters (1, 3, 5, 7, 10, 15, 20, 25 and 30) was evaluated against the image sizes under testing (64, 96, 128, 180 and 224), which gave us insight into the relationship between the size of the convolutional filters and the image size. To generalise the validation, four remote sensing datasets, AID, RSD, UCMerced and RSCCN, which have different land covers and are publicly available, were used in the experiments. These datasets have a wide diversity of input data, such as the number of classes, the amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (NVIDIA DIGITS) was employed in the experiments. It has shown efficiency in both training and testing. The results have shown that increasing the number of epochs leads to a higher accuracy rate, as expected; however, the convergence state is highly related to the dataset. For the batch size evaluation, a larger batch size slightly decreases the classification accuracy compared to a small batch size. For example, selecting the value 32 as the batch size on the RSCCN dataset achieves an accuracy rate of 90.34% at the 11th epoch, while decreasing the epoch value to one makes the accuracy rate drop to 74%. On the other extreme, increasing the batch size to 200 yields an accuracy rate of 86.5% at the 11th epoch, and 63% when using one epoch only. On the other hand, the choice of kernel size is only loosely related to the dataset; from a practical point of view, the filter size 20 produces an accuracy of 70.4286%. The last experiment, on image size, shows that accuracy improves with larger input images; however, the performance gain comes at a considerable computational cost. These conclusions open opportunities toward better classification performance in various applications such as planetary remote sensing.
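
A minimal sketch of the kind of sweep described above, looping batch size and epochs over a small AlexNet-style CNN in Keras. The random arrays stand in for a remote sensing dataset, and the layer sizes are placeholders rather than the study's exact architecture.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_size, kernel_size, num_classes=21):
    """Toy AlexNet-style CNN exposing only the parameters under study."""
    return models.Sequential([
        layers.Input(shape=(input_size, input_size, 3)),
        layers.Conv2D(32, kernel_size, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, kernel_size, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(num_classes, activation="softmax"),
    ])

# Random stand-in for a remote sensing dataset (e.g. UCMerced, 21 classes).
X = np.random.rand(256, 64, 64, 3).astype("float32")
y = np.random.randint(0, 21, 256)

results = {}
for batch_size in (32, 64, 128, 200):
    for epochs in (1, 11):
        model = build_cnn(input_size=64, kernel_size=5)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        h = model.fit(X, y, batch_size=batch_size, epochs=epochs,
                      validation_split=0.2, verbose=0)
        results[(batch_size, epochs)] = h.history["val_accuracy"][-1]
print(results)
```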

Keywords: CNNs, hyperparameters, remote sensing, land cover, land use

Procedia PDF Downloads 170
7810 Using Machine Learning to Classify Different Body Parts and Determine Healthiness

Authors: Zachary Pan

Abstract:

Our general mission is to solve the problem of classifying images into different body part types and deciding whether each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest: we will detect pneumonia in X-ray scans of those chest images. With this type of AI, doctors can use it as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split it into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Then, using the test set, we can obtain a realistic estimate of the accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply more sophisticated algorithms to the models, like multiplicative weight update. For the second part of the problem, determining if the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split it into test and training sets. We then train another neural network on those training set images and use the testing set to measure its accuracy. We do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification. In classifying the images, the logistic regression model, the neural network, the neural network with multiplicative weight update, the neural network with the black box algorithm, and the convolutional neural network achieved 96.83, 97.33, 97.83, 96.67, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines whether the images are healthy or not is around 78.37 percent.
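
As a sketch of the first stage, the snippet below shows the train/test split and a logistic regression baseline on placeholder arrays; the random data stands in for the body parts dataset and is labeled as such in the comments.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# X: flattened grayscale images, y: body part labels.  These random
# arrays are placeholders for the body parts dataset described above.
rng = np.random.default_rng(42)
X = rng.random((600, 64 * 64))
y = rng.integers(0, 6, 600)   # six body part classes

# Hold out a test set so reported accuracy reflects unseen images.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```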

Keywords: body part, healthcare, machine learning, neural networks

Procedia PDF Downloads 109
7809 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since either can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases of the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
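
The following sketch illustrates the two statistical steps named above, the Laplace trend test and Weibull curve fitting, on hypothetical failure times; the data and observation window are invented for illustration only.

```python
import numpy as np
from scipy import stats

# Hypothetical failure times (days) at which location accuracy exceeded
# its specification; T is the end of the observation window.
failure_times = np.array([12., 30., 55., 140., 300., 520., 800., 1100.])
T = 1200.0

# Laplace trend test: U << 0 suggests reliability growth (infant
# mortality fading), U >> 0 suggests wear-out, U near 0 a stable phase.
n = len(failure_times)
U = (failure_times.mean() - T / 2) / (T * np.sqrt(1.0 / (12 * n)))
print(f"Laplace statistic U = {U:.2f}")

# Weibull fit on the (assumed) operational-phase data; the shape
# parameter beta indicates the life phase: beta < 1 infant mortality,
# beta ~ 1 random failures, beta > 1 wear-out.
beta, loc, eta = stats.weibull_min.fit(failure_times, floc=0)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} days")

# Reliability at time t under the fitted Weibull model.
t = 365.0
print("R(1 year) =", np.exp(-(t / eta) ** beta))
```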

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 65
7808 The Impact of Human Rights Violation in Modern Society

Authors: Hanania Nasan Shokry Abdelmasih

Abstract:

The interface between development and human rights has long been the subject of scholarly debate. As a result, a set of principles, ranging from the right to development to a human rights-based approach to development, has been adopted to understand the dynamics between the two concepts. In spite of these attempts, the precise link between development and human rights is not yet fully understood. However, the inevitable interdependence between the two concepts, and the idea that development efforts must be made while respecting human rights, have gained prominence in recent years. At the same time, the emergence of sustainable development as a widely accepted approach in development goals and policies further complicates this unresolved convergence. The place of sustainable development within the human rights discourse, and its role in ensuring the sustainability of development programs, require systematic research. The purpose of this article is, therefore, to examine the relationship between development and human rights, with particular attention to the place of the principles of sustainable development in international human rights law. It will examine whether that law recognizes a right to sustainable development. The article argues that the principles of sustainable development are recognized, directly or implicitly, in numerous human rights instruments, which is an affirmative answer to the question posed above. The article therefore scrutinizes international and regional human rights instruments, as well as the case law and interpretations of human rights bodies, to support this hypothesis.

Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security

Procedia PDF Downloads 36
7807 Discussion as a Means to Improve Peer Assessment Accuracy

Authors: Jung Ae Park, Jooyong Park

Abstract:

Writing is an important learning activity that cultivates higher-level thinking. Effective and immediate feedback is necessary to help improve students' writing skills. Peer assessment can be an effective method in writing tasks because it makes it possible for students not only to receive quick feedback on their writing but also to get a chance to examine different perspectives on the same topic. Peer assessment can be practiced frequently and has the advantage of immediate feedback. However, there is controversy about the accuracy of peer assessment. In this study, we tried to demonstrate experimentally how the accuracy of peer assessment could be improved. Participants (n = 76) were randomly assigned to groups of 4 members. All the participants graded two sets of 4 essays on the same topic. They graded the first set twice, and the second set, or the posttest, once. After the first grading of the first set, each group in experimental condition 1 (the discussion group) was asked to discuss the results of the peer assessment and then to grade the essays again. Each group in experimental condition 2 (the reading group) was asked to read an expert's assessment of each essay and then to grade the essays again. In the control group, the participants were asked to grade the 4 essays twice in different orders. Afterwards, all the participants graded the second set of 4 essays. The mean score from the 4 participants was calculated for each essay. The accuracy of the peer assessment was measured by the Pearson correlation with the scores of the expert. The results were analyzed by two-way repeated measures ANOVA. A main effect of grading was observed: grading accuracy improved as the number of grading experiences increased. Analysis of posttest accuracy revealed that the score variations within a group of 4 participants decreased in both the discussion and reading conditions, but not in the control condition. These results suggest that having students discuss their grading together can be an efficient means to improve peer assessment accuracy. By discussing, students can learn from others what to consider in grading and whether their grading is too strict or too lenient. Further research is needed to examine the exact cause of the improvement in grading accuracy.
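
A small sketch of the accuracy measure described above: the Pearson correlation between a group's mean essay scores and the expert's scores, plus the within-group score variation. All scores shown are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical scores: rows are the 4 essays of one set, columns the
# 4 group members; `expert` holds the expert's scores for those essays.
peer = np.array([[7, 8, 6, 7],
                 [4, 5, 5, 4],
                 [9, 8, 9, 9],
                 [6, 5, 7, 6]], dtype=float)
expert = np.array([7.5, 4.0, 9.0, 6.0])

group_mean = peer.mean(axis=1)           # mean score per essay
r, p = stats.pearsonr(group_mean, expert)
print(f"accuracy (Pearson r vs. expert) = {r:.2f} (p = {p:.3f})")

# Score variation within the group, per essay; discussion is expected
# to shrink these spreads.
print("within-group SD per essay:", peer.std(axis=1, ddof=1))
```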

Keywords: peer assessment, evaluation accuracy, discussion, score variations

Procedia PDF Downloads 267
7806 Evaluation of Ensemble Classifiers for Intrusion Detection

Authors: M. Govindarajan

Abstract:

One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing, and their performance is analyzed in terms of accuracy. A classifier ensemble is designed using a Radial Basis Function (RBF) network and a Support Vector Machine (SVM) as base classifiers. The feasibility and benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The main originality of the proposed approach rests on three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to the performance of other standard homogeneous and heterogeneous ensemble methods; the standard homogeneous ensemble methods include error-correcting output codes (ECOC) and Dagging, and the heterogeneous ensemble methods include majority voting and stacking. The proposed ensemble methods provide a significant improvement in accuracy compared to individual classifiers: the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Also, heterogeneous models exhibit better results than homogeneous models on standard intrusion detection datasets.
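
As a sketch of the homogeneous ensemble idea, the snippet below bags SVM base classifiers with RBF kernels using scikit-learn; the synthetic data stands in for a standard intrusion detection dataset, and this is not the authors' exact pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for an intrusion detection dataset (features extracted from
# network connections); labels would be normal vs. attack.
X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Homogeneous ensemble: bagging over SVM base classifiers with RBF kernels.
bagged_svm = make_pipeline(
    StandardScaler(),
    BaggingClassifier(SVC(kernel="rbf"), n_estimators=10, random_state=0),
)
bagged_svm.fit(X_tr, y_tr)
print("bagged SVM accuracy:", bagged_svm.score(X_te, y_te))
```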

Keywords: data mining, ensemble, radial basis function, support vector machine, accuracy

Procedia PDF Downloads 249
7805 Software-Defined Architecture and Front-End Optimization for DO-178B Compliant Distance Measuring Equipment

Authors: Farzan Farhangian, Behnam Shakibafar, Bobda Cedric, Rene Jr. Landry

Abstract:

Many air navigation technologies are capable of increasing aviation sustainability as well as improving accuracy in Alternative Positioning, Navigation, and Timing (APNT), especially avionics Distance Measuring Equipment (DME), Very high-frequency Omni-directional Range (VOR), etc. The integration of these air navigation solutions could provide robust and efficient accuracy for air mobility, air traffic management and autonomous operations. Designing a proper RF front-end, power amplifier and software-defined transponder could pave the way for an optimized avionics navigation solution. In this article, the possibility of reaching an optimal front-end to be used with a single low-cost Software-Defined Radio (SDR) has been investigated in order to arrive at a software-defined DME architecture. Our software-defined approach uses the firmware possibilities to design a real-time software architecture compatible with a Multiple Input Multiple Output (MIMO) BladeRF to estimate an accurate time delay between the transmission (Tx) and reception (Rx) channels using synchronous scheduled communication. We designed a novel power amplifier for the transmission channel of the DME to meet the minimum transmission power requirement. This article also investigates the design of proper pulse pairs based on the DO-178B avionics standard. Various guidelines have been tested, and the possibility of passing the certification process for each standard term has been analyzed. Finally, the performance of the DME was tested in a laboratory environment using an IFR6000, which showed that the proposed architecture reached an accuracy of less than 0.23 nautical miles (Nmi) with 98% probability.
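
The core timing idea, estimating the delay between synchronized Tx and Rx channels, can be sketched with a plain cross-correlation; the pulse-pair shape, sample rate, and delay below are illustrative, not the implemented firmware.

```python
import numpy as np

def estimate_delay(tx, rx, fs):
    """Estimate Tx-to-Rx time delay by peak cross-correlation.

    A generic delay estimator, not the authors' firmware; it only
    illustrates the principle of timing between synchronized channels.
    """
    corr = np.correlate(rx, tx, mode="full")
    lag = np.argmax(corr) - (len(tx) - 1)
    return lag / fs

# Hypothetical DME-like Gaussian pulse pair (12 us spacing), 40 MHz sampling.
fs = 40e6
t = np.arange(0, 60e-6, 1 / fs)
sigma = 3.5e-6 / 2.355                          # ~3.5 us pulse width (FWHM)
pulse = np.exp(-((t - 10e-6) ** 2) / (2 * sigma ** 2))
pulse += np.exp(-((t - 22e-6) ** 2) / (2 * sigma ** 2))

true_delay = 25e-6                              # delay to recover
rx = np.roll(pulse, int(true_delay * fs)) + np.random.normal(0, 0.01, t.size)
print(f"estimated delay: {estimate_delay(pulse, rx, fs) * 1e6:.2f} us")
```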

Keywords: avionics, DME, software defined radio, navigation

Procedia PDF Downloads 80
7804 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with a varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, which are enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data is useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
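
A minimal two-fidelity sketch of the recursive coupling idea (without the gradient enhancement) using ordinary Gaussian process regression: fit the low-fidelity data, then model the scaled discrepancy of the sparse high-fidelity data. The toy functions and fixed scaling factor are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical cheap low-fidelity and expensive high-fidelity functions.
lo = lambda x: np.sin(8 * x)
hi = lambda x: 1.2 * np.sin(8 * x) + 0.3 * x

X_lo = np.linspace(0, 1, 25).reshape(-1, 1)   # many cheap samples
X_hi = np.linspace(0, 1, 6).reshape(-1, 1)    # few expensive samples

# Step 1: Kriging model of the low-fidelity data.
gp_lo = GaussianProcessRegressor(RBF(0.1)).fit(X_lo, lo(X_lo).ravel())

# Step 2: recursive step -- model the discrepancy between high-fidelity
# data and the scaled low-fidelity prediction (Kennedy-O'Hagan style).
rho = 1.2   # scaling factor; estimated in practice, fixed here for brevity
delta = hi(X_hi).ravel() - rho * gp_lo.predict(X_hi)
gp_delta = GaussianProcessRegressor(RBF(0.1)).fit(X_hi, delta)

# Combined variable-fidelity prediction at new points.
X_new = np.linspace(0, 1, 200).reshape(-1, 1)
y_vf = rho * gp_lo.predict(X_new) + gp_delta.predict(X_new)
print("max abs error:", np.abs(y_vf - hi(X_new).ravel()).max())
```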

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 558
7803 Ground Improvement with Basal Reinforcement with High Strength Geogrids and PVDs for Embankment over Soft Soils

Authors: Ratnakar Mahajan, Matteo Lelli, Kinjal Parmar

Abstract:

Ground improvement is a very important aspect of infrastructure development, especially when it comes to deep ground improvement. The use of various geosynthetic applications is very common these days for ground improvement. This paper presents a case study in which a combination of two geosynthetic applications was used in order to optimize the design as well as to control settlements through uniform load distribution. The Agartala-Akaura rail project aims to improve railway connectivity between India and Bangladesh, and both countries have started construction. The project requires high railway embankments to be built for the rail link. The challenge, however, was to design a proper ground improvement solution, as the entire area comprises very soft soil to an average depth of 15 m. After due diligence, a combination of two methods was worked out by Maccaferri: PVDs were provided for consolidation, and on top of them, a layer of high-strength geogrids (Paralink) was proposed as basal reinforcement. The design approach followed Indian as well as British standards. By introducing basal reinforcement, the spacing of the PVDs could be increased, which allowed quicker installation and lower material consumption while keeping the consolidation time within the project duration.

Keywords: ground improvement, basal reinforcement, PVDs, high strength geogrids, Paralink

Procedia PDF Downloads 75
7802 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models

Authors: Benbiao Song, Yan Gao, Zhuo Liu

Abstract:

Geostatistical modeling is the key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, but few studies have been done to quantify the factors impacting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different degrees of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models with different methodologies, including SIS, object-based and MPFS algorithms, accompanied by different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling condition and parameter association. In total, 5760 simulations were run to quantify the relative contribution of each factor to simulation accuracy, and the results can be used as a strategy guide for facies modeling under similar conditions. It was found that data density, geological trend and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when the channel sand width reaches 1.5 times the well spacing, under whatever condition, with the SIS and MPFS methods. When well density is low, the contribution of the geological trend may increase the modeling accuracy from 40% to 70%, while the use of a proper variogram may make only a very limited contribution for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model, whereas when geobodies are complex and data are insufficient, it is better to construct a set of robust geological trends than to rely on a reliable variogram function. For the object-based method, the modeling accuracy does not increase with data density as obviously as for the SIS method, but the models keep a rational appearance when data density is low. The MPFS method shows a similar trend to the SIS method, but using a proper geological trend together with a rational variogram may deliver better modeling accuracy. This implies that the geological modeling strategy for a real reservoir case needs to be optimized by evaluating the dataset, the geological complexity, the geological constraint information and the modeling objective.
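
The accuracy ratio itself is simple to state in code: the fraction of grid cells where a realization reproduces the prototype facies, averaged over realizations. The facies grids below are hypothetical.

```python
import numpy as np

def accuracy_ratio(realizations, prototype):
    """Fraction of grid cells whose simulated facies match the prototype,
    averaged over realizations, mirroring the paper's accuracy measure."""
    matches = [(r == prototype).mean() for r in realizations]
    return float(np.mean(matches))

# Hypothetical 2D facies grids (0 = floodplain, 1 = channel sand).
rng = np.random.default_rng(1)
prototype = (rng.random((100, 100)) < 0.3).astype(int)
realizations = [np.where(rng.random(prototype.shape) < 0.9,
                         prototype, 1 - prototype)   # ~90% cell agreement
                for _ in range(10)]
print(f"modeling accuracy ratio: {accuracy_ratio(realizations, prototype):.2f}")
```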

Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram

Procedia PDF Downloads 264
7801 RNA Interference Technology as a Veritable Tool for Crop Improvement and Breeding for Biotic Stress Resistance

Authors: M. Yusuf

Abstract:

The recent discovery of the phenomenon of RNA interference has led to its application in various aspects of plant improvement. Crops can be modified by engineering novel RNA interference pathways that create small RNA molecules to alter gene expression in crops or plant pests. RNA interference can generate new crop quality traits or provide protection against insects, nematodes and pathogens without introducing new proteins into food and feed products. This is an advantage in contrast with conventional procedures of gene transfer. RNA interference has been used to develop crop varieties resistant to diseases, pathogens and insects. Male sterility has been engineered in plants using RNA interference. Better quality crops have been developed through the application of RNA interference etc. The objective of this paper is to highlight the application of RNA interference in crop improvement and to project its potential future use to solve problems of agricultural production in relation to plant breeding.

Keywords: RNA interference, application, crop improvement, agricultural production

Procedia PDF Downloads 427
7800 One-Shot Text Classification with Multilingual-BERT

Authors: Hsin-Yang Wang, K. M. A. Salam, Ying-Jia Lin, Daniel Tan, Tzu-Hsuan Chou, Hung-Yu Kao

Abstract:

Detecting user intent from natural language expression has a wide variety of use cases in different natural language processing applications. Recently, few-shot training has seen a spike in usage in commercial domains. Due to the lack of significant sample features, downstream task performance has been limited or unstable across different domains. As a state-of-the-art method, the pre-trained BERT model, which gathers sentence-level information from a large text corpus, shows improvements on several NLP benchmarks. In this research, we propose a method that changes multi-class classification tasks into binary classification tasks and then uses the confidence score to rank the results. As a language model, BERT performs well on sequence data. In our experiment, we change the objective from predicting labels to finding the relations between words in sequence data. Our proposed method achieved 71.0% accuracy on the internal intent detection dataset and 63.9% accuracy on the HuffPost dataset. Acknowledgment: This work was supported by NCKU-B109-K003, a collaboration between National Cheng Kung University, Taiwan, and SoftBank Corp., Tokyo.
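
A sketch of the reformulation: score each (utterance, candidate label) pair as a binary relation and rank by confidence. The checkpoint is the public multilingual BERT and the texts are placeholders; the real system would fine-tune this two-label head before the scores become meaningful.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Instead of predicting one of N intents directly, pair the utterance
# with every candidate label description and score "related / not
# related", then rank candidates by the "related" confidence.
name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)
model.eval()

utterance = "Can I change my delivery address?"      # placeholder input
candidates = ["update shipping information",          # placeholder intents
              "track my order",
              "cancel order"]

scores = []
with torch.no_grad():
    for label in candidates:
        enc = tokenizer(utterance, label, return_tensors="pt", truncation=True)
        logits = model(**enc).logits
        scores.append(torch.softmax(logits, dim=-1)[0, 1].item())  # P(related)

ranked = sorted(zip(candidates, scores), key=lambda p: -p[1])
print(ranked[0])  # best-matching intent under this (untrained) sketch
```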

Keywords: OSML, BERT, text classification, one shot

Procedia PDF Downloads 101
7799 Enhancing Patient Outcomes Through Quality Improvement: Reducing Contamination Rates in Karyotyping Samples via Effective Audits and Staff Engagement

Authors: Rofaida Ashour

Abstract:

This study discusses the implementation of quality improvement initiatives aimed at reducing contamination rates in cultured karyotyping samples. The primary objective was to enhance patient outcomes through systematic audits and targeted staff engagement. Recognizing the critical impact of sample integrity on diagnostic accuracy, a thorough analysis was conducted to identify the root causes of contamination. The project involved two audit cycles, which facilitated a comprehensive assessment of adherence to local protocols. Key issues identified included lapses in the use of personal protective equipment (PPE) and inadequate awareness of proper sample handling procedures among staff. To address these challenges, a multi-faceted approach was adopted. Firstly, a presentation was delivered to the laboratory team emphasizing the significance of strict adherence to PPE guidelines during the collection and handling of samples. This session aimed to raise awareness and foster a culture of safety within the unit. Additionally, informative posters illustrating the correct procedures were strategically placed around the laboratory to serve as ongoing visual reminders for staff. Recognizing the heightened risk associated with patients exhibiting fever or signs of infection, special measures were introduced to manage their sample collection. These proactive strategies were designed to minimize the likelihood of introducing contaminated samples into the culture process. The results of the audits demonstrated a significant reduction in contamination rates, underscoring the effectiveness of the interventions. This experience reinforced the importance of continuous quality improvement in healthcare settings, particularly in ensuring the delivery of high-quality, safe, and efficient services. Conducting regular audits not only provided valuable insights into operational practices but also highlighted the critical role of active team engagement and a data-driven approach in decision-making. Effective communication and collaboration among team members emerged as essential components for the success of quality improvement initiatives.

Keywords: quality improvement, contamination rates, karyotyping samples, healthcare protocols, staff engagement

Procedia PDF Downloads 15
7798 A Conv-Long Short-term Memory Deep Learning Model for Traffic Flow Prediction

Authors: Ali Reza Sattarzadeh, Ronny J. Kutadinata, Pubudu N. Pathirana, Van Thanh Huynh

Abstract:

Traffic congestion has become a severe worldwide problem, affecting everyday life, fuel consumption, time, and air pollution. The primary causes of these issues are inadequate transportation infrastructure, poor traffic signal management, and a rising population. Traffic flow forecasting is one of the essential and effective methods in urban congestion and traffic management, and it has attracted the attention of researchers. With the development of technology, undeniable progress has been achieved by existing methods. However, there remains room for improvement in extracting the temporal and spatial features that determine the importance of traffic flow sequences. In the proposed model, we combine convolutional neural network (CNN) and long short-term memory (LSTM) deep learning models to mine nonlinear correlations and evaluate their effectiveness in increasing the accuracy of traffic flow prediction on a real dataset. According to the experiments, the results indicate that implementing Conv-LSTM networks increases the productivity and accuracy of deep learning models for traffic flow prediction.
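
A minimal Conv-LSTM stack of the kind described, in Keras: convolutions extract local patterns from a window of past readings and an LSTM models the longer-range temporal dependencies. Window length and layer sizes are illustrative, not the paper's configuration.

```python
from tensorflow.keras import layers, models

WINDOW = 12        # e.g. 12 past 5-minute flow readings (assumed)
N_SENSORS = 1      # univariate flow for simplicity

model = models.Sequential([
    layers.Input(shape=(WINDOW, N_SENSORS)),
    # Conv1D layers capture short-range patterns in the flow window.
    layers.Conv1D(64, kernel_size=3, padding="causal", activation="relu"),
    layers.Conv1D(32, kernel_size=3, padding="causal", activation="relu"),
    # LSTM captures longer-range temporal dependencies.
    layers.LSTM(64),
    layers.Dense(1),                     # next-interval flow
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.summary()
# model.fit(X, y, ...) with X shaped (samples, WINDOW, N_SENSORS)
```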

Keywords: deep learning algorithms, intelligent transportation systems, spatiotemporal features, traffic flow prediction

Procedia PDF Downloads 173
7797 Improving Axial-Attention Network via Cross-Channel Weight Sharing

Authors: Nazmul Shahadat, Anthony S. Maida

Abstract:

In recent years, hypercomplex-inspired neural networks have improved deep CNN architectures due to their ability to share weights across input channels and thus improve the cohesiveness of representations within the layers. The work described herein studies the effect of replacing existing layers in an Axial Attention ResNet with their quaternion variants, which use cross-channel weight sharing, to assess the effect on image classification. We expect the quaternion enhancements to produce improved feature maps with more interlinked representations. We experiment with the stem of the network, the bottleneck layer, and the fully connected backend by replacing them with quaternion versions. These modifications lead to novel architectures which yield improved accuracy on the ImageNet300k classification dataset. Our baseline networks for comparison were the original real-valued ResNet, the original quaternion-valued ResNet, and the Axial Attention ResNet. Since improvement was observed regardless of which part of the network was modified, there is promise that this technique may be generally useful in improving classification accuracy for a large class of networks.
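
The cross-channel weight sharing can be sketched as a quaternion linear layer: channels are split into four components and one set of four small weight matrices is reused through the Hamilton product, giving roughly a 4x parameter reduction. This is a sketch of the principle in PyTorch, not the paper's exact layers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuaternionLinear(nn.Module):
    """Linear layer with quaternion (Hamilton product) weight sharing.

    Channels split into four components (r, i, j, k); the same four
    small matrices serve every component, so parameters drop ~4x.
    """
    def __init__(self, in_features, out_features):
        super().__init__()
        assert in_features % 4 == 0 and out_features % 4 == 0
        fi, fo = in_features // 4, out_features // 4
        self.r = nn.Parameter(torch.randn(fo, fi) * 0.05)
        self.i = nn.Parameter(torch.randn(fo, fi) * 0.05)
        self.j = nn.Parameter(torch.randn(fo, fi) * 0.05)
        self.k = nn.Parameter(torch.randn(fo, fi) * 0.05)

    def forward(self, x):
        r, i, j, k = self.r, self.i, self.j, self.k
        # Block matrix realizing the Hamilton product w * x.
        W = torch.cat([
            torch.cat([r, -i, -j, -k], dim=1),
            torch.cat([i,  r, -k,  j], dim=1),
            torch.cat([j,  k,  r, -i], dim=1),
            torch.cat([k, -j,  i,  r], dim=1),
        ], dim=0)
        return F.linear(x, W)

layer = QuaternionLinear(64, 128)
print(layer(torch.randn(8, 64)).shape)   # torch.Size([8, 128])
```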

Keywords: axial attention, representational networks, weight sharing, cross-channel correlations, quaternion-enhanced axial attention, deep networks

Procedia PDF Downloads 84
7796 Use of Six-sigma Concept in Discrete Manufacturing Industry

Authors: Ignatio Madanhire, Charles Mbohwa

Abstract:

Efficiency in manufacturing is critical to raising the value of exports so as to trade gainfully on regional and international markets. Continuous improvement strategies available to manufacturing entities seem to be increasingly popular, but this research study established that a similar popularity has not been accorded to the Six Sigma methodology. This work was thus conducted to investigate the applicability, effectiveness, usefulness, and suitability of the Six Sigma methodology as a competitiveness option for a discrete manufacturing entity. The development of a Six Sigma center in the country, with continuous improvement information, would go a long way toward benefiting the entire industry.

Keywords: discrete manufacturing, six-sigma, continuous improvement, efficiency, competitiveness

Procedia PDF Downloads 466
7795 A Development of Portable Intrinsically Safe Explosion-Proof Type of Dual Gas Detector

Authors: Sangguk Ahn, Youngyu Kim, Jaheon Gu, Gyoutae Park

Abstract:

In this paper, we developed a dual gas leak instrument to detect hydrocarbon (HC) and carbon monoxide (CO) gases. Detecting two kinds of gases requires a compact sensor structure and careful design of the sensing circuits for measurement, amplification, and filtering. The instrument should then be programmed with robust, systematic, and modular coding methods. Central to these, improving accuracy and initial response time is of vital importance. To manufacture a distinguished gas leak detector, we applied an intrinsically safe explosion-proof structure to the lithium-ion battery, the main circuits, a pump with its motor, the color LCD interfaces, and the sensing circuits. In software, to enhance measuring accuracy, we used numerical methods such as Lagrange and Neville interpolation. Performance testing was conducted using standard methane at seven different concentrations and compared with three other products. We aim to improve risk prevention and the efficiency of gas safety management by distributing the detector to the field of gas safety. Acknowledgment: This study was supported by the Small and Medium Business Administration under the research theme ‘Commercialized Development of a portable intrinsically safe explosion-proof type dual gas leak detector’ (task number S2456036).
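
Neville's algorithm, one of the interpolation methods named above, is compact enough to show in full; the calibration points below (ADC counts versus gas concentration) are hypothetical.

```python
def neville(xs, ys, x):
    """Neville's algorithm: evaluate the interpolating polynomial at x.

    One way to linearize a gas sensor's response curve, sketching the
    numerical methods mentioned above; calibration data is hypothetical.
    """
    n = len(xs)
    p = list(ys)
    for level in range(1, n):
        for idx in range(n - level):
            p[idx] = ((x - xs[idx + level]) * p[idx] +
                      (xs[idx] - x) * p[idx + 1]) / (xs[idx] - xs[idx + level])
    return p[0]

# Hypothetical calibration: sensor ADC readings vs. known CH4 concentrations.
adc = [120, 410, 760, 1180, 1650]           # raw sensor counts
ppm = [0, 2500, 5000, 7500, 10000]          # reference gas concentrations
print(neville(adc, ppm, 900))               # concentration for a new reading
```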

Keywords: gas leak, dual gas detector, intrinsically safe, explosion proof

Procedia PDF Downloads 228
7794 Dimensional Accuracy of CNTs/PMMA Parts and Holes Produced by Laser Cutting

Authors: A. Karimzad Ghavidel, M. Zadshakouyan

Abstract:

Laser cutting is a very common production method for cutting 2D polymeric parts. The development of polymer composites with nano-fibers makes their other properties, such as laser workability, important. The aim of this research is to investigate the influence of different laser cutting conditions on the dimensional accuracy of parts and holes made from poly methyl methacrylate (PMMA)/carbon nanotube (CNT) material. Experiments were carried out considering CNT content (at four levels: 0, 0.5, 1, and 1.5 wt.%), laser power (60, 80, and 100 W), and cutting speed (20, 30, and 40 mm/s) as input variable factors. The results reveal that adding CNTs improves the laser workability of PMMA and that increasing power has a significant effect on the part and hole size. The findings also show that cutting speed is an effective parameter for size accuracy. Finally, statistical analysis of the results was performed, and the regression equations are presented for determining the relation between the input and output factors.
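
The regression step can be sketched as follows with a quadratic response-surface fit; the design table and measured errors are placeholders illustrating the method, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical design-of-experiments table: CNT wt.%, laser power (W),
# cutting speed (mm/s) -> measured dimensional error (mm).
X = np.array([[0.0, 60, 20], [0.5, 60, 30], [1.0, 80, 20], [1.5, 80, 40],
              [0.0, 100, 30], [0.5, 100, 40], [1.0, 60, 40], [1.5, 100, 20]])
y = np.array([0.21, 0.17, 0.15, 0.12, 0.25, 0.19, 0.18, 0.16])

# Quadratic regression relating the input factors to dimensional error.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
print("predicted error at 1 wt.%, 80 W, 30 mm/s:",
      model.predict([[1.0, 80, 30]])[0])
```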

Keywords: dimensional accuracy, PMMA, CNTs, laser cutting

Procedia PDF Downloads 307
7793 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date

Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian

Abstract:

To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) to each patient admitted to the hospital. This assignment of the EDD is largely based on the doctor's judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best-performing model was found to predict the EDD with a 38% reduction in Average Squared Error (ASE) compared to the first EDD determined by the present method. Important predictors of the EDD were found to include the provisional diagnosis code, the patient's age, the attending doctor at admission, the medical specialty at admission, the accommodation type, and the mean length of stay of the patient over the past year. The predictive model can be used as a tool to accurately predict the EDD.
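
A sketch of such a model: one-hot encode the categorical admission features named above and fit a gradient-boosted regressor for length of stay. The records and column names are hypothetical stand-ins for the hospital data.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical admission records with the predictor types named above.
df = pd.DataFrame({
    "diagnosis_code":   ["J18", "I50", "J18", "E11", "I50", "J18"],
    "age":              [67, 80, 54, 71, 63, 59],
    "specialty":        ["RESP", "CARD", "RESP", "ENDO", "CARD", "RESP"],
    "accommodation":    ["B2", "C", "B2", "A1", "C", "B2"],
    "mean_los_past_yr": [4.0, 9.5, 3.0, 6.0, 8.0, 2.5],
    "los_days":         [5, 12, 3, 7, 10, 4],    # target: length of stay
})

categorical = ["diagnosis_code", "specialty", "accommodation"]
pipe = Pipeline([
    ("prep", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"), categorical)],
        remainder="passthrough")),
    ("model", GradientBoostingRegressor(random_state=0)),
])
pipe.fit(df.drop(columns="los_days"), df["los_days"])
# Predicted LOS plus the admission date gives a data-driven EDD.
print(pipe.predict(df.drop(columns="los_days").head(1)))
```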

Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven

Procedia PDF Downloads 174
7792 Improvement of Bearing Capacity of Soft Clay Using Geo-Cells

Authors: Siddhartha Paul, Aman Harlalka, Ashim K. Dey

Abstract:

Soft clayey soil possesses poor bearing capacity and high compressibility, because of which foundations cannot be placed directly over soft clay. Normally, pile foundations are constructed to carry the load through the soft soil to the hard stratum below. Pile construction is costly and time consuming. In order to improve the properties of soft clay, many ground improvement techniques, like stone columns and preloading with and without sand drains/band drains, are in vogue. Time is a constraint for the successful application of these improvement techniques. Another way to improve the bearing capacity of soft clay, and to reduce the settlement possibility, is to apply geocells below the foundation. The geocells impart rigidity to the foundation soil, reduce the net load intensity on the soil, and thus reduce the compressibility. A well-designed geocell-reinforced soil may replace a pile foundation. The present paper deals with the applicability of geocells to the improvement of bearing capacity. It is observed that a properly designed geocell may increase the bearing capacity of soft clay up to two and a half times.

Keywords: bearing capacity, geo-cell, ground improvement, soft clay

Procedia PDF Downloads 322
7791 Decision Making about the Environmental Management Implementation: Incentives and Expectations

Authors: Eva Štěpánková

Abstract:

Environmental management implementation is presently one of the ways to achieve organizational success and value improvement. Organizations' increasing motivation to introduce environmental measures is caused primarily by the rising pressure of society, which generates various incentives to strive for improved environmental performance. The aim of the paper is to identify and characterize the key incentives and expectations leading organizations to implement environmental management. The author focuses on five businesses of different sizes and fields operating in the Czech Republic. A qualitative approach and the grounded theory procedure are used in the research. The results point out that the significant incentives for environmental management implementation are primarily the demands of customers, the opportunity to declare environmental commitment, and image improvement. The researched enterprises less commonly expect economic benefits, increased competitive advantage, or export rate improvement. The results show that marketing contributions are primarily expected from environmental management implementation.

Keywords: environmental management, environmental management system, ISO 14001, Czech Republic

Procedia PDF Downloads 386
7790 Shear Reinforcement of Stone Columns During Soil Liquefaction

Authors: Zeineb Ben Salem, Wissem Frikha, Mounir Bouassida

Abstract:

The aim of this paper is to assess the effectiveness of stone columns as a liquefaction countermeasure, focusing on the shear reinforcement benefit. In fact, stone columns, which have a high shear modulus relative to the surrounding soil, can potentially carry higher shear stress levels. Thus, stone columns provide shear reinforcement and decrease the Cyclic Shear Stress Ratio (CSR) to which the treated soils would be subjected during an earthquake. In order to quantify the level of shear stress reduction in the reinforced soil, several approaches have been developed. Nevertheless, the available approaches do not take into account the improvement of the soil parameters, mainly the shear modulus, due to stone column installation. Indeed, in situ control tests carried out before and after the installation of stone columns, based upon data collected from 24 case histories, have given evidence of the improvement of the existing soil properties. In this paper, an assessment of the shear reinforcement of stone columns that accounts for such improvement of the soil parameters due to stone column installation is investigated. Comparative results indicate that considering the improvement effects considerably affects the assessment of shear reinforcement in the liquefaction analysis of soil reinforced by stone columns.
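
For orientation, one classical idealization of the shear reinforcement effect is the equal-shear-strain reduction factor sketched below; it is a textbook simplification, not the improved method proposed in this paper, and the treatment values are hypothetical.

```python
def shear_reduction_factor(area_ratio, modulus_ratio):
    """Equal-shear-strain estimate of the shear stress carried by the soil.

    Assuming soil and column deform with the same shear strain, force
    equilibrium gives
        tau_soil / tau_avg = 1 / (1 + a_r * (G_r - 1)),
    where a_r is the area replacement ratio and G_r = G_column / G_soil.
    A classical idealization, not the method proposed above; it also
    ignores the installation-induced improvement the paper accounts for.
    """
    return 1.0 / (1.0 + area_ratio * (modulus_ratio - 1.0))

# Hypothetical treatment: 15% replacement ratio, column 8x stiffer in shear.
K = shear_reduction_factor(0.15, 8.0)
print(f"soil CSR reduced to {K:.0%} of the untreated value")
```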

Keywords: stone column, liquefaction, shear reinforcement, CSR, soil improvement

Procedia PDF Downloads 153
7789 Measurement of IMRT Dose Distribution in Rando Head and Neck Phantom using EBT3 Film

Authors: Pegah Safavi, Mehdi Zehtabian, Mohammad Amin Mosleh-Shirazi

Abstract:

Cancer is one of the leading causes of death in the world, and radiation therapy is one of the main choices for cancer treatment. Intensity-modulated radiation therapy (IMRT) is a newer radiation therapy technique available for treatments involving vital structures such as the parathyroid glands. It is very important to check the accuracy of the delivered IMRT treatment, because any mistake may lead to further complications for the patient. This paper describes an experiment to determine the accuracy of dose measured by EBT3 film. To test this method, EBT3 film on the head and neck of a Rando phantom was irradiated with an IMRT delivery, and the irradiation was repeated twice. Finally, the planned dose was compared with the dose measured by the EBT3 film, and on this basis the accuracy of the EBT3 film was evaluated: a 95% agreement was reached between the planned treatment and the measured values.

Keywords: EBT3, phantom, accuracy, cancer, IMRT

Procedia PDF Downloads 150
7788 Relationship between Quality Improvement Strategies on the Basis of Different Management Activities

Authors: Manjinder Singh, Anish Sachdeva

Abstract:

Research on total quality management (TQM), total productive maintenance (TPM), the International Organization for Standardization (ISO) and Six Sigma generally investigates the implementation and impact of these programs in isolation. However, none of these quality improvement programs is self-sufficient, and they may not be powerful enough on their own to deliver the improvements and innovations that are required nowadays to ensure the survival and growth of a firm. They are neither mutually exclusive nor inconsistent. On the contrary, they need complementary support and may reinforce each other, making use of their complementarity, the inducement of side effects in favor of other quality improvement programs, mutual stimulation, and the exploitation of shared values. In this paper, first of all, the various management activities that are normally in focus when any quality improvement program is implemented in an organization were identified. Then the TOPSIS methodology was applied to establish a ranking of the various quality improvement programs (total quality management, total productive maintenance, ISO and Six Sigma, which were brought to the corporate boardroom to improve quality) with respect to the different management activities (operations-related, quality-related, maintenance-related, organization-related, human-related and finance-related activities).
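
TOPSIS itself is short enough to sketch in full; the decision matrix below, scoring the four programs against six activity groups, is hypothetical and only illustrates the ranking mechanics.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Plain TOPSIS: rank alternatives by closeness to the ideal solution.

    matrix: alternatives x criteria scores; weights sum to 1;
    benefit[j] is True if higher is better for criterion j.
    """
    m = matrix / np.linalg.norm(matrix, axis=0)       # vector normalization
    v = m * weights                                   # weighted normalized
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                    # closeness in [0, 1]

# Hypothetical 1-9 scores of the four programs (rows: TQM, TPM, ISO,
# Six Sigma) against the six management activity groups (columns).
scores = np.array([[8, 7, 5, 8, 8, 6],
                   [7, 6, 9, 6, 6, 5],
                   [6, 5, 5, 7, 5, 6],
                   [8, 8, 6, 6, 6, 8]], dtype=float)
w = np.full(6, 1 / 6)
closeness = topsis(scores, w, benefit=np.ones(6, dtype=bool))
for name, c in zip(["TQM", "TPM", "ISO", "Six Sigma"], closeness):
    print(f"{name}: {c:.3f}")
```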

Keywords: total productive maintenance (TPM), total quality management (TQM), TOPSIS, international organization for standardization (ISO)

Procedia PDF Downloads 441
7787 Quality and Quality Assurance in Education: Examining the Possible Relationship

Authors: Rodoula Stavroula Gkarnara, Nikolaos Andreadakis

Abstract:

The purpose of this paper is to examine the relationship between quality and quality assurance in education. It constitutes a critical review of the literature regarding quality and its delimitation in the field of education, as well as quality assurance in education and the approaches identified for its extensive study. The two prevailing and opposing views on the correlation of the two concepts are, on the one hand, that there is an inherent distance between the concepts, as they are two separate terms, and on the other hand, that they are interrelated and interdependent concepts that contribute to the improvement of quality in education. Finally, the last part of the paper, adopting the second view, refers to the contribution of quality assurance to quality, where it is pointed out that quality assurance leads to the improvement of quality by serving as the means of feedback on the quality achieved.

Keywords: education, quality, quality assurance, quality improvement

Procedia PDF Downloads 217
7786 The Effect of Improvement Programs in the Mean Time to Repair and in the Mean Time between Failures on Overall Lead Time: A Simulation Using the System Dynamics-Factory Physics Model

Authors: Marcel Heimar Ribeiro Utiyama, Fernanda Caveiro Correia, Dario Henrique Alliprandini

Abstract:

The correct allocation of improvement programs has attracted growing interest in recent years. Because of their limited resources, companies must ensure that their financial resources are directed to the correct workstations in order to be effective and to survive strong competition. However, to our best knowledge, the literature on the allocation of improvement programs does not analyze this problem in depth when the flow shop process has two capacity constrained resources. This is a research gap which is studied in depth in this work. The purpose of this work is to identify the best strategy for allocating improvement programs in a flow shop with two capacity constrained resources. Data were collected from a flow shop process with seven workstations in an industrial control and automation company, which processes 13,690 units per month on average. The data were used to conduct a simulation with the System Dynamics-Factory Physics model. The main variables considered, due to their importance for lead time reduction, were the mean time between failures and the mean time to repair, with lead time reduction as the output measure of the simulations. Ten different strategies were created: (i) focused time to repair improvement, (ii) focused time between failures improvement, (iii) distributed time to repair improvement, (iv) distributed time between failures improvement, (v) focused time to repair and time between failures improvement, (vi) distributed time to repair and time between failures improvement, (vii) hybrid time to repair improvement, (viii) hybrid time between failures improvement, (ix) time to repair improvement directed at the two capacity constrained resources, and (x) time between failures improvement directed at the two capacity constrained resources. The ten strategies tested are variations of the three main strategies for improvement programs, named focused, distributed and hybrid. Several comparisons of the effect of the ten strategies on lead time reduction were performed. The results indicated that, for the flow shop analyzed, the focused strategies delivered the best results. When it is not possible to make a large investment in the capacity constrained resources, companies should use hybrid approaches. An important contribution to academia is the hybrid approach, which proposes a new way to direct improvement efforts. In addition, the study of a flow shop with two strongly capacity constrained resources (more than 95% utilization) is an important contribution to the literature, as are the allocation problem with two CCRs and the possibility of having floating capacity constrained resources. The results provided the best improvement strategies considering the different strategies for allocating improvement programs and the different positions of the capacity constrained resources. Finally, it is possible to state that both the hybrid time to repair improvement and hybrid time between failures improvement strategies delivered better results than the respective distributed strategies. The main limitations of this study relate to the flow shop analyzed; future work can investigate different flow shop configurations, such as a varying number of workstations, a different number of products, or different positions of the two capacity constrained resources.
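
One reason the focused strategies win can be sketched with the standard availability relation A = MTBF / (MTBF + MTTR): improving MTTR at the capacity constrained resources lifts the line's bottleneck rate, while spreading the same effort mostly improves non-bottlenecks. All figures below are hypothetical, and this toy calculation is not the System Dynamics-Factory Physics model itself.

```python
def effective_rate(base_rate, mtbf, mttr):
    """Availability-scaled capacity: A = MTBF / (MTBF + MTTR)."""
    return base_rate * mtbf / (mtbf + mttr)

# Hypothetical 7-station line; stations 3 and 6 are the two CCRs.
base = [14, 13, 10.5, 13, 14, 10.6, 13]     # units/hour
mtbf = [90, 80, 60, 85, 95, 55, 100]        # hours
mttr = [2, 2, 6, 2, 2, 7, 2]                # hours

line = min(effective_rate(b, f, r) for b, f, r in zip(base, mtbf, mttr))
print(f"baseline line rate: {line:.2f} u/h")

# Focused strategy: spend the whole improvement budget halving MTTR
# at the two capacity constrained resources only.
mttr_f = [r / 2 if i in (2, 5) else r for i, r in enumerate(mttr)]
focused = min(effective_rate(b, f, r) for b, f, r in zip(base, mtbf, mttr_f))
print(f"focused on CCRs:    {focused:.2f} u/h")

# Distributed strategy: the same budget spread thinly over all stations.
mttr_d = [r * 0.85 for r in mttr]
spread = min(effective_rate(b, f, r) for b, f, r in zip(base, mtbf, mttr_d))
print(f"distributed:        {spread:.2f} u/h")
```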

Keywords: allocation of improvement programs, capacity constrained resource, hybrid strategy, lead time, mean time to repair, mean time between failures

Procedia PDF Downloads 124
7785 Application of Lean Six Sigma Tools to Minimize Time and Cost in Furniture Packaging

Authors: Suleiman Obeidat, Nabeel Mandahawi

Abstract:

In this work, the packaging process for a move is improved. The customers of a move need their household goods to be moved from their current house to the new one with minimum damage, in an organized manner, on time, and at minimum cost. Our goal was to improve process time efficiency by 10% to 20%, achieve a 90% reduction in damaged parts, and obtain an acceptable improvement in the cost of the total move process; the expected ROI was 833%. Many improvement techniques were applied to the way the boxes are prepared, their preparation cost, the packing of the goods, their labeling, and their moving to a staging place for moving out. The DMAIC technique is used in this work: a SIPOC diagram, a value stream map of the “As Is” process, a root cause analysis, maps of the “Future State” and “Ideal State”, and an improvement plan. A value of ROI = 624% was obtained, which is lower than the expected value of 833%. The work explains the improvement techniques and the deficiencies in the old process.

Keywords: packaging, lean tools, six sigma, DMAIC methodology, SIPOC

Procedia PDF Downloads 428
7784 The Students' Mathematical Competency and Attitude towards Mathematics Using the Trachtenberg Speed Math System

Authors: Marlone D. Severo

Abstract:

A pre- and post-test quasi-experimental design was used to test the intervention of the Trachtenberg Speed Math system on the mathematical competency of sixty (60) matched-paired students with poor performing grades in Mathematics from one of the biggest public national high schools in the south of Metro Manila. Both the control and experimental groups were administered the Attitude Towards Mathematics Inventory (ATMI) before the pretest was given, and both groups showed a high dislike for Mathematics. The pretest showed 53 percent accuracy for the control group and 51 percent for the experimental group on a 15-item long multiplication test taken without the aid of a computing device. The experimental group was taught how to use the Trachtenberg number-keys and multiplication techniques from October 2014 to March 2015. The post-test showed an improvement in the experimental group, which achieved 96 percent accuracy in long multiplication against a dismal 57 percent for the control group. The post-test ATMI was then administered: the control group still showed a great dislike towards Mathematics, while the experimental group showed a positive attitude towards the subject.
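
For a flavor of the method, the sketch below implements one Trachtenberg rule, multiplication by 11 via "add the neighbour"; it is a single illustrative rule, not the full system of number-keys taught in the study.

```python
def trachtenberg_times_11(n: int) -> int:
    """Multiply by 11 with the Trachtenberg 'add the neighbour' rule.

    Working right to left, each result digit is the current digit plus
    its right-hand neighbour (plus any carry); the leading digit of n
    (plus carry) finishes the result.
    """
    digits = [int(d) for d in str(n)][::-1]   # least significant first
    out, carry, prev = [], 0, 0
    for d in digits:
        s = d + prev + carry
        out.append(s % 10)
        carry, prev = s // 10, d
    s = prev + carry                          # leading digit (plus carry)
    while s:
        out.append(s % 10)
        s //= 10
    return int("".join(str(d) for d in reversed(out)))

assert trachtenberg_times_11(63) == 63 * 11 == 693
print(trachtenberg_times_11(745318))  # -> 8198498
```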

Keywords: attitude towards mathematics, mathematical competency, number-keys, trachtenberg speed math

Procedia PDF Downloads 369
7783 An Appraisal of the Utilization of the New International Academy of Cytology Yokohama Standardized Reporting System: A Study of Diagnostic Accuracy and Calculation of the Risk of Malignancy Along with Histopathological Correlation in Fine-Needle Aspiratio

Authors: Deepika Gupta, Namita Bhutani, Mithlesh Bhargav

Abstract:

Objective: Breast cancer is the most commonly encountered malignancy in females after non-melanoma skin malignancies, and the triple assessment is an important approach to the pre-operative evaluation of breast lesions in developing countries. The objectives of the present study were to determine the diagnostic accuracy and to calculate the risk of malignancy (ROM), along with the cyto-histopathological correlation, under the new IAC reporting system. Material and Methods: A total of 940 FNAC slides were retrieved from December 2019 to December 2020 and categorized according to the new IAC system. The diagnostic accuracy and the ROM, along with the cyto-histopathological correlation, were determined. Results: All the breast FNAC lesions were categorized from C1 to C5. Of the 940 cases, 358 had cyto-histopathological correlation. The ROM ranged from 0% to 99%. All the statistical parameters were calculated, along with the diagnostic accuracy, which was 97%. Conclusion: The new IAC standardized reporting system for breast FNAC promotes the utilization of a rapid, accurate, and low-cost diagnostic test and broadens the understanding and application of breast FNAC.
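
The ROM and accuracy arithmetic can be sketched as follows; the per-category follow-up counts are hypothetical (chosen only to sum to 358 correlated cases) and illustrate the calculation, not the study's results.

```python
# ROM per IAC Yokohama category = malignant outcomes / cases with
# histological follow-up in that category.  Counts are hypothetical.
followup = {                     # category: (malignant, total with histology)
    "C1 insufficient": (1, 20),
    "C2 benign":       (3, 180),
    "C3 atypical":     (5, 30),
    "C4 suspicious":   (22, 28),
    "C5 malignant":    (99, 100),
}
for cat, (malignant, total) in followup.items():
    print(f"{cat}: ROM = {100 * malignant / total:.0f}%")

# Diagnostic accuracy, treating C4/C5 as test-positive.
pos = ["C4 suspicious", "C5 malignant"]
TP = sum(followup[c][0] for c in pos)
FP = sum(followup[c][1] - followup[c][0] for c in pos)
FN = sum(m for c, (m, t) in followup.items() if c not in pos)
TN = sum(t - m for c, (m, t) in followup.items() if c not in pos)
print("diagnostic accuracy:", (TP + TN) / (TP + FP + FN + TN))
```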

Keywords: accuracy, fine needle aspiration cytology, IAC, risk of malignancy, Yokohama system

Procedia PDF Downloads 11