Search results for: accuracy improvement
6797 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presented a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), Discrete Wavelet Transform (DWT), and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled with a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of accounting for the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI before they were employed in the multi-station model. In the proposed feature selection method, uninformative sub-series were omitted to achieve better performance. Results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, ensemble empirical mode decomposition, multi-station modeling
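As a rough illustration of the pipeline described above, the sketch below decomposes a gauge series with EEMD and DWT, screens the sub-series with MI, and regresses with a least-squares kernel model. It assumes the PyEMD (EMD-signal) and PyWavelets packages and substitutes scikit-learn's KernelRidge for a dedicated LSSVM implementation; the synthetic gauge data, the MI threshold, and the one-step lag are invented stand-ins, not the paper's setup.

```python
import numpy as np
import pywt                                    # PyWavelets, for the DWT
from PyEMD import EEMD                         # EMD-signal package, for EEMD
from sklearn.feature_selection import mutual_info_regression
from sklearn.kernel_ridge import KernelRidge   # least-squares kernel stand-in for LSSVM

def decompose(signal, wavelet="db4", level=3):
    """Split a gauge series into EEMD IMFs plus DWT detail/approximation bands."""
    imfs = EEMD().eemd(signal)                          # (n_imfs, n_samples)
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    bands = [np.resize(c, signal.size) for c in coeffs]  # crude length alignment
    return np.vstack([imfs] + bands)                    # candidate sub-series as rows

# Hypothetical upstream-gauge stage and downstream discharge target.
rng = np.random.default_rng(0)
stage = np.cumsum(rng.normal(size=500))
discharge = np.roll(stage, 2) + rng.normal(scale=0.1, size=500)

subseries = decompose(stage)
# MI-based feature selection: drop sub-series carrying little information.
mi = mutual_info_regression(subseries.T, discharge)
keep = subseries[mi > 0.1 * mi.max()]                   # threshold is an assumption

model = KernelRidge(kernel="rbf", alpha=1e-2).fit(keep.T[:-1], discharge[1:])
print(model.score(keep.T[:-1], discharge[1:]))          # in-sample fit only
```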
Procedia PDF Downloads 175
6796 Development of Fuzzy Logic and Neuro-Fuzzy Surface Roughness Prediction Systems Coupled with Cutting Current in Milling Operation
Authors: Joseph C. Chen, Venkata Mohan Kudapa
Abstract:
Two real-time surface roughness (Ra) prediction systems for milling operations were developed. The systems used not only cutting parameters, such as feed rate and spindle speed, but also the cutting current generated during machining and collected by a clamp-type energy sensor. Two different approaches were developed. First, a fuzzy inference system (FIS), in which the fuzzy logic rules are generated by experts in milling processes, was used to conduct prediction modeling using current cutting data. Second, an adaptive neuro-fuzzy inference system (ANFIS) was explored. Neuro-fuzzy systems are adaptive techniques in which data are collected by the network and processed, and rules are generated by the system. The inference system then uses these rules to predict Ra as the output. Experimental results showed that spindle speed, feed rate, depth of cut, and input current variation could predict Ra. These two systems enable the prediction of Ra during the milling operation with average accuracies of 91.83% and 94.48% for the FIS and ANFIS systems, respectively. Statistically, the ANFIS system provided better prediction accuracy than the FIS system.
Keywords: surface roughness, input current, fuzzy logic, neuro-fuzzy, milling operations
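A minimal sketch of the FIS idea is given below: expert rules over fuzzified feed rate and cutting current are combined by weighted-average (Sugeno-style) defuzzification to predict Ra. The membership ranges and rule consequents are invented placeholders, not the experts' actual rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def predict_ra(feed, current):
    """Weighted-average (Sugeno-style) defuzzification over expert rules."""
    # Membership grades for each linguistic term (ranges are assumptions).
    feed_low, feed_high = tri(feed, 0, 2, 6), tri(feed, 2, 6, 10)
    cur_low, cur_high = tri(current, 0, 5, 12), tri(current, 5, 12, 20)
    # Expert rules: (firing strength, consequent Ra in micrometres).
    rules = [
        (min(feed_low, cur_low),   0.4),   # gentle cut -> smooth surface
        (min(feed_low, cur_high),  0.9),
        (min(feed_high, cur_low),  1.2),
        (min(feed_high, cur_high), 2.0),   # aggressive cut -> rough surface
    ]
    w = sum(s for s, _ in rules)
    return sum(s * ra for s, ra in rules) / w if w else float("nan")

print(f"Ra ~ {predict_ra(feed=4.0, current=9.0):.2f} micrometres")
```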
Procedia PDF Downloads 146
6795 Roasting Degree of Cocoa Beans by Artificial Neural Network (ANN) Based Electronic Nose System and Gas Chromatography (GC)
Authors: Juzhong Tan, William Kerr
Abstract:
Roasting is one critical procedure in chocolate processing, where special flavors are developed, moisture content is reduced, and better processing properties are obtained. Therefore, determination of the roasting degree of cocoa beans is important for chocolate manufacturers to ensure the quality of chocolate products, and it also determines the commercial value of cocoa beans collected from cocoa farmers. Assessment of roasting degree currently relies on human specialists, who are sometimes biased, and on chemical analysis, which takes a long time and is inaccessible to many manufacturers and farmers. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was used to detect the gas generated by cocoa beans with different roasting degrees (0 min, 20 min, 30 min, and 40 min), and the signals collected by the gas sensors were used to train a three-layer ANN. Chemical analysis of the graded beans was performed with a traditional GC-MS system, and the contents of volatile chemical compounds were used to train another ANN as a reference to the ANN trained on electronic nose signals. Both trained ANNs were used to predict the roasting degree of cocoa beans for validation. The best grading accuracy achieved by the ANN trained on electronic nose signals (from sensors TGS 813, 826, 820, 880, 830, 2620, 2602, and 2610) was 96.7%, whereas the GC-trained ANN reached 83.8%.
Keywords: artificial neural network, cocoa bean, electronic nose, roasting
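The classifier stage might look like the hedged sketch below, which trains a small feedforward ANN (one hidden layer, i.e., three layers counting input and output) on sensor feature vectors; the synthetic readings stand in for the real TGS signals.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical stand-in data: 8 gas-sensor readings per sample,
# labels are roasting times in minutes (0, 20, 30, 40).
X = rng.normal(size=(200, 8))
y = rng.choice([0, 20, 30, 40], size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

# A three-layer feedforward ANN, as described in the abstract.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1),
)
clf.fit(X_tr, y_tr)
print(f"grading accuracy: {clf.score(X_te, y_te):.1%}")  # 96.7% reported with real signals
```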
Procedia PDF Downloads 235
6794 Fusion Models for Cyber Threat Defense: Integrating Clustering, Random Forests, and Support Vector Machines to Against Windows Malware
Authors: Azita Ramezani, Atousa Ramezani
Abstract:
In the ever-escalating landscape of Windows malware, the necessity for pioneering defense strategies becomes undeniable. This study introduces an approach fusing the capabilities of clustering, random forests, and support vector machines (SVM) to combat the intricate web of cyber threats. The fusion model achieves an accuracy of 98.67% and an equally strong F1 score of 98.68%, a testament to its effectiveness in Windows malware defense. By deciphering the intricate patterns within malicious code, the model not only raises the bar for detection precision but also redefines the paradigm of cybersecurity preparedness. This result underscores the potential embedded in the fusion of diverse analytical methodologies and signals a shift in fortifying against the relentless evolution of Windows malware. As we traverse the dynamic cybersecurity terrain, this research serves as a beacon, illuminating the path toward a resilient future where innovative fusion models stand at the forefront of cyber threat defense.
Keywords: fusion models, cyber threat defense, windows malware, clustering, random forests, support vector machines (SVM), accuracy, f1-score, cybersecurity, malicious code detection
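A hedged sketch of one plausible reading of the fusion idea follows: cluster membership is appended as an extra feature, and a random forest and an SVM vote on the final label. The feature set, cluster count, and voting scheme are assumptions; the abstract does not specify how the three learners are combined.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(7)
# Hypothetical stand-in features extracted from Windows PE files (e.g. API-call
# counts, section entropies); 1 = malware, 0 = benign.
X = rng.normal(size=(1000, 20))
y = (X[:, :3].sum(axis=1) > 0).astype(int)

# Clustering stage: append the cluster id as an extra behavioural feature.
km = KMeans(n_clusters=5, n_init=10, random_state=7).fit(X)
Xf = np.column_stack([X, km.labels_])

X_tr, X_te, y_tr, y_te = train_test_split(Xf, y, test_size=0.3, random_state=7)

fusion = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=7)),
                ("svm", SVC(probability=True, random_state=7))],
    voting="soft",
)
fusion.fit(X_tr, y_tr)
pred = fusion.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.2%}  f1={f1_score(y_te, pred):.2%}")
```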
Procedia PDF Downloads 72
6793 Study of Natural Patterns on Digital Image Correlation Using Simulation Method
Authors: Gang Li, Ghulam Mubashar Hassan, Arcady Dyskin, Cara MacNish
Abstract:
Digital image correlation (DIC) is a contactless full-field displacement and strain reconstruction technique commonly used in the field of experimental mechanics. Compared with physical measuring devices, such as strain gauges, which only provide very restricted coverage and are expensive to deploy widely, the DIC technique provides results with full-field coverage and relatively high accuracy using an inexpensive and simple experimental setup. It is important to study the effect of natural patterns on the DIC technique because preparing artificial speckle patterns is a time-consuming and laborious process. The objective of this research is to study the effect of using images having natural patterns on the performance of DIC. A systematic simulation method is used to build the simulated deformed images used in DIC. A parameter (subset size) used in DIC can affect the processing and accuracy of DIC and can even cause DIC to fail. Regarding the image parameters (correlation coefficient), high similarity between two subsets can cause the DIC process to fail and make the result less accurate. Images of good and bad quality for DIC methods are presented, and, more importantly, a systematic way is provided to evaluate the quality of images with natural patterns before the measurement devices are installed.
Keywords: Digital Image Correlation (DIC), deformation simulation, natural pattern, subset size
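The core matching step of DIC can be illustrated with the short zero-normalized cross-correlation (ZNCC) sketch below, which finds the integer-pixel displacement of one subset; real DIC adds sub-pixel interpolation and shape functions, which are omitted here.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized subsets."""
    a, b = a - a.mean(), b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def match_subset(ref, deformed, top_left, size, search=5):
    """Find the integer-pixel displacement of one subset by maximizing ZNCC."""
    r, c = top_left
    subset = ref[r:r + size, c:c + size]
    best, best_score = (0, 0), -1.0
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = deformed[r + dr:r + dr + size, c + dc:c + dc + size]
            if cand.shape != subset.shape:
                continue
            score = zncc(subset, cand)
            if score > best_score:   # a flat score surface signals an ambiguous pattern
                best, best_score = (dr, dc), score
    return best, best_score

# Hypothetical natural-pattern image, shifted by (2, 1) to fake a deformation.
rng = np.random.default_rng(3)
ref = rng.random((64, 64))
deformed = np.roll(ref, shift=(2, 1), axis=(0, 1))
print(match_subset(ref, deformed, top_left=(20, 20), size=15))  # -> ((2, 1), ~1.0)
```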
Procedia PDF Downloads 421
6792 Medical Image Watermark and Tamper Detection Using Constant Correlation Spread Spectrum Watermarking
Authors: Peter U. Eze, P. Udaya, Robin J. Evans
Abstract:
Data hiding can be achieved by steganography or invisible digital watermarking. For digital watermarking, both accurate retrieval of the embedded watermark and the integrity of the cover image are important. Medical image security in teleradiology is one of the applications where the embedded patient record needs to be extracted with accuracy and the integrity of the medical image verified. In this research paper, Constant Correlation Spread Spectrum digital watermarking for medical image tamper detection and accurate embedded watermark retrieval is introduced. In the proposed method, a watermark bit from a patient record is spread in a medical image sub-block such that the correlation of all watermarked sub-blocks with a spreading code, W, has a constant value, p. The constant correlation p, the spreading code W, and the size of the sub-blocks constitute the secret key. Tamper detection is achieved by flagging any sub-block whose correlation value deviates from p by more than a small value, ε. The major features of the new scheme include: (1) improving watermark detection accuracy for high-pixel-depth medical images by reducing the Bit Error Rate (BER) to zero and (2) block-level tamper detection in a single computational process with simultaneous watermark detection, thereby increasing utility at the same computational cost.
Keywords: constant correlation, medical image, spread spectrum, tamper detection, watermarking
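A simplified sketch of the embedding and detection logic is shown below. It carries the bit in the sign of the correlation (+p / -p), which is a convenient variant rather than the paper's exact construction, and the values of p, ε, and the block size are invented stand-ins for the secret key.

```python
import numpy as np

RNG = np.random.default_rng(42)
P, EPS = 8.0, 0.5                          # constant correlation and tamper tolerance (assumed)
W = RNG.choice([-1.0, 1.0], size=(8, 8))   # secret spreading code

def embed_bit(block, bit):
    """Shift a block so its correlation with W equals +p (bit 1) or -p (bit 0)."""
    block = block.astype(float)
    target = P if bit else -P
    current = (block * W).mean()
    # Adding t*W changes the mean correlation by t, because mean(W*W) == 1.
    return block + (target - current) * W

def detect_bit(block):
    """Recover the bit and flag tampering when |corr| drifts from p."""
    corr = (block * W).mean()
    return int(corr > 0), abs(abs(corr) - P) > EPS

block = RNG.integers(0, 4096, size=(8, 8))   # one sub-block of a 12-bit medical image
marked = embed_bit(block, bit=1)
print(detect_bit(marked))                    # (1, False): bit recovered, no tamper
marked[0, 0] += 40                           # simulate local tampering
print(detect_bit(marked))                    # tamper flag raised for this sub-block
```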
Procedia PDF Downloads 197
6791 Comparing the SALT and START Triage System in Disaster and Mass Casualty Incidents: A Systematic Review
Authors: Hendri Purwadi, Christine McCloud
Abstract:
Triage is a complex decision-making process that aims to categorize a victim's level of acuity and need for medical assistance. Two triage systems widely used in Mass Casualty Incidents (MCIs) and disaster situations are START (simple triage and rapid treatment) and SALT (sort, assess, lifesaving interventions, treatment/transport). There is currently controversy regarding the effectiveness of SALT over the START triage system. This systematic review aims to investigate and compare the effectiveness of the SALT and START triage systems in disaster and MCI settings. The literature from 2009 to 2019 was searched via a systematic search strategy in PubMed, Cochrane Library, CINAHL, Scopus, ScienceDirect, Medlib, and ProQuest. This review included simulation-based and medical record-based studies investigating the accuracy and applicability of the SALT and START triage systems in adult and child populations during MCIs and disasters. All types of studies were included. Joanna Briggs Institute critical appraisal tools were used to assess the quality of the reviewed studies. Of the 1450 articles identified in the search, 10 were included. Four themes were identified by the review: accuracy, under-triage, over-triage, and time to triage per individual victim. The START triage system has a wide-ranging and inconsistent level of accuracy compared to the SALT triage system (44% to 94.2% for START compared to 70% to 83% for SALT). The under-triage error of the START triage system ranged from 2.73% to 20%, slightly lower than that of the SALT triage system (7.6% to 23.3%). The over-triage error of the START triage system was slightly greater than that of the SALT triage system (2% to 53% for START compared to 2% to 22% for SALT). START was also faster to apply than SALT (70-72.18 seconds compared to 78 seconds). Consequently, the START triage system has a lower level of under-triage error and is faster than the SALT triage system in classifying victims of MCIs and disasters, whereas the SALT triage system is slightly more accurate with a lower level of over-triage. However, the magnitude of these differences is relatively small, and therefore the effect on patient outcomes is not significant. Hence, regardless of triage error, either the START or the SALT triage system is equally effective for triaging victims of disasters and MCIs.
Keywords: disaster, effectiveness, mass casualty incidents, START triage system, SALT triage system
Procedia PDF Downloads 133
6790 Cerebral Pulsatility Mediates the Link Between Physical Activity and Executive Functions in Older Adults with Cardiovascular Risk Factors: A Longitudinal NIRS Study
Authors: Hanieh Mohammadi, Sarah Fraser, Anil Nigam, Frederic Lesage, Louis Bherer
Abstract:
A chronically higher cerebral pulsatility is thought to damage cerebral microcirculation, leading to cognitive decline in older adults. Although it is widely known that regular physical activity is linked to improvement in some cognitive domains, including executive functions, the mediating role of cerebral pulsatility in this link remains to be elucidated. This study assessed the impact of 6 months of regular physical activity on changes in an optical index of cerebral pulsatility and the role of physical activity in the improvement of executive functions. 27 older adults (aged 57-79, 66.7% women) with cardiovascular risk factors (CVRF) were enrolled in the study. The participants completed the behavioral Stroop test, extracted from the Delis-Kaplan Executive Function System battery, at baseline (T0) and after 6 months (T6) of physical activity. Near-infrared spectroscopy (NIRS) was applied as an innovative approach to indexing cerebral pulsatility in the brain microcirculation at T0 and T6. The participants were at standing rest while a NIRS device recorded hemodynamic data from frontal and motor cortex subregions at T0 and T6. The cerebral pulsatility index of interest was cerebral pulse amplitude, extracted from the pulsatile component of the NIRS data. Our data indicated that 6 months of physical activity was associated with a reduction in response time for the executive functions, including the inhibition (T0: 56.33 ± 18.2 to T6: 53.33 ± 15.7, p = 0.038) and switching (T0: 63.05 ± 5.68 to T6: 57.96 ± 7.19, p < 0.001) conditions of the Stroop test. Also, physical activity was associated with a reduction in cerebral pulse amplitude (T0: 0.62 ± 0.05 to T6: 0.55 ± 0.08, p < 0.001). Notably, cerebral pulse amplitude was a significant mediator of the link between physical activity and response to the Stroop test for both the inhibition (β = 0.33 (0.61, 0.23), p < 0.05) and switching (β = 0.42 (0.69, 0.11), p < 0.01) conditions. This study suggests that regular physical activity may support cognitive functions through the improvement of cerebral pulsatility in older adults with CVRF.
Keywords: near-infrared spectroscopy, cerebral pulsatility, physical activity, cardiovascular risk factors, executive functions
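For readers unfamiliar with mediation analysis, the sketch below shows the classic product-of-coefficients decomposition on synthetic data with statsmodels; the study itself likely used a bootstrap-based mediation procedure, so this conveys only the general idea.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 27
activity = rng.normal(size=n)                                  # physical-activity change
pulse_amp = -0.5 * activity + rng.normal(scale=0.5, size=n)    # cerebral pulse amplitude
stroop_rt = 0.6 * pulse_amp + rng.normal(scale=0.5, size=n)    # Stroop response time

def ols(y, *xs):
    return sm.OLS(y, sm.add_constant(np.column_stack(xs))).fit()

a = ols(pulse_amp, activity).params[1]             # path a: activity -> mediator
b = ols(stroop_rt, pulse_amp, activity).params[1]  # path b: mediator -> outcome | activity
c = ols(stroop_rt, activity).params[1]             # total effect
print(f"indirect (mediated) effect a*b = {a * b:.3f}, total effect c = {c:.3f}")
```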
Procedia PDF Downloads 195
6789 The Accuracy of an In-House Developed Computer-Assisted Surgery Protocol for Mandibular Micro-Vascular Reconstruction
Authors: Christophe Spaas, Lies Pottel, Joke De Ceulaer, Johan Abeloos, Philippe Lamoral, Tom De Backer, Calix De Clercq
Abstract:
We aimed to evaluate the accuracy of an in-house developed low-cost computer-assisted surgery (CAS) protocol for osseous free flap mandibular reconstruction. All patients who underwent primary or secondary mandibular reconstruction with a free (solely or composite) osseous flap, either a fibula free flap or iliac crest free flap, between January 2014 and December 2017 were evaluated. The low-cost protocol consisted of a virtual surgical planning, a pre-bent custom reconstruction plate, and an individualized free flap positioning guide. The accuracy of the protocol was evaluated through comparison of the postoperative outcome with the 3D virtual planning, based on measurement of the following parameters: intercondylar distance, mandibular angle (axial and sagittal), inner angular distance, anterior-posterior distance, length of the fibular/iliac crest segments, and osteotomy angles. A statistical analysis of the obtained values was done. Virtual 3D surgical planning and cutting guide design were performed with Proplan CMF® software (Materialise, Leuven, Belgium) and IPS Gate (KLS Martin, Tuttlingen, Germany). Segmentation of the DICOM data as well as outcome analysis were done with BrainLab iPlan® Software (Brainlab AG, Feldkirchen, Germany). A cost analysis of the protocol was done. Twenty-two patients (11 fibula / 11 iliac crest) were included and analyzed. Based on voxel-based registration on the cranial base, 3D virtual planning landmark parameters did not significantly differ from those measured on the actual treatment outcome (p-values > 0.05). A cost evaluation of the in-house developed CAS protocol revealed a 1750 euro cost reduction in comparison with a standard CAS protocol with a patient-specific reconstruction plate. Our results indicate that an accurate transfer of the planning with our in-house developed low-cost CAS protocol is feasible at a significantly lower cost.
Keywords: CAD/CAM, computer-assisted surgery, low-cost, mandibular reconstruction
Procedia PDF Downloads 143
6788 Solving Process Planning, Weighted Apparent Tardiness Cost Dispatching, and Weighted Processing plus Weight Due-Date Assignment Simultaneously Using a Hybrid Search
Authors: Halil Ibrahim Demir, Caner Erden, Abdullah Hulusi Kokcam, Mumtaz Ipek
Abstract:
Process planning, scheduling, and due date assignment are three important manufacturing functions that are studied independently in the literature. There are hundreds of works on integrated process planning and scheduling (IPPS) and scheduling with due date assignment (SWDDA) problems, but only a few on the integrated process planning, scheduling, and due date assignment (IPPSDDA) problem. Integrating these three functions is crucial due to the strong interaction between them. Since the scheduling problem alone is in the NP-hard problem class, the integrated problem is even harder to solve. This study focuses on the integration of these functions. The sum of weighted tardiness, earliness, and due-date-related costs is used as the penalty function. Random search and hybrid metaheuristics are used to solve the integrated problem. The marginal improvement of random search is very high in the early iterations and shrinks enormously in later iterations; at that point, directed search contributes more marginal improvement than random search. In this study, random and genetic search methods are combined to find better solutions. Results show that overall performance improves as the integration level increases.
Keywords: process planning, genetic algorithm, hybrid search, random search, weighted due-date assignment, weighted scheduling
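The two-phase idea, undirected exploration followed by directed refinement, can be caricatured as below. The cost function is an invented stand-in for the weighted tardiness/earliness penalty, and the "genetic" phase is reduced to improvement-keeping swap mutations without a population or crossover.

```python
import random

random.seed(0)
N = 20                                   # jobs to sequence (a stand-in decision vector)

def cost(seq):
    """Hypothetical penalty: a weighted tardiness/earliness stand-in."""
    return sum(abs(job - pos) * (1 + job % 3) for pos, job in enumerate(seq))

def random_search(best, iters):
    """Undirected phase: sample fresh permutations, keep the best seen."""
    for _ in range(iters):
        cand = random.sample(range(N), N)
        if cost(cand) < cost(best):
            best = cand
    return best

def directed_refine(best, iters):
    """Directed phase: keep pairwise-swap mutations that improve the penalty."""
    for _ in range(iters):
        cand = best[:]
        i, j = random.sample(range(N), 2)
        cand[i], cand[j] = cand[j], cand[i]
        if cost(cand) < cost(best):
            best = cand
    return best

best = list(range(N))
best = random_search(best, 500)       # big early gains from pure random search
best = directed_refine(best, 500)     # later gains come from directed moves
print(cost(best))
```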
Procedia PDF Downloads 364
6787 Hot Air Flow Annealing of MAPbI₃ Perovskite: Structural and Optical Properties
Authors: Mouad Ouafi, Lahoucine Atourki, Larbi Laanab, Erika Vega, Miguel Mollar, Bernabe Marib, Boujemaa Jaber
Abstract:
Despite the astonishing emergence of methylammonium lead triiodide perovskite as a promising light harvester for solar cells, the physical properties of solution-processed MAPbI₃ films are still critical and need to be improved. The objective of this work is to investigate the effect of hot airflow during the spin-coating growth of MAPbI₃ films on their structural, optical, and morphological properties. The experimental results show that many physical properties of the perovskite strongly depend on the airflow temperature, and that its optimization has a beneficial effect on the perovskite quality. In fact, a clear improvement in the crystallinity and the crystallite size of the MAPbI₃ perovskite is demonstrated by the XRD analyses when the airflow temperature is increased up to 100°C. Alternatively, as far as the surface morphology is concerned, SEM micrographs show that significant homogeneous nucleation, uniform surface distribution, and pinhole-free films with the highest surface coverage of 98% are achieved when the airflow temperature reaches 100°C. At this temperature, the improvement is also observed in the optical properties of the films. By contrast, a remarkable degradation of the MAPbI₃ perovskite associated with PbI₂ phase formation is noticed when the hot airflow temperature is higher than 100°C, especially at 300°C.
Keywords: hot air flow, crystallinity, surface coverage, perovskite morphology
Procedia PDF Downloads 164
6786 Autonomous Vehicle Detection and Classification in High Resolution Satellite Imagery
Authors: Ali J. Ghandour, Houssam A. Krayem, Abedelkarim A. Jezzini
Abstract:
High-resolution satellite images and remote sensing can provide global information quickly compared to traditional methods of data collection. At such high resolution, a road is no longer a thin line; objects such as cars and trees are easily identifiable. Automatic vehicle enumeration can be considered one of the most important applications in traffic management. In this paper, an autonomous vehicle detection and classification approach for highway environments is proposed. This approach consists mainly of three stages: (i) first, a set of preprocessing operations is applied, including soil, vegetation, and water suppression. (ii) Then, road network detection and delineation is implemented using a built-up area index, followed by several morphological operations. This step plays an important role in increasing the overall detection accuracy since vehicle candidates are objects contained within the road network only. (iii) Multi-level Otsu segmentation is implemented in the last stage, resulting in vehicle detection and classification, where detected vehicles are classified into cars and trucks. Accuracy assessment analysis is conducted over different study areas to show the high efficiency of the proposed method, especially in highway environments.
Keywords: remote sensing, object identification, vehicle and road extraction, vehicle and road features-based classification
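The final segmentation stage can be sketched with scikit-image's multi-level Otsu, which picks two intensity thresholds to separate road, car-like, and truck-like pixels; the synthetic patch and the brightness-based car/truck distinction are assumptions for illustration.

```python
import numpy as np
from skimage.filters import threshold_multiotsu

# Hypothetical road-masked panchromatic patch: dark asphalt background with
# brighter blobs standing in for cars and the brightest blobs for trucks.
rng = np.random.default_rng(2)
patch = rng.normal(60, 5, size=(100, 100))               # road surface
patch[20:24, 10:16] = rng.normal(140, 5, size=(4, 6))    # "car"
patch[60:68, 40:52] = rng.normal(220, 5, size=(8, 12))   # "truck"

# Multi-level Otsu: two thresholds split road / car-like / truck-like pixels.
t_low, t_high = threshold_multiotsu(patch, classes=3)
labels = np.digitize(patch, bins=[t_low, t_high])
print(f"thresholds=({t_low:.0f}, {t_high:.0f}), "
      f"car px={int((labels == 1).sum())}, truck px={int((labels == 2).sum())}")
```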
Procedia PDF Downloads 233
6785 Determination of Potential Agricultural Lands Using Landsat 8 OLI Images and GIS: Case Study of Gokceada (Imroz) Turkey
Authors: Rahmi Kafadar, Levent Genc
Abstract:
The present study aimed to determine potential agricultural lands (PALs) on Gokceada (Imroz) Island of Canakkale province, Turkey. Seven-band Landsat 8 OLI images acquired on July 12 and August 13, 2013, and their 14-band combination image were used to identify the current Land Use Land Cover (LULC) status. Principal Component Analysis (PCA) was applied to the three Landsat datasets in order to reduce the correlation between the bands. A total of six original and PCA images were classified using the supervised classification method to obtain LULC maps including 6 main classes ("Forest", "Agriculture", "Water Surface", "Residential Area-Bare Soil", "Reforestation", and "Other"). Accuracy assessment was performed by checking the accuracy of 120 randomized points for each LULC map. The best overall accuracy and Kappa statistic values (90.83% and 0.8791, respectively) were found for the PCA image generated from the 14-band combined image, called 3-B/JA. A Digital Elevation Model (DEM) with 15 m spatial resolution (ASTER) was used to account for topographical characteristics. Soil properties were obtained by digitizing the 1:25000-scale soil maps of the Rural Services Directorate General. Potential Agricultural Lands (PALs) were determined using Geographic Information Systems (GIS). The procedure was applied considering that the "Other" class of the LULC map may be used for agricultural purposes in the future. Overlay analysis was conducted using Slope (S), Land Use Capability Class (LUCC), Other Soil Properties (OSP), and Land Use Capability Sub-Class (SUBC) properties. A total of 901.62 ha within the "Other" class (15798.2 ha) of the LULC map was determined as PALs. These lands were ranked as "Very Suitable", "Suitable", "Moderate Suitable", and "Low Suitable". It was determined that 8.03 ha were classified as "Very Suitable", 18.59 ha as "Suitable", and 11.44 ha as "Moderate Suitable" for PALs. In addition, 756.56 ha were found to be "Low Suitable". The results obtained from this preliminary study can serve as a basis for further studies.
Keywords: digital elevation model (DEM), geographic information systems (GIS), Gokceada (Imroz), Landsat 8 OLI-TIRS, land use land cover (LULC)
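The PCA-then-classify step with a kappa-based accuracy assessment might look like the sketch below; the pixel matrix, class labels, and random forest classifier are stand-ins, since the exact supervised classifier is not named in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Hypothetical stand-in for the 14-band stacked image: rows are pixels.
pixels = rng.normal(size=(5000, 14))
lulc = rng.integers(0, 6, size=5000)    # 6 classes: forest, agriculture, water, ...

# PCA decorrelates the stacked bands before supervised classification.
pcs = PCA(n_components=5).fit_transform(pixels)
X_tr, X_te, y_tr, y_te = train_test_split(pcs, lulc, test_size=120, random_state=4)

clf = RandomForestClassifier(random_state=4).fit(X_tr, y_tr)
pred = clf.predict(X_te)   # 120 random check points, as in the accuracy assessment
print(f"overall accuracy={accuracy_score(y_te, pred):.2%}, "
      f"kappa={cohen_kappa_score(y_te, pred):.4f}")
```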
Procedia PDF Downloads 355
6784 Neural Machine Translation for Low-Resource African Languages: Benchmarking State-of-the-Art Transformer for Wolof
Authors: Cheikh Bamba Dione, Alla Lo, Elhadji Mamadou Nguer, Siley O. Ba
Abstract:
In this paper, we propose two neural machine translation (NMT) systems (French-to-Wolof and Wolof-to-French) based on sequence-to-sequence with attention and transformer architectures. We trained our models on a parallel French-Wolof corpus of about 83k sentence pairs. Because of the low-resource setting, we experimented with advanced methods for handling data sparsity, including subword segmentation, back translation, and the copied corpus method. We evaluate the models using the BLEU score and find that transformer outperforms the classic seq2seq model in all settings, in addition to being less sensitive to noise. In general, the best scores are achieved when training the models on word-level-based units. For subword-level models, using back translation proves to be slightly beneficial in low-resource (WO) to high-resource (FR) language translation for the transformer (but not for the seq2seq) models. A slight improvement can also be observed when injecting copied monolingual text in the target language. Moreover, combining the copied method data with back translation leads to a substantial improvement of the translation quality.
Keywords: backtranslation, low-resource language, neural machine translation, sequence-to-sequence, transformer, Wolof
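BLEU scoring of such systems is commonly done with sacreBLEU; a minimal sketch follows, with invented English stand-in sentences in place of the French/Wolof output.

```python
# Minimal BLEU-evaluation sketch with sacreBLEU (pip install sacrebleu).
# The hypothesis/reference strings are invented stand-ins, not corpus data.
from sacrebleu.metrics import BLEU

hyps = ["the market opens early in the morning",
        "she is going to the village tomorrow"]
# One reference stream: refs[0][i] is the reference for hyps[i].
refs = [["the market opens early in the morning",
         "she will go to the village tomorrow"]]

print(BLEU().corpus_score(hyps, refs))
```

In a back-translation setup, one would additionally translate monolingual target-side text with a reverse-direction model and append the resulting synthetic pairs to the training corpus before retraining.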
Procedia PDF Downloads 147
6783 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality
Authors: Peregrine James Dalziel, Philip Vu Tran
Abstract:
Objective / Introduction: Demand for radiology services from Emergency Departments (ED) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queuing theory indicates that wide variability of process time combined with the random nature of request arrival increases the probability of significant queues. This can lead to delays in the time-to-availability of radiology reports (TTA-RR) and potentially impaired ED patient flow. In addition, the greater "cognitive workload" of higher volume may lead to reduced productivity and increased errors. We sought to quantify the potential ED flow improvements obtainable from increased radiology providers serving 3 public hospitals in Melbourne, Australia. We sought to assess the potential productivity gains, quality improvement, and cost-effectiveness of increased labor inputs. Methods & Materials: The Western Health Medical Imaging Department moved from single-resident coverage on weekend days (8:30 am-10:30 pm) to a limited period of two-resident coverage (1 pm-6 pm) on both weekend days. The TTA-RR for weekend CT scans was calculated from the PACS database for the 8-month period symmetrically around the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two 4-month periods. Daily and hourly scan volumes at the time of each CT scan were calculated to assess the impact of varying department workload. To assess any improvement in report quality/errors, a random sample of 200 studies was assessed to compare the average number of clinically significant over-read addendums to reports between the 2 periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy. Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R² = 0.29). Increasing daily volume and hourly volume were associated with increased TTA-RR (1.5 min (p < 0.01) and 4.8 min (p < 0.01), respectively, per additional scan ordered within each time frame). Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p < 0.01), with an additional 23.6-minute improvement when 2 residents were on-site concomitantly (p < 0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents the increased decision-making time available to ED physicians and a potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs 7% in the single-resident period, but this was not statistically significant (p = 0.7). The marginal cost was less than the anticipated economic benefit, assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. Increased labor utilisation is cost-effective compared with the potential improved productivity for ED cases requiring CT imaging.
Keywords: workflow, quality, administration, CT, staffing
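The regression structure described in the methods could be sketched with statsmodels as below; the covariates, coefficients, and synthetic data are stand-ins chosen only to mirror the reported effect sizes.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 800
df = pd.DataFrame({
    # Invented stand-in covariates for each weekend CT report.
    "double_cover": rng.integers(0, 2, n),     # 1 after the staffing change
    "daily_volume": rng.poisson(60, n),
    "hourly_volume": rng.poisson(6, n),
    "resident": rng.choice(list("ABCD"), n),   # primary resident on call
})
df["tta_rr"] = (90 - 25.9 * df.double_cover + 1.5 * df.daily_volume
                + 4.8 * df.hourly_volume + rng.normal(0, 20, n))

# Categorical resident and scan-volume terms mirror the reported model structure.
model = smf.ols("tta_rr ~ double_cover + daily_volume + hourly_volume + C(resident)",
                data=df).fit()
print(model.params["double_cover"], model.rsquared)
```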
Procedia PDF Downloads 114
6782 An Automatic Speech Recognition of Conversational Telephone Speech in Malay Language
Authors: M. Draman, S. Z. Muhamad Yassin, M. S. Alias, Z. Lambak, M. I. Zulkifli, S. N. Padhi, K. N. Baharim, F. Maskuriy, A. I. A. Rahim
Abstract:
The performance of a Malay automatic speech recognition (ASR) system for the call centre environment is presented. The system utilizes the Kaldi toolkit as the platform for the entire library and algorithms used in performing the ASR task. The acoustic model implemented in this system uses a deep neural network (DNN) method to model the acoustic signal, and the standard (n-gram) model is used for language modelling. With 80 hours of training data from the call centre recordings, the ASR system can achieve 72% accuracy, which corresponds to a 28% word error rate (WER). The testing was done using 20 hours of audio data. Despite the implementation of DNN, the system shows low accuracy owing to the varieties of noise, accent, and dialect that typically occur in the Malaysian call centre environment. This significant variation among speakers is reflected by the large standard deviation of the average word error rate (WERav) (i.e., ~10%). It is observed that the lowest WER (13.8%) was obtained from a recording sample of a native speaker with a standard Malay dialect (central Malaysia), as compared to a WER of 49% for the sample with the highest WER, which contains conversation of a speaker using a non-standard Malay dialect.
Keywords: conversational speech recognition, deep neural network, Malay language, speech recognition
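WER, the headline metric here, is the word-level Levenshtein distance divided by the reference length; a self-contained sketch follows, with an invented Malay-like utterance pair.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via Levenshtein distance over word tokens."""
    r, h = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between r[:i] and h[:j]
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = d[i - 1][j - 1] + (r[i - 1] != h[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(r)][len(h)] / len(r)

# Invented Malay-like example: one word dropped out of five.
ref = "saya nak tanya pasal bil"
hyp = "saya nak tanya bil"
print(f"WER = {wer(ref, hyp):.0%}")   # 20% here; 28% reported for the full test set
```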
Procedia PDF Downloads 323
6781 Simultaneous Improvement of Wear Performance and Toughness of Ledeburitic Tool Steels by Sub-Zero Treatment
Authors: Peter Jurči, Jana Ptačinová, Mária Hudáková, Mária Dománková, Martin Kusý, Martin Sahul
Abstract:
Strength, hardness, and toughness (ductility) are in strong conflict for metallic materials. The only way to improve them simultaneously is microstructural refinement through cold deformation and subsequent recrystallization. However, application of this kind of treatment is impossible for high-carbon, high-alloyed ledeburitic tool steels. Alternatively, it has been demonstrated over the last few years that sub-zero treatment induces microstructural changes in these materials that might favourably influence their overall mechanical properties. The commercially available PM ledeburitic steel Vanadis 6 has been used for the current investigations. The paper demonstrates that sub-zero treatment induces clear refinement of the martensite, reduces the amount of retained austenite, enhances the population density of fine carbides, and alters the microstructural development that takes place during tempering. As a consequence, the steel manifests improved wear resistance along with higher toughness and fracture toughness. Based on the obtained results, the key question "can the wear performance be improved by sub-zero treatment simultaneously with toughness" can be answered with "definitely yes".
Keywords: ledeburitic tool steels, microstructure, sub-zero treatment, mechanical properties
Procedia PDF Downloads 319
6780 Evaluation of the Internal Quality for Pineapple Based on the Spectroscopy Approach and Neural Network
Authors: Nonlapun Meenil, Pisitpong Intarapong, Thitima Wongsheree, Pranchalee Samanpiboon
Abstract:
In Thailand, once pineapples are harvested, they must be classified into two classes based on their sweetness: sweet and unsweet. This paper studies and develops the assessment of the internal quality of pineapples using a low-cost compact spectroscopy sensor together with a neural network (NN). During the experiments, Batavia pineapples were utilized, generating 100 samples. The extracted pineapple juice of each sample was used to determine the Soluble Solid Content (SSC) for labeling into sweet and unsweet classes. In terms of experimental equipment, the sensor cover was specifically designed to hold the sensor and light source to read the reflectance at a 5 mm depth in the pineapple flesh. Using the spectroscopy sensor, data on visible and near-infrared reflectance (Vis-NIR) were collected. The NN was used to classify the pineapple classes. Before the classification step, the preprocessing methods of class balancing, data shuffling, and standardization were applied. The 510 nm and 900 nm reflectance values of the middle parts of the pineapples were used as features of the NN. With a sequential model and ReLU activation function, 100% accuracy on the training set and 76.67% accuracy on the test set were achieved. Based on these results, the low-cost compact spectroscopy sensor achieved favorable results in classifying the sweetness of the two classes of pineapples.
Keywords: neural network, pineapple, soluble solid content, spectroscopy
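A hedged Keras sketch of the described sequential ReLU network on the two reflectance features follows; the data and the sweetness rule are synthetic stand-ins, and the layer sizes are assumptions.

```python
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(6)
# Stand-in features: reflectance at 510 nm and 900 nm from the fruit's middle part.
X = rng.random((100, 2)).astype("float32")
y = (X[:, 1] > X[:, 0]).astype("float32")   # 1 = sweet, 0 = unsweet (invented rule)

# Sequential model with ReLU activations, as described in the abstract.
model = keras.Sequential([
    keras.layers.Input(shape=(2,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X[:70], y[:70], epochs=50, verbose=0)          # train split
loss, acc = model.evaluate(X[70:], y[70:], verbose=0)    # test split
print(f"test accuracy: {acc:.2%}")                       # 76.67% reported on real data
```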
Procedia PDF Downloads 79
6779 The Impacts of Negative Moral Characters on Health: An Article Review
Authors: Mansoor Aslamzai, Delaqa Del, Sayed Azam Sajid
Abstract:
Introduction: Though moral disorders have a high burden, there is no separate topic regarding this problem in the International Classification of Diseases (ICD). Alongside the revision of the WHO ICD-11, spirituality may also help prevent the rapid progression of such derangements. Objective: This study evaluated the effects of bad moral characters on health and examined the role of spirituality in the improvement of immorality. Method: This narrative review was conducted in 2020-2021, and the articles were searched through Web of Science, PubMed, BMC, and Google Scholar. Results: Based on the current review, most experimental and observational studies revealed significant negative effects of bad moral characters on all aspects of health and well-being. Many studies have now established the positive role of spirituality in the improvement of health and moral disorders. The studies concluded that facilities must be available within schools, universities, and communities for everyone to learn about spirituality and improve their moral character. Conclusion: Considering the negative relationship between bad moral characters and well-being, the current study proposes the addition of moral disorder as a separate topic in the WHO International Classification of Diseases. Based on this literature review, spirituality can improve moral disorders and help establish excellent moral traits.
Keywords: bad moral characters, effect, health, spirituality and well-being
Procedia PDF Downloads 185
6778 Evaluation of Football Forecasting Models: 2021 Brazilian Championship Case Study
Authors: Flavio Cordeiro Fontanella, Asla Medeiros e Sá, Moacyr Alvim Horta Barbosa da Silva
Abstract:
In the present work, we analyse the performance of football results forecasting models. To do so, we collected data from eight different forecasting models during the 2021 Brazilian football season. First, we guide the analysis through visual representations of the data, designed to highlight the most prominent features and enhance the interpretation of differences and similarities between the models. We propose using a 2-simplex triangle to investigate visual patterns in the results of the forecasting models. Next, we compute the expected points for every team playing in the championship and compare them to the final league standings, revealing interesting contrasts between actual and expected performances. Then, we evaluate the forecasts' accuracy using the Ranked Probability Score (RPS); the model comparison accounts for tiny scale differences that may become consistent over time. Finally, we observe that the Wisdom of Crowds principle can be appropriately applied in this context, leading to a discussion of the use of results forecasts in practice. This paper's primary goal is to encourage discussion of football forecasts' performance. We hope to accomplish this by presenting appropriate criteria and easy-to-understand visual representations that can point out the relevant factors of the subject.
Keywords: accuracy evaluation, Brazilian championship, football results forecasts, forecasting models, visual analysis
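For a three-outcome football forecast, the RPS compares the cumulative forecast and outcome distributions; the short sketch below computes it for one match.

```python
import numpy as np

def rps(forecast, outcome):
    """Ranked Probability Score for one match with ordered outcomes
    (home win, draw, away win); lower is better."""
    cdf_f = np.cumsum(forecast)
    cdf_o = np.cumsum(outcome)
    return np.sum((cdf_f - cdf_o) ** 2) / (len(forecast) - 1)

# A confident and correct forecast scores near 0 ...
print(rps([0.8, 0.15, 0.05], [1, 0, 0]))   # 0.02125
# ... while the same probabilities with an away win score much worse.
print(rps([0.8, 0.15, 0.05], [0, 0, 1]))   # 0.77125
```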
Procedia PDF Downloads 96
6777 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra
Authors: Bitewulign Mekonnen
Abstract:
Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression, partial least squares regression, extra tree regression, random forest regression, extreme gradient boosting, and principal component analysis-neural network, are employed to predict glucose concentration. The NIR spectra data are randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaged scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding references for glucose concentration are measured in increments of 20 mg/dl. The data are randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network
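The repeated random-split comparison protocol can be sketched as below with three of the six regressors (SVMR, ETR, and PLSR) from scikit-learn; the synthetic spectra are stand-ins for the measured NIR data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(8)
# Stand-in NIR spectra (200 wavelength channels) with glucose in 20 mg/dl steps.
glucose = rng.choice(np.arange(0, 420, 20), size=150).astype(float)
spectra = np.outer(glucose, rng.random(200)) / 400 + rng.normal(0, 0.01, (150, 200))

models = {
    "SVMR": SVR(C=100),
    "ETR": ExtraTreesRegressor(random_state=8),
    "PLSR": PLSRegression(n_components=5),
}
# Repeat random splits to probe generalization, as in the paper's protocol.
for name, model in models.items():
    scores = []
    for seed in range(10):
        X_tr, X_te, y_tr, y_te = train_test_split(
            spectra, glucose, test_size=0.25, random_state=seed)
        pred = model.fit(X_tr, y_tr).predict(X_te)
        scores.append(r2_score(y_te, np.ravel(pred)))
    print(f"{name}: mean R^2 = {np.mean(scores):.3f}")
```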
Procedia PDF Downloads 95
6776 Genetic Diversity of Sorghum bicolor (L.) Moench Genotypes as Revealed by Microsatellite Markers
Authors: Maletsema Alina Mofokeng, Hussein Shimelis, Mark Laing, Pangirayi Tongoona
Abstract:
Sorghum is one of the most important cereal crops grown for food, feed, and bioenergy. Knowledge of genetic diversity is important for the conservation of genetic resources and the improvement of crop plants through breeding. The objective of this study was to assess the level of genetic diversity among sorghum genotypes using microsatellite markers. A total of 103 accessions of sorghum genotypes obtained from the Department of Agriculture, Forestry and Fisheries, the African Centre for Crop Improvement, and the Agricultural Research Council-Grain Crops Institute collections in South Africa were genotyped using 30 microsatellite markers. Across all loci analysed, 306 polymorphic alleles were detected, with a mean of 6.4 per locus. The polymorphic information content had an average value of 0.50, with a mean heterozygosity of 0.55, suggesting considerable genetic diversity within the sorghum genotypes used. Unweighted pair group method with arithmetic mean (UPGMA) clustering based on Euclidean coefficients revealed two major distinct groups, without allocating genotypes based on the source of collection or origin. The genotypes 4154.1.1.1, 2055.1.1.1, 4441.1.1.1, 4442.1.1.1, 4722.1.1.1, and 4606.1.1.1 were the most diverse. The sorghum genotypes with high genetic diversity could serve as important sources of novel alleles for breeding and strategic genetic conservation.
Keywords: genetic diversity, genotypes, microsatellites, sorghum
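The reported polymorphic information content (PIC) per locus follows the standard Botstein formula; a small sketch of the computation is shown below with assumed allele frequencies.

```python
import numpy as np

def pic(allele_freqs):
    """PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2 (Botstein et al.)."""
    p = np.asarray(allele_freqs, dtype=float)
    hetero = 1.0 - np.sum(p ** 2)
    pairs = sum(2 * p[i] ** 2 * p[j] ** 2
                for i in range(len(p)) for j in range(i + 1, len(p)))
    return hetero - pairs

# A locus with 6 roughly even alleles (the study's mean alleles per locus is 6.4).
print(round(pic([1 / 6] * 6), 2))   # ~0.81 for an ideal evenly distributed locus
```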
Procedia PDF Downloads 376
6775 The Quality of Food and Drink Product Labels Translation from Indonesian into English
Authors: Rudi Hartono, Bambang Purwanto
Abstract:
The translation quality of food and drink labels from Indonesian into English is poor because the translation is not accurate, unnatural, and difficult to read. Such label translations can be found on some canned food and drink products produced and marketed by several companies in Indonesia. If this problem is left unchecked, it will lead to misunderstanding of the translations and confuse consumers. This study was conducted to analyze the translation errors on food and drink product labels and to formulate a solution for better translation quality. The research design was evaluation research with a holistic criticism approach. The data used were words, phrases, and sentences translated from Indonesian to English printed on food and drink product labels. The data were processed using Interactive Model Analysis, which carried out three main steps: collecting, classifying, and verifying data. Furthermore, the data were analyzed using content analysis to assess the accuracy, naturalness, and readability of the translation. The results showed that the translation quality of food and drink product labels from Indonesian to English has an accuracy level of 60%, a naturalness level of 50%, and a readability level of 60%. This finding calls for an effective strategy for translating food and drink product labels in the future.
Keywords: translation quality, food and drink product labels, a holistic criticism approach, interactive model, content analysis
Procedia PDF Downloads 375
6774 Nelder-Mead Parametric Optimization of Elastic Metamaterials with Artificial Neural Network Surrogate Model
Authors: Jiaqi Dong, Qing-Hua Qin, Yi Xiao
Abstract:
Some of the most fundamental challenges of elastic metamaterial (EMM) optimization can be attributed to the high computational cost of finite element analysis (FEA) simulations, which renders the optimization process inefficient. Furthermore, due to the inherent mesh dependence of FEA, minuscule geometry features, which often emerge during the later stages of optimization, induce very fine elements, resulting in enormously high time consumption, particularly when repetitive solutions are needed for computing the objective function. In this study, a surrogate modelling algorithm is developed to reduce computational time in the structural optimization of EMMs. The surrogate model is constructed based on a multilayer feedforward artificial neural network (ANN) architecture, trained with eigenfrequency data prepopulated from FEA simulations and optimized through regime selection with a genetic algorithm (GA) to improve its accuracy in predicting the location and width of the primary elastic band gap. With the optimized ANN surrogate at the core, a Nelder-Mead (NM) algorithm is established and its performance inspected in comparison to the FEA solution. The ANN-NM model shows remarkable accuracy in predicting the band gap width and reduces time consumption by 47%.
Keywords: artificial neural network, machine learning, mechanical metamaterials, Nelder-Mead optimization
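The surrogate-in-the-loop optimization can be sketched as below: an MLP is fitted to precomputed (geometry, band-gap) pairs and then queried by SciPy's Nelder-Mead instead of FEA; the toy objective and network sizes are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(9)
# Stand-in training set: 2 geometry parameters -> band-gap width from "FEA".
geom = rng.uniform(0, 1, size=(500, 2))
gap = np.sin(3 * geom[:, 0]) * np.cos(2 * geom[:, 1])   # toy objective, not real FEA

# ANN surrogate replaces the expensive FEA call inside the optimizer loop.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                         random_state=9).fit(geom, gap)

def objective(x):
    # Negative predicted gap width: Nelder-Mead minimizes.
    return -surrogate.predict(np.clip(x, 0, 1).reshape(1, -1))[0]

res = minimize(objective, x0=[0.5, 0.5], method="Nelder-Mead")
print(res.x, -res.fun)   # geometry maximizing the surrogate's band-gap width
```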
Procedia PDF Downloads 128
6773 Short Term Effects of Mobilization with Movement in a Patient with Fibromyalgia: A Case Report
Authors: S. F. Kanaan, Fatima Al-Kadi, H. Khrais
Abstract:
Background: Fibromyalgia is a chronic condition characterized by chronic pain that limits physical and functional activities. To the best of our knowledge, there is currently no key physiotherapy approach recommended to reduce pain and improve function. In addition, few studies have investigated the effect of manual therapy in the management of fibromyalgia, and no study has investigated the efficacy of Mulligan's mobilization with movement (MWM) in particular. Methods: A 51-year-old female had been diagnosed with fibromyalgia for more than a year. The patient was complaining of generalized pain including the neck, lower back, shoulders, elbows, hips, and knees. In addition, the patient reported severe limitation in activities and inability to complete her work as a lawyer. The intervention provided to the patient consisted of 4 sessions (over two weeks) of MWM for the neck, lower back, shoulders, elbows, sacroiliac joint, hips, and knees. The Visual Analogue Scale of pain (VAS), Range of Motion (ROM), 10-meter walk test, Roland Morris Low Back Pain and Disability Questionnaire (RMQ), and Disability of the Arm, Shoulder and Hand score (DASH) were collected at baseline and at the end of treatment. Results: The average improvement of ROM in the neck, lower back, shoulders, elbows, hips, and knees was 45%. The VAS changed from pre-treatment to post-treatment as follows: neck pain (9 to 0), lower back pain (8 to 1), shoulder pain (8 to 2), elbow pain (7 to 1), and knee pain (9 to 0). The patient demonstrated improvement on all functional scales from pre-intervention to post-intervention: 10-meter walk test (9.8 to 4.5 seconds), RMQ (21 to 11/24), and DASH (88.7% to 40.5%). The patient did not report any side effect of this approach. Conclusion: Fibromyalgia can cause joint 'faulty positions' leading to pain and dysfunction, which can be reversed by using MWM. MWM showed clinically significant improvement in ROM, pain, and walking ability, and a clinically significant reduction in disability, in only 4 sessions. This work can be expanded with a larger sample.
Keywords: mobilization, fibromyalgia, dysfunction, manual therapy
Procedia PDF Downloads 174
6772 Sinhala Sign Language to Grammatically Correct Sentences using NLP
Authors: Anjalika Fernando, Banuka Athuraliya
Abstract:
This paper presents a comprehensive approach for converting Sinhala Sign Language (SSL) into grammatically correct sentences using Natural Language Processing (NLP) techniques in real time. While previous studies have explored various aspects of SSL translation, the research gap lies in the absence of grammar checking for SSL. This work aims to bridge this gap by proposing a two-stage methodology that leverages deep learning models to detect signs and translate them into coherent sentences, ensuring grammatical accuracy. The first stage of the approach involves the utilization of a Long Short-Term Memory (LSTM) deep learning model to recognize and interpret SSL signs. By training the LSTM model on a dataset of SSL gestures, it learns to accurately classify and translate these signs into textual representations. The LSTM model achieves a commendable accuracy rate of 94%, demonstrating its effectiveness in accurately recognizing and translating SSL gestures. Building upon the successful recognition and translation of SSL signs, the second stage of the methodology focuses on improving the grammatical correctness of the translated sentences. The project employs a Neural Machine Translation (NMT) architecture, consisting of an encoder and decoder with LSTM components, to enhance the syntactical structure of the generated sentences. By training the NMT model on a parallel corpus of grammatically incorrect Sinhala sentences and their corresponding grammatically correct translations, it learns to generate coherent and grammatically accurate sentences. The NMT model achieves an impressive accuracy rate of 98%, affirming its capability to produce linguistically sound translations. The proposed approach offers significant contributions to the field of SSL translation and grammar correction. Addressing the critical issue of grammar checking, it enhances the usability and reliability of SSL translation systems, facilitating effective communication between hearing-impaired and non-sign language users. Furthermore, the integration of deep learning techniques, such as LSTM and NMT, ensures the accuracy and robustness of the translation process. This research holds great potential for practical applications, including educational platforms, accessibility tools, and communication aids for the hearing-impaired. Furthermore, it lays the foundation for future advancements in SSL translation systems, fostering inclusive and equal opportunities for the deaf community. Future work includes expanding the existing datasets to further improve the accuracy and generalization of the SSL translation system. Additionally, the development of a dedicated mobile application would enhance the accessibility and convenience of SSL translation on handheld devices. Furthermore, efforts will be made to enhance the current application for educational purposes, enabling individuals to learn and practice SSL more effectively. Another area of future exploration involves enabling two-way communication, allowing seamless interaction between sign-language users and non-sign-language users. In conclusion, this paper presents a novel approach for converting Sinhala Sign Language gestures into grammatically correct sentences using NLP techniques in real time. The two-stage methodology, comprising an LSTM model for sign detection and translation and an NMT model for grammar correction, achieves high accuracy rates of 94% and 98%, respectively. By addressing the lack of grammar checking in existing SSL translation research, this work contributes significantly to the development of more accurate and reliable SSL translation systems, thereby fostering effective communication and inclusivity for the hearing-impaired community.
Keywords: Sinhala sign language, sign language, NLP, LSTM, NMT
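The first stage could be sketched in Keras as below: an LSTM maps a sequence of per-frame hand-landmark vectors to a sign class. The frame count, landmark dimensionality, and class count are invented stand-ins, not the project's dataset.

```python
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(10)
# Stand-in gesture clips: 30 frames x 63 hand-landmark coordinates per sample,
# with 20 hypothetical SSL sign classes.
X = rng.normal(size=(400, 30, 63)).astype("float32")
y = rng.integers(0, 20, size=400)

# Stage 1 of the pipeline: an LSTM that maps a landmark sequence to a sign label.
model = keras.Sequential([
    keras.layers.Input(shape=(30, 63)),
    keras.layers.LSTM(64),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(20, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, validation_split=0.2, verbose=0)
print(model.predict(X[:1]).argmax())   # predicted sign id for one clip
```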
Procedia PDF Downloads 107
6771 ‘Groupitizing’ – A Key Factor in Math Learning Disabilities
Authors: Michal Wolk, Bat-Sheva Hadad, Orly Rubinsten
Abstract:
Objective: The visuospatial perception process that allows us to decompose and recompose small quantities into a whole is often called "groupitizing." Previous studies have found that adults use groupitizing processes in quantity-estimation tasks and link this ability to recognize subgroups to arithmetic proficiency. This pilot study examined whether adults with math difficulties benefit from visuospatial grouping cues when asked to estimate the quantity of a given set. It also compared the tipping point at which a significant improvement occurs in adults with typical development and in adults with math difficulties. Method: In this pilot research, we recruited adults with low arithmetic abilities and matched controls. Participants were asked to estimate the quantity of a given set. Different grouping cues were displayed (space, color, or none) with different visual configurations (different quantities-different shapes, same quantities-different shapes, same quantities-same shapes). Results: Both groups showed significant performance improvement when grouping cues appeared. However, adults with low arithmetic abilities benefited from the grouping cues already at quantities as small as four. Conclusion: Impaired perceptual groupitizing abilities may be a characteristic of low arithmetic abilities.
Keywords: groupitizing, math learning disability, quantity estimation, visual perception system
Procedia PDF Downloads 205
6770 Rice Area Determination Using Landsat-Based Indices and Land Surface Temperature Values
Authors: Burçin Saltık, Levent Genç
Abstract:
This study aimed to determine a procedure for identifying rice cultivation areas within the Thrace and Marmara regions of Turkey using remote sensing and GIS. Landsat 8 (OLI-TIRS) imagery acquired in the 2013 production season with Path/Row 181/32 was used. Four different seasonal images were generated utilizing the original bands and different transformation techniques. All images were classified individually using supervised classification techniques, and Land Use Land Cover (LULC) maps were generated with 8 classes. The area (ha, %) of each class was calculated. In addition, district-based rice distribution maps were developed, and the results of these maps were compared with the actual rice cultivation area records of the Turkish Statistical Institute (TurkSTAT; TSI). Accuracy assessments were conducted, and the most accurate map was selected based on the accuracy assessment and coherence with the TSI results. Additionally, rice areas on slopes over 4° were considered misclassified pixels and were eliminated using a slope map and GIS tools. Finally, randomized rice zones were selected to obtain the maximum-minimum value ranges of the NDVI, LSWI, and LST images for each date (May, June, July, August, and September images separately) to test whether they may be used for rice area determination via the raster calculator tool of ArcGIS. The most accurate classification for rice determination was obtained from the seasonal LSWI LULC map; considering the TSI data and accuracy assessment results, misclassified pixels were eliminated from this map. According to the results, 83151.5 ha of rice area exists within the study area. However, this result is higher than the TSI records by an area of 12702.3 ha. The use of the maximum-minimum ranges of rice-area NDVI, LSWI, and LST was tested in Meric district. Using the value ranges obtained from the July imagery gave the closest results to the TSI records, with a difference of only 206.4 ha. This difference is normal given the relatively low resolution of the images. Thus, employment of images with higher spectral, spatial, temporal, and radiometric resolutions may provide more reliable results.
Keywords: landsat 8 (OLI-TIRS), LST, LSWI, LULC, NDVI, rice
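The band indices used here are simple normalized differences; the sketch below computes NDVI and LSWI from stand-in Landsat 8 bands and applies an illustrative min-max rice mask (the threshold values are assumptions, not the ranges derived in the study).

```python
import numpy as np

def normalized_difference(a, b):
    """Generic (a - b) / (a + b) band index with a divide-by-zero guard."""
    return np.where(a + b == 0, 0.0, (a - b) / (a + b))

# Stand-in Landsat 8 surface-reflectance bands (band 4 = red, 5 = NIR, 6 = SWIR1).
rng = np.random.default_rng(12)
red, nir, swir1 = (rng.random((100, 100)) for _ in range(3))

ndvi = normalized_difference(nir, red)     # vegetation vigour
lswi = normalized_difference(nir, swir1)   # surface/canopy water content
lst = rng.uniform(290, 320, (100, 100))    # stand-in land surface temperature (K)

# Rice mask from per-date min-max ranges (threshold values are assumptions).
rice = (ndvi > 0.3) & (ndvi < 0.8) & (lswi > 0.1) & (lst < 310)
print(f"candidate rice pixels: {int(rice.sum())}")
```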
Procedia PDF Downloads 228
6769 Repair Workshop Queue System Modification Using Priority Scheme
Authors: C. Okonkwo Ugochukwu, E. Sinebe Jude, N. Odoh Blessing, E. Okafor Christian
Abstract:
In this paper, a modification of a repair workshop queuing system using a multi-priority scheme was carried out. A chi-square goodness-of-fit test was used to determine the distribution of the inter-arrival times and service times of crankshafts that come for maintenance in the workshop. The chi-square values obtained for all the prioritized classes show that the distributions conform to the Poisson distribution. The mean waiting times in queue under non-preemptive priority for the 1st, 2nd, and 3rd classes are 0.066, 0.09, and 0.224 day, respectively, while preemptive priority gives 0.007, 0.036, and 0.258 day. However, when no priority is used, which obviously has no class distinction, the mean waiting time amounts to 0.17 day. From the results, one can observe that the preemptive priority system provides a dramatic improvement over non-preemptive priority for arrivals of higher priority. However, the improvement has a detrimental effect on the low-priority class. The trend of the results is similar for the mean waiting time in the system, which adds the actual service time. Even though the no-priority mean waiting times in queue and in system are lower than those of the lowest-priority class, urgent and semi-urgent jobs would suffer terribly, most likely resulting in reneging or balking of many urgent jobs. Hence, the adoption of a priority scheme in this type of scenario will result in greater profit for the company and more customer satisfaction.
Keywords: queue, priority class, preemptive, non-preemptive, mean waiting time
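For context, the classic non-preemptive priority queue formula behind numbers like these is W_k = W0 / ((1 - σ_{k-1})(1 - σ_k)), where σ_k is the cumulative utilisation of classes 1..k; the sketch below evaluates it for assumed arrival and service rates, not the workshop's measured ones.

```python
# Non-preemptive priority M/M/1: W_k = W0 / ((1 - s_{k-1}) * (1 - s_k)),
# where s_k is the cumulative utilisation of classes 1..k and
# W0 = sum(lam_i * E[S^2]) / 2 is the mean residual service time.
# Arrival rates per class (jobs/day) and the common service rate are assumptions.
lam = [2.0, 1.5, 1.0]       # class 1 = most urgent
mu = 6.0                    # exponential service: E[S^2] = 2 / mu^2

w0 = sum(l * 2 / mu**2 for l in lam) / 2
rho = [l / mu for l in lam]

cum = 0.0
for k, r in enumerate(rho, start=1):
    wq = w0 / ((1 - cum) * (1 - cum - r))
    cum += r
    print(f"class {k}: mean wait in queue = {wq:.4f} day")
```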
Procedia PDF Downloads 398
6768 A New Approach for Solving Fractional Coupled Pdes
Authors: Prashant Pandey
Abstract:
In the present article, an effective Laguerre collocation method is used to obtain the approximate solution of a system of coupled fractional-order non-linear reaction-advection-diffusion equations with prescribed initial and boundary conditions. In the proposed scheme, Laguerre polynomials are used together with an operational matrix and the collocation method to obtain approximate solutions of the coupled system, so that the proposed model is converted into a system of algebraic equations that can be solved with the Newton method. The solution profiles of the coupled system are presented graphically for different particular cases. The salient features of the present article are the stability analysis of the proposed method and the demonstration that solute concentrations vary less with column length in the fractional-order system than in the integer-order system. To show the high efficiency, reliability, and accuracy of the proposed scheme, a comparison between the numerical results for Burgers' coupled system and its existing analytical result is reported. There is high compatibility and consistency between the approximate solution and the exact solution to a high order of accuracy. The error analysis for each case, exhibited through tables and graphs, confirms the super-linear convergence rate of the proposed method.
Keywords: fractional coupled PDE, stability and convergence analysis, diffusion equation, Laguerre polynomials, spectral method
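The generic shape of a Laguerre operational-matrix collocation scheme, in sketch notation rather than the paper's own, is shown below; collocating at suitable nodes then yields the algebraic system solved by Newton's method.

```latex
% Sketch (generic symbols, not the paper's exact notation): expand the unknown
% in truncated Laguerre polynomials and push the Caputo derivative of order
% alpha onto an operational matrix D^{(\alpha)} acting on the basis vector.
u(x,t) \approx \sum_{i=0}^{N}\sum_{j=0}^{N} c_{ij}\,L_i(x)\,L_j(t)
       = \mathbf{L}(x)^{\mathsf{T}}\, C \,\mathbf{L}(t),
\qquad
\frac{\partial^{\alpha} u}{\partial x^{\alpha}}
       \approx \bigl(D^{(\alpha)}\mathbf{L}(x)\bigr)^{\mathsf{T}} C\,\mathbf{L}(t)
```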
Procedia PDF Downloads 146