Search results for: quantification accuracy
1641 Tracking Filtering Algorithm Based on ConvLSTM
Authors: Ailing Yang, Penghan Song, Aihua Cai
Abstract:
The nonlinear maneuvering target tracking problem is mainly a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering, based on the Bayesian filtering framework, and extended Kalman filtering. However, these methods need prior knowledge such as the kinematic model and the state distribution of the system, and their performance is poor in the state estimation of non-prior complex dynamic systems. Therefore, in view of the problems of traditional algorithms, a convolutional LSTM target state estimation (SAConvLSTM-SE) algorithm based on Self-Attention Memory (SAM) is proposed to learn the historical motion states of the target and the error distribution of the measurements at the current time. The measured track-point data of airborne radar are processed into data sets. After supervised training, the data-driven deep neural network based on SAConvLSTM can directly obtain the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than existing tracking methods.
Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention
Procedia PDF Downloads 177
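For illustration only, the following is a minimal sketch of a data-driven ConvLSTM state estimator in this spirit, not the authors' SAConvLSTM-SE: it assumes the radar track points have been rasterized into small occupancy grids and uses the stock Keras ConvLSTM2D layer as a stand-in for the self-attention ConvLSTM cell; all shapes and the placeholder data are hypothetical.

```python
# Minimal ConvLSTM state-estimation sketch (stand-in for SAConvLSTM-SE).
# Assumes each time step of the radar track has been rasterized into a
# 16x16 occupancy grid; the network regresses the next state (x, y, vx, vy).
import numpy as np
import tensorflow as tf

T, H, W = 8, 16, 16                                            # time steps, grid size (hypothetical)
x_train = np.random.rand(256, T, H, W, 1).astype("float32")    # placeholder track grids
y_train = np.random.rand(256, 4).astype("float32")             # placeholder next-step states

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(T, H, W, 1)),
    tf.keras.layers.ConvLSTM2D(16, kernel_size=3, padding="same",
                               return_sequences=False),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(4),            # predicted target state at the next moment
])
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
```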
1640 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software
Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman
Abstract:
An engine test-bed system is a fundamental tool for measuring the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital Data Acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analysed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time consuming but also prone to human error. Digitizing such a measurement system requires a Data Acquisition (DAQ) system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to the vintage system arises from the need to build and integrate a digital measurement system from scratch with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens but also records the data for further processing and archiving.
Keywords: Gas turbine engine, engine test cell, data acquisition, instrumentation
Procedia PDF Downloads 123
1639 Diagnosis of Alzheimer Diseases in Early Step Using Support Vector Machine (SVM)
Authors: Amira Ben Rabeh, Faouzi Benzarti, Hamid Amiri, Mouna Bouaziz
Abstract:
Alzheimer's is a disease that affects the brain. It causes degeneration of nerve cells (neurons), in particular the cells involved in memory and intellectual functions. Early diagnosis of Alzheimer's Disease (AD) raises ethical questions, since there is, at present, no cure to offer to patients, and the medicines from therapeutic trials appear to slow the progression of the disease only moderately, with accompanying side effects that are sometimes severe. In this context, the analysis of medical images has become an essential tool for clinical applications because it provides effective assistance both at diagnosis and at therapeutic follow-up. Computer Assisted Diagnostic (CAD) systems are one of the possible solutions to efficiently manage these images. In our work, we propose an application to detect Alzheimer's disease. For detecting the disease at an early stage, we used three sections: frontal to extract the hippocampus (H), sagittal to analyze the corpus callosum (CC), and axial to work with the variation features of the cortex (C). Our classification method is based on the Support Vector Machine (SVM). The proposed system yields a 90.66% accuracy in the early diagnosis of AD.
Keywords: Alzheimer Diseases (AD), Computer Assisted Diagnostic (CAD), hippocampus, Corpus Callosum (CC), cortex, Support Vector Machine (SVM)
Procedia PDF Downloads 384
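As an illustration of the classification stage only (the feature extraction from the three MRI sections is not shown), a minimal sketch using scikit-learn's SVC follows; the feature dimensions and data are placeholders, not the study's dataset.

```python
# Sketch of the SVM classification stage, assuming feature vectors have already
# been extracted from the hippocampus, corpus callosum and cortex sections.
# The data here are random placeholders standing in for real MRI features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 30))          # 30 features per subject (hypothetical)
y = rng.integers(0, 2, size=150)        # 0 = control, 1 = early-stage AD

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```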
1638 Jordan Water District Interactive Billing and Accounting Information System
Authors: Adrian J. Forca, Simeon J. Cainday III
Abstract:
The Jordan Water District Interactive Billing and Accounting Information System is designed for the Jordan Water District to uplift the efficiency and effectiveness of its services to its customers. It is designed to compute water bills in an accurate and fast way by automating the manual process and ensuring that correct rates and fees are applied. In addition to the billing process, a mobile app will be integrated into it to support rapid and accurate water bill generation. An interactive feature will be incorporated to support electronic billing for customers who wish to receive water bills through electronic mail. The system will also improve and organize accounting processes and avoid data inaccuracy, because the data will be stored in a database designed to be logically correct through normalization. Furthermore, strict programming constraints will be applied to validate account access privileges based on job function and on the data being stored and retrieved, to ensure data security, reliability, and accuracy. The system will be able to cater to the billing and accounting services of the Jordan Water District, setting aside the manual process and adapting to modern technological innovations.
Keywords: accounting, bill, information system, interactive
Procedia PDF Downloads 251
1637 3D Numerical Investigation of Asphalt Pavements Behaviour Using Infinite Elements
Authors: K. Sandjak, B. Tiliouine
Abstract:
This article presents the main results of a three-dimensional (3-D) numerical investigation of asphalt pavement structures' behaviour using a coupled Finite Element-Mapped Infinite Element (FE-MIE) model. The validation and numerical performance of this model are assessed by comparing critical pavement responses with Burmister's solution and FEM simulation results for multi-layered elastic structures. The coupled model is then efficiently utilised to perform 3-D simulations of a typical asphalt pavement structure in order to investigate the impact of two tire configurations (conventional dual and new-generation wide-base tires) on critical pavement response parameters. The numerical results obtained show the effectiveness and the accuracy of the coupled (FE-MIE) model. In addition, the simulation results indicate that, compared with the conventional dual tire assembly, the single wide-base tire causes slightly greater asphalt fatigue cracking and subgrade rutting potential, and can thus be considered in view of its numerous mechanical, economic, and environmental benefits.
Keywords: 3-D numerical investigation, asphalt pavements, dual and wide base tires, infinite elements
Procedia PDF Downloads 215
1636 A Case Study of Deep Learning for Disease Detection in Crops
Authors: Felipe A. Guth, Shane Ward, Kevin McDonnell
Abstract:
In precision agriculture, one of the main tasks is the automated detection of diseases in crops. Machine learning algorithms have been studied in recent decades for such tasks in view of their potential to improve the economic outcomes that automated disease detection may attain over crop fields. The latest generation of deep learning convolutional neural networks has presented significant results in the area of image classification. Accordingly, this work tested the implementation of a deep learning convolutional neural network architecture for the detection of diseases in different types of crops. A data augmentation strategy was used to meet the requirements of the algorithm implemented with a deep learning framework. Two test scenarios were deployed. The first scenario trained a neural network on images extracted from a controlled environment, while the second used images from both the field and the controlled environment. The results evaluated the generalisation capacity of the neural networks in relation to the two types of images presented, yielding a general classification accuracy of 59% in scenario 1 and 96% in scenario 2.
Keywords: convolutional neural networks, deep learning, disease detection, precision agriculture
Procedia PDF Downloads 259
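A minimal sketch of the kind of pipeline described, a small convolutional network preceded by data augmentation, is shown below; the architecture, image size, class count and data are placeholders and not the network evaluated in the study.

```python
# Sketch of a small CNN with data augmentation, in the spirit of the study
# (not the authors' exact architecture); images and labels are random placeholders.
import numpy as np
import tensorflow as tf

x = np.random.rand(64, 128, 128, 3).astype("float32")   # placeholder leaf images
y = np.random.randint(0, 3, size=64)                     # 3 hypothetical disease classes

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    augment,
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=2, verbose=0)
```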
1635 High-Resolution Computed Tomography Imaging Features during Pandemic 'COVID-19'
Authors: Sahar Heidary, Ramin Ghasemi Shayan
Abstract:
With the emergence of the novel coronavirus (2019-nCoV) pneumonia, chest high-resolution computed tomography (HRCT) has been one of the main investigative tools. To achieve timely and accurate diagnosis, defining the radiological features of the infection is of great value. The purpose of this study was to consider the imaging manifestations of early-stage coronavirus disease 2019 (COVID-19) and to provide an imaging basis for the early detection of suspected cases and stratified intervention. The positive predictive value of HRCT was 85%, and the sensitivity was 73% for all patients; the total accuracy was 68%. There was no significant change in these values between symptomatic and asymptomatic persons, and the results were also independent of the time elapsed between the onset of signs or exposure and imaging. Therefore, we suggest that HRCT is an excellent adjunct for the early identification of COVID-19 pneumonia in both symptomatic and asymptomatic individuals, in addition to its role as a prognostic indicator for COVID-19 pneumonia. Patients underwent non-contrast chest HRCT examinations, and images were reconstructed in a thin 1.25 mm lung window. Images were evaluated for the presence of lung lesions, and a CT severity score was assigned separately for each patient based on the number of lung lobes involved.
Keywords: COVID-19, radiology, respiratory diseases, HRCT
Procedia PDF Downloads 142
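For reference, the reported rates follow standard confusion-matrix definitions; the helper below recomputes positive predictive value, sensitivity and accuracy from hypothetical counts chosen only to roughly reproduce the quoted figures.

```python
# Standard diagnostic metrics from a 2x2 confusion matrix; the counts below are
# hypothetical and only illustrate how PPV, sensitivity and accuracy are defined.
def diagnostic_metrics(tp, fp, tn, fn):
    ppv = tp / (tp + fp)                       # positive predictive value
    sensitivity = tp / (tp + fn)               # true positive rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return ppv, sensitivity, accuracy

# Hypothetical counts giving roughly PPV 0.85, sensitivity 0.73, accuracy 0.68.
print(diagnostic_metrics(tp=73, fp=13, tn=12, fn=27))
```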
1634 Enhancing a Recidivism Prediction Tool with Machine Learning: Effectiveness and Algorithmic Fairness
Authors: Marzieh Karimihaghighi, Carlos Castillo
Abstract:
This work studies how Machine Learning (ML) may be used to increase the effectiveness of a criminal recidivism risk assessment tool, RisCanvi. The two key dimensions of this analysis are predictive accuracy and algorithmic fairness. The ML-based prediction models obtained in this study are more accurate at predicting criminal recidivism than the manually created formula used in RisCanvi, achieving an AUC of 0.76 and 0.73 in predicting violent and general recidivism, respectively. However, the improvements are small, and it is observed that algorithmic discrimination can easily be introduced between groups such as nationals vs. foreigners, or young vs. old. It is described how effectiveness and algorithmic fairness objectives can be balanced by applying a method in which a single error disparity, in terms of generalized false positive rate, is minimized while calibration is maintained across groups. The obtained results show that this bias mitigation procedure can substantially reduce generalized false positive rate disparities across multiple groups. Based on these results, it is proposed that ML-based criminal recidivism risk prediction should not be introduced without applying algorithmic bias mitigation procedures.
Keywords: algorithmic fairness, criminal risk assessment, equalized odds, recidivism
Procedia PDF Downloads 152
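A minimal sketch of the fairness metric in question, the generalized false positive rate (mean predicted risk among observed non-recidivists) computed per group, is given below; the scores, labels and group flags are random placeholders, and this is not the RisCanvi data or the full mitigation procedure.

```python
# Minimal sketch of the fairness metric discussed above: the generalized false
# positive rate (mean predicted risk among true negatives), computed per group.
import numpy as np

def generalized_fpr(scores, labels, group):
    """Mean predicted risk over non-recidivists (labels == 0) in one group."""
    mask = (labels == 0) & group
    return scores[mask].mean()

rng = np.random.default_rng(1)
scores = rng.uniform(size=1000)                          # predicted recidivism risk
labels = rng.integers(0, 2, size=1000)                   # observed recidivism
national = rng.integers(0, 2, size=1000).astype(bool)    # hypothetical group flag

gfpr_nat = generalized_fpr(scores, labels, national)
gfpr_for = generalized_fpr(scores, labels, ~national)
print("GFPR disparity:", abs(gfpr_nat - gfpr_for))
```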
1633 Predicting Potential Protein Therapeutic Candidates from the Gut Microbiome
Authors: Prasanna Ramachandran, Kareem Graham, Helena Kiefel, Sunit Jain, Todd DeSantis
Abstract:
Microbes that reside inside the mammalian GI tract, commonly referred to as the gut microbiome, have been shown to have therapeutic effects in animal models of disease. We hypothesize that specific proteins produced by these microbes are responsible for this activity and may be used directly as therapeutics. To speed up the discovery of these key proteins from large metagenomics data sets, we have applied machine learning techniques. Using the amino acid sequences of known epitopes and their corresponding binding partners, protein interaction descriptors (PID) were calculated, forming a positive interaction set. A negative interaction dataset was calculated using sequences of proteins known not to interact with these same binding partners. Using Random Forest and the positive and negative PIDs, a machine learning model was trained and used to predict interacting versus non-interacting proteins. Furthermore, the cosine similarity of the interaction descriptors, a continuous variable, was used to rank bacterial therapeutic candidates. Laboratory binding assays were conducted to test the candidates for their potential as therapeutics. Results from the binding assays reveal the accuracy of the machine learning prediction and are subsequently used to further improve the model.
Keywords: protein-interactions, machine-learning, metagenomics, microbiome
Procedia PDF Downloads 376
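The following is a small sketch of the training and ranking steps as described, assuming the protein interaction descriptors have already been computed; all arrays are random placeholders, and the combined ranking rule is one possible choice, not necessarily the authors'.

```python
# Sketch of the training and ranking steps: a Random Forest is trained on
# positive/negative protein-interaction descriptors (PIDs), and candidates are
# ranked using cosine similarity to the known positives.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
pos = rng.normal(loc=1.0, size=(200, 50))      # PIDs of known interacting pairs
neg = rng.normal(loc=0.0, size=(200, 50))      # PIDs of non-interacting pairs
X = np.vstack([pos, neg])
y = np.array([1] * 200 + [0] * 200)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

candidates = rng.normal(size=(20, 50))                   # new bacterial protein PIDs
proba = clf.predict_proba(candidates)[:, 1]              # predicted interaction probability
sim = cosine_similarity(candidates, pos).mean(axis=1)    # similarity to known positives
ranking = np.argsort(-(proba * sim))                     # combined ranking (one possible rule)
print(ranking[:5])
```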
1632 Evaluation of Features Extraction Algorithms for a Real-Time Isolated Word Recognition System
Authors: Tomyslav Sledevič, Artūras Serackis, Gintautas Tamulevičius, Dalius Navakauskas
Abstract:
This paper presents a comparative evaluation of feature extraction algorithms for a real-time isolated word recognition system based on an FPGA. The Mel-frequency cepstral, linear frequency cepstral, linear predictive, and linear predictive cepstral coefficients were implemented in a hardware/software design. The proposed system was investigated in the speaker-dependent mode for 100 different Lithuanian words. The robustness of the feature extraction algorithms was tested by recognizing the speech records at different signal-to-noise ratios. The experiments on clean records show the highest accuracy for the Mel-frequency cepstral and linear frequency cepstral coefficients. For records with a 15 dB signal-to-noise ratio, the linear predictive cepstral coefficients give the best result. The hardware and software parts of the system are clocked at 50 MHz and 100 MHz, respectively. For the classification purpose, a pipelined dynamic time warping core was implemented. The proposed word recognition system satisfies the real-time requirements and is suitable for applications in embedded systems.
Keywords: isolated word recognition, features extraction, MFCC, LFCC, LPCC, LPC, FPGA, DTW
Procedia PDF Downloads 495
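On the software side, the pipeline can be sketched as MFCC extraction followed by dynamic time warping, as below; this is an illustrative Python sketch (using librosa and a plain DTW loop), not the FPGA implementation evaluated in the paper, and the signals are synthetic placeholders.

```python
# Software-side sketch of the evaluated pipeline: MFCC feature extraction
# followed by dynamic time warping (DTW) for template matching.
import numpy as np
import librosa

def dtw_distance(a, b):
    """Classic DTW over two (frames x coefficients) MFCC matrices."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

sr = 16000
word_a = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)        # placeholder "recordings"
word_b = np.sin(2 * np.pi * 445 * np.arange(sr) / sr)
mfcc_a = librosa.feature.mfcc(y=word_a, sr=sr, n_mfcc=13).T  # frames x coefficients
mfcc_b = librosa.feature.mfcc(y=word_b, sr=sr, n_mfcc=13).T
print("DTW distance:", dtw_distance(mfcc_a, mfcc_b))
```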
1631 Valuing Social Sustainability in Agriculture: An Approach Based on Social Outputs' Shadow Prices
Authors: Amer Ait Sidhoum
Abstract:
Interest in sustainability has gained ground among practitioners, academics and policy-makers due to growing stakeholder awareness of environmental and social concerns. This is particularly true for agriculture. However, relatively little research has been conducted on the quantification of social sustainability and the contribution of social issues to agricultural production efficiency. This research's main objective is to propose a method for evaluating the prices of social outputs, more precisely their shadow prices, by allowing for the stochastic nature of agricultural production, that is to say for production uncertainty. In this article, the assessment of social outputs' shadow prices is conducted within the methodological framework of nonparametric Data Envelopment Analysis (DEA). An output-oriented directional distance function (DDF) is implemented to represent the technology of a sample of Catalan arable crop farms and derive the efficiency scores. The overall production technology of our sample is assumed to be the intersection of two different sub-technologies. The first sub-technology models the production of random desirable agricultural outputs, while the second sub-technology reflects the social outcomes from agricultural activities. Once a nonparametric production technology has been represented, the DDF primal approach can be used for efficiency measurement, while shadow prices are drawn from the dual representation of the DDF. Computing shadow prices is a method of assigning an economic value to non-marketed social outcomes. Our research uses cross-sectional, farm-level data collected in 2015 from a sample of 180 Catalan arable crop farms specialized in the production of cereals, oilseeds and protein (COP) crops. Our results suggest that the sample farms show high performance scores, from 85% for the bad state of nature to 88% for the normal and ideal crop growing conditions. This suggests that farm performance increases with an improvement in crop growth conditions. Results also show that the average shadow prices of the desirable state-contingent output and of the social outcomes for efficient and inefficient farms are positive, suggesting that the production of desirable marketable outputs and of non-marketable outputs makes a positive contribution to farm production efficiency. Results also indicate that social outputs' shadow prices are contingent upon the growing conditions, following an upward trend as crop-growing conditions improve. This finding suggests that the efficient farms prefer to allocate more resources to the production of desirable outputs than to social outcomes. To our knowledge, this study represents the first attempt to compute shadow prices of social outcomes while accounting for the stochastic nature of the production technology. Our findings suggest that the decision-making processes of the efficient farms in dealing with social issues are stochastic and strongly dependent on the growth conditions. This implies that policy-makers should adjust their instruments according to the stochastic environmental conditions. An optimal redistribution of rural development support, increasing the public payment as crop growth conditions improve, would likely enhance the effectiveness of public policies.
Keywords: data envelopment analysis, shadow prices, social sustainability, sustainable farming
Procedia PDF Downloads 126
1630 Audio-Visual Recognition Based on Effective Model and Distillation
Authors: Heng Yang, Tao Luo, Yakun Zhang, Kai Wang, Wei Qin, Liang Xie, Ye Yan, Erwei Yin
Abstract:
Recent years have seen audio-visual recognition show great potential in strong-noise environments. Existing audio-visual recognition methods have explored ResNet-based models and feature fusion. However, on the one hand, ResNet always occupies a large amount of memory resources, restricting its application in engineering. On the other hand, feature merging also introduces interference in a high-noise environment. In order to solve these problems, we propose an effective framework with bidirectional distillation. First, in consideration of its good feature extraction performance, we chose the lightweight EfficientNet model as our spatial feature extractor. Second, self-distillation was applied to learn more information from the raw data. Finally, we propose bidirectional distillation in decision-level fusion. In more detail, our experimental results are based on a multi-modal dataset from 24 volunteers. Eventually, the lipreading accuracy of our framework was increased by 2.3% compared with existing systems, and our framework made progress in audio-visual fusion in a high-noise environment compared with an audio-only recognition system.
Keywords: lipreading, audio-visual, EfficientNet, distillation
Procedia PDF Downloads 134
1629 A Study of Adaptive Fault Detection Method for GNSS Applications
Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee
Abstract:
The purpose of this study is to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive estimation. Due to their dependence on radio-frequency signals, GNSS measurements are dominated by systematic errors arising in the receiver's operating environment. Thus, to utilize GNSS for aerospace or ground vehicles requiring a high level of safety, unhealthy measurements must be treated seriously. For this reason, this paper proposes an adaptive fault detection method to deal with unhealthy measurements in various harsh environments. In the proposed method, the test statistic for fault detection is generated from the estimated measurement noise. The pseudorange and carrier-phase measurement noise are obtained at the time propagations and measurement updates, respectively, in the process of Carrier-Smoothed Code (CSC) filtering. The performance of the proposed method was evaluated with field-collected GNSS measurements. To evaluate the fault detection capability, intentional faults were added to the measurements. The experimental result shows that the proposed detection method is efficient in detecting unhealthy measurements and improves the accuracy of GNSS positioning under fault occurrence.
Keywords: adaptive estimation, fault detection, GNSS, residual
Procedia PDF Downloads 575
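A minimal sketch of Carrier-Smoothed Code (Hatch) filtering with a simple residual-based fault test is shown below; the adaptive noise estimation of the paper is replaced here by a robust scale estimate, and the synthetic measurements and threshold are hypothetical.

```python
# Minimal sketch of carrier-smoothed code (Hatch) filtering with a simple
# residual-based fault test, in the spirit of the method described above.
import numpy as np

def hatch_filter(code, carrier, window=100):
    """Smooth the pseudorange (code) using carrier-phase deltas."""
    smoothed = np.empty_like(code)
    smoothed[0] = code[0]
    for k in range(1, len(code)):
        m = min(k + 1, window)
        predicted = smoothed[k - 1] + (carrier[k] - carrier[k - 1])
        smoothed[k] = code[k] / m + (m - 1) / m * predicted
    return smoothed

rng = np.random.default_rng(0)
true_range = 2.0e7 + 100.0 * np.arange(600)          # metres (synthetic geometry)
carrier = true_range + rng.normal(0, 0.01, 600)      # low-noise carrier phase
code = true_range + rng.normal(0, 1.0, 600)          # noisy pseudorange
code[400] += 30.0                                    # injected fault

smoothed = hatch_filter(code, carrier)
residual = code - smoothed
sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))  # robust noise estimate
test_stat = np.abs(residual) / sigma
print("epochs flagged as faulty:", np.where(test_stat > 6.0)[0])
```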
1628 Performance Prediction of a SANDIA 17-m Vertical Axis Wind Turbine Using Improved Double Multiple Streamtube
Authors: Abolfazl Hosseinkhani, Sepehr Sanaye
Abstract:
Different approaches have been used to predict the performance of vertical axis wind turbines (VAWT), such as experimental, computational fluid dynamics (CFD), and analytical methods. Analytical methods, such as momentum models that use streamtubes, have low computational cost and sufficient accuracy. The double multiple streamtube (DMST) model is one of the most commonly used momentum models; it divides the rotor plane of the VAWT into upwind and downwind halves. In fact, results from the DMST method have shown some discrepancy compared with experimental results, because the Darrieus turbine is a complex and aerodynamically unsteady configuration. In this study, analytical-experimental-based corrections, including dynamic stall, streamtube expansion, and finite blade length corrections, are used to improve the DMST method. Results indicate that using these corrections for a SANDIA 17-m VAWT leads to improved DMST results.
Keywords: vertical axis wind turbine, analytical, double multiple streamtube, streamtube expansion model, dynamic stall model, finite blade length correction
Procedia PDF Downloads 135
1627 A Different Approach to Smart Phone-Based Wheat Disease Detection System Using Deep Learning for Ethiopia
Authors: Nathenal Thomas Lambamo
Abstract:
More than 85% of the labor force and 90% of the export earnings in Ethiopia come from agriculture, so it can be said to be the backbone of the country's overall socio-economic activities. Among the cereal crops that the agriculture sector provides for the country, wheat is the third-ranking one, after teff and maize. At present, wheat is in higher demand owing to the expansion of industries that use it as the main ingredient of their products. The local supply of wheat for these companies covers only 35 to 40%, and the remaining 60 to 65% is imported, which exhausts the country's foreign currency reserves. The above facts show that the need for this crop in the country is very high while, conversely, its productivity is very low. Wheat disease is the most devastating factor contributing to this imbalance between the demand and supply of the crop. It reduces both the yield and the quality of the crop by 27% on average, and by up to 37% when severe. This study aims to detect the most frequent and damaging wheat diseases, Septoria and leaf rust, using deep learning, the most efficient subset of machine learning technology. A state-of-the-art deep learning classification technique, the Convolutional Neural Network (CNN), has been used to detect the diseases, and an accuracy of 99.01% is achieved.
Keywords: septoria, leaf rust, deep learning, CNN
Procedia PDF Downloads 76
1626 Machine Learning Approach for Mutation Testing
Authors: Michael Stewart
Abstract:
Mutation testing is a type of software testing proposed in the 1970s in which program statements are deliberately changed to introduce simple errors, so that test cases can be validated to determine whether they can detect the errors. Test cases are executed against the mutant code to determine if one fails, detects the error, and ensures the program is correct. One major issue with this type of testing was that generating and testing all possible mutations for complex programs became computationally intensive. This paper used reinforcement learning and parallel processing within the context of mutation testing for the selection of mutation operators and test cases, reducing the computational cost of testing and improving test suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement-learning-based algorithm performed with one live mutation, multiple live mutations, and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve the accuracy of its predictions. The performance was then evaluated on multi-processor computers. With reinforcement learning, the mutation operators utilized were reduced by 50-100%.
Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing
Procedia PDF Downloads 198
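As a simplified illustration of reinforcement-learning-guided operator selection (not the paper's algorithm), the sketch below uses an epsilon-greedy bandit that learns which mutation operators tend to produce killed mutants; the operators and their kill probabilities are hypothetical.

```python
# Simplified sketch of reinforcement-learning-guided mutation-operator selection:
# an epsilon-greedy bandit learns which operators tend to yield detectable
# (killed) mutants. Operators and kill probabilities below are hypothetical.
import random

operators = ["AOR", "ROR", "LCR", "SDL"]                       # hypothetical operators
kill_prob = {"AOR": 0.8, "ROR": 0.6, "LCR": 0.3, "SDL": 0.1}   # simulated environment
value = {op: 0.0 for op in operators}                          # estimated reward per operator
count = {op: 0 for op in operators}
epsilon = 0.1

for _ in range(2000):
    if random.random() < epsilon:
        op = random.choice(operators)              # explore
    else:
        op = max(operators, key=value.get)         # exploit the best-known operator
    reward = 1.0 if random.random() < kill_prob[op] else 0.0   # was the mutant killed?
    count[op] += 1
    value[op] += (reward - value[op]) / count[op]  # incremental mean update

print(sorted(value.items(), key=lambda kv: -kv[1]))
```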
1625 Tibyan Automated Arabic Correction Using Machine-Learning in Detecting Syntactical Mistakes
Authors: Ashwag O. Maghraby, Nida N. Khan, Hosnia A. Ahmed, Ghufran N. Brohi, Hind F. Assouli, Jawaher S. Melibari
Abstract:
The Arabic language is one of the most important languages. Learning it is important for many people around the world because of its religious and economic significance, and the real challenge lies in practicing it without grammatical or syntactical mistakes. This research focused on detecting and correcting syntactic mistakes in Arabic according to their position in the sentence, concentrating on two of the main syntactical rules in Arabic: the dual and the plural. It analyzes each sentence in the text using the Stanford CoreNLP morphological analyzer and a machine-learning approach in order to detect the syntactical mistakes and then correct them. A prototype of the proposed system was implemented and evaluated. It uses the support vector machine (SVM) algorithm to detect Arabic grammatical errors and corrects them using a rule-based approach. The prototype system achieves an accuracy of 81%. In general, it provides a set of useful grammatical suggestions that the user may overlook while writing, due to a lack of familiarity with grammar or to the speed of writing, such as alerting the user when a plural term is used to indicate one person.
Keywords: Arabic language acquisition and learning, natural language processing, morphological analyzer, part-of-speech
Procedia PDF Downloads 153
1624 CFD-DEM Modelling and Analysis of the Continuous Separation of Sized Particles Using Inertial Microfluidics
Authors: Hui Zhu, Yuan Wang, Shibo Kuang, Aibing Yu
Abstract:
The inertial difference induced by the microfluidics inside a curved micro-channel has great potential to provide a fast, inexpensive, and portable solution to the separation of micro- and sub-micro particles in many applications such as aerosol collection, airborne bacteria and virus detection, and particle sorting. In this work, the separation behaviours of different sized particles inside a reported curved micro-channel have been studied by a combined approach of computational fluid dynamics for the gas and a discrete element model for the particles (CFD-DEM). The micro-channel is operated by controlling the gas flow rates at all of its branches, which are respectively used to load particles, introduce gas streams, and collect particles of various sizes. The validity of the model has been examined by comparing the calculated separation efficiency of different sized particles against the measurements. On this basis, the separation mechanisms of the inertial microfluidic separator are elucidated in terms of the interactions between particles, between particle and fluid, and between particle and wall. The model is then used to study the effect of feed solids concentration on the separation accuracy and efficiency. The results obtained from the present study demonstrate that the CFD-DEM approach can provide a convenient way to study particle separation behaviours in micro-channels of various types.
Keywords: CFD-DEM, inertial effect, microchannel, separation
Procedia PDF Downloads 292
1623 Neural Networks for Distinguishing the Performance of Two Hip Joint Implants on the Basis of Hip Implant Side and Ground Reaction Force
Authors: L. Parisi
Abstract:
In this research work, neural networks were applied to classify two types of hip joint implants based on the relative hip joint implant side speed and three components of each ground reaction force. The condition of walking gait at normal velocity was used and carried out with each of the two hip joint implants assessed. The temporal changes of the ground reaction forces were considered in the first approach but discarded in the second one. Ground reaction force components were obtained from eighteen patients under this gait condition, half of whom had a type I-II hip implant, whilst the other half had the hip implant defined as type III by Orthoload®. After pre-processing the raw gait kinetic data and selecting the time frames needed for the analysis, the ground reaction force components were used to train an MLP neural network, which learnt to distinguish the two hip joint implants in the abovementioned condition. Following training, unknown hip implant side and ground reaction force components were presented to the neural networks, which assigned those features to the right class with reasonably high accuracy for both the type I-II and the type III hip implants. The results suggest that neural networks could be successfully applied in the performance assessment of hip joint implants.
Keywords: kinetic gait data, neural networks, hip joint implant, hip arthroplasty, rehabilitation engineering
Procedia PDF Downloads 354
1622 Sacidava and Its Role of Military Outpost in the Moesian Sector of the Danube Limes: Animal Food Resources and Landscape Reconstruction
Authors: Margareta Simina Stanc, Aurel Mototolea, Tiberiu Potarniche
Abstract:
The Sacidava archaeological site is located in the Dobrudja region, Romania, on a hill on the right bank of the Danube, at the Musait point, about 5 km north-east of Dunareni village. The place-name documents the fact that, prior to the Roman conquest, there was a Getic settlement in the area. The location of Sacidava was established by corroborating the data provided by the ancient sources with the epigraphic documents (the milliary pillar from the time of Emperor Decius). The tegular findings attest that an infantry unit, cohors I Cilicum milliaria equitata, as well as detachments from Legio V Macedonica and Legio XI Claudia, were stationed at Sacidava. During the period of the Dominate, the garrison of the fortification hosted a cavalry unit: cuneus equitum scutariorum. In the immediate vicinity of the Roman fortress, to the east, two other fortifications were identified: a Getic settlement (4th-1st century B.C.) and an Early Medieval settlement (9th-10th century A.D.). The archaeological material recovered during the research is represented by ceramic forms such as amphorae, jugs, pots, cups, and plates, to which are added oil lamps, some of them typologically new at the time of discovery. Local ceramic shapes were also found, worked by hand or on the wheel, considered un-Romanized or in the course of Romanization. During the time of the Principate, Sacidava represented an important military outpost serving mainly the city of Tropaeum Traiani and also controlling supply and transport on the Danube limes in the Moesian sector. This role determined the development of the fortress and the appearance of extramuros civil structures; it thus became an important landmark during the 5th-6th centuries A.D. and a representation of the power of the Roman Empire in an area of continuous conflict. During recent archaeological research, faunal remains were recovered, and their analysis made it possible to estimate the animal resources and subsistence practices (animal husbandry, hunting, fishing) in the settlement. The methodology was specific to archaeozoology, mainly consisting of anatomical, taxonomical, and taphonomical identifications, recording, and quantification of the data. The remains of domestic mammals have the highest proportion, indicating the importance of animal husbandry; the predominant species are Bos taurus, Ovis aries/Capra hircus, and Sus domesticus. Fishing and hunting were of secondary importance in the subsistence economy of the community; the wild boar and the red deer were the most frequently hunted species, and just a few fish bones were recovered. Thus, the ancient city of Sacidava is proving to be an important element of the cultural heritage of south-eastern Romania, for whose conservation and enhancement efforts must be made, especially through landscape reconstruction.
Keywords: archaeozoology, landscape reconstruction, limes, military outpost
Procedia PDF Downloads 324
1621 Content-Based Image Retrieval Using HSV Color Space Features
Authors: Hamed Qazanfari, Hamid Hassanpour, Kazem Qazanfari
Abstract:
In this paper, a method is provided for content-based image retrieval. A content-based image retrieval system searches an image database for the images most similar to a query image on the basis of its visual content. In this paper, with the aim of simulating the human visual system's sensitivity to an image's edges and color features, the concept of the color difference histogram (CDH) is used. The CDH encodes the perceptual color difference between two neighboring pixels with regard to colors and edge orientations. Since the HSV color space is close to the human visual system, the CDH is calculated in this color space. In addition, to improve the color features, the color histogram in HSV color space is also used as a feature. Among the extracted features, efficient features are selected using entropy and correlation criteria, so that the final features describe the content of the images most efficiently. The proposed method has been evaluated on three standard databases: Corel 5k, Corel 10k, and UKBench. Experimental results show that the accuracy of the proposed image retrieval method is significantly improved compared to recently developed methods.
Keywords: content-based image retrieval, color difference histogram, efficient features selection, entropy, correlation
Procedia PDF Downloads 249
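A simplified illustration of the color-difference-histogram idea in HSV space follows; it is not the full CDH descriptor of the paper (edge orientations and feature selection are omitted), and the bin counts and test image are arbitrary.

```python
# Simplified illustration of the color-difference-histogram idea in HSV space:
# for each quantized HSV color, the difference to the right-hand neighbour
# pixel is accumulated into a histogram.
import numpy as np
import cv2

def simple_cdh(bgr_image, bins=(8, 4, 4)):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV).astype(np.float32)
    # Quantize the H, S, V channels into a single color index per pixel.
    h_idx = (hsv[..., 0] / 180.0 * bins[0]).astype(int).clip(0, bins[0] - 1)
    s_idx = (hsv[..., 1] / 256.0 * bins[1]).astype(int)
    v_idx = (hsv[..., 2] / 256.0 * bins[2]).astype(int)
    color_idx = (h_idx * bins[1] + s_idx) * bins[2] + v_idx

    # Color difference between horizontally neighbouring pixels.
    diff = np.linalg.norm(hsv[:, 1:, :] - hsv[:, :-1, :], axis=2)
    hist = np.zeros(bins[0] * bins[1] * bins[2])
    np.add.at(hist, color_idx[:, :-1].ravel(), diff.ravel())
    return hist / (hist.sum() + 1e-9)

img = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)   # placeholder image
print(simple_cdh(img)[:10])
```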
1620 Numerical Study of Steel Structures Responses to External Explosions
Authors: Mohammad Abdallah
Abstract:
Due to the constant increase in terrorist attacks, the research and engineering communities have given significant attention to building performance under explosions. This paper presents a methodology for studying and simulating the dynamic responses of steel structures during external detonations, particularly for accurately investigating the impact of increasing charge weight on the members' overall behavior, resistance and failure. A damage prediction method was introduced to evaluate the damage level of the steel members based on five explosion scenarios. The Johnson-Cook strength and failure model has been used, together with the ABAQUS finite element code, to carry out the explicit dynamic analysis, and previous field tests were used to verify the acceptability and accuracy of the proposed material strength and failure model. Based on the structural response and on evaluation criteria such as deflection, vertical displacement, drift index, and damage level, the obtained results show the vulnerability of steel columns and un-braced steel frames, which are designed and optimized to carry dead and live loads, when required to resist and endure blast loading.
Keywords: steel structure, blast load, terrorist attacks, charge weight, damage level
Procedia PDF Downloads 364
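For reference, the Johnson-Cook strength model mentioned above can be written as a small function, as in the sketch below; the parameter values are illustrative only and are not those calibrated in the paper or the field tests.

```python
# The Johnson-Cook strength model written as a small function. Parameter values
# are illustrative only (not those calibrated in the paper or the field tests).
import math

def johnson_cook_stress(eps_p, eps_rate, temp,
                        A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                        eps_rate_ref=1.0, t_room=293.0, t_melt=1800.0):
    """Flow stress [Pa]: (A + B*eps^n) * (1 + C*ln(rate*)) * (1 - T*^m)."""
    strain_term = A + B * eps_p ** n
    rate_term = 1.0 + C * math.log(max(eps_rate / eps_rate_ref, 1e-12))
    t_star = (temp - t_room) / (t_melt - t_room)
    thermal_term = 1.0 - max(t_star, 0.0) ** m
    return strain_term * rate_term * thermal_term

print(johnson_cook_stress(eps_p=0.05, eps_rate=100.0, temp=400.0) / 1e6, "MPa")
```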
1619 Time-Dependent Behavior of Damaged Reinforced Concrete Shear Walls Strengthened with Composite Plates Having Variable Fibers Spacing
Authors: Redha Yeghnem, Laid Boulefrakh, Sid Ahmed Meftah, Abdelouahed Tounsi, El Abbas Adda Bedia
Abstract:
In this study, the time-dependent behavior of damaged reinforced concrete shear wall structures strengthened with composite plates having variable fiber spacing was investigated to analyze their seismic response. In the analytical formulation, the adherend and the adhesive layers are all modeled as shear walls, using the mixed finite element method (FEM). An anisotropic damage model is adopted to describe the damage extent of the RC shear walls. The creep and shrinkage of the concrete have been determined according to Eurocode 2. Large earthquakes recorded in Algeria (El-Asnam and Boumerdes) have been used to demonstrate the accuracy of the proposed method. Numerical results are obtained for non-uniform distributions of carbon fibers in epoxy matrices. The effects of the damage extent and of the delayed mechanisms of creep and shrinkage of the concrete are highlighted, and further prospects are under study.
Keywords: RC shear wall structures, composite plates, creep and shrinkage, damaged reinforced concrete structures, finite element method
Procedia PDF Downloads 365
1618 Applied Complement of Probability and Information Entropy for Prediction in Student Learning
Authors: Kennedy Efosa Ehimwenma, Sujatha Krishnamoorthy, Safiya Al‑Sharji
Abstract:
The probabilities of events lie in the interval [0, 1], with values determined by the outcomes of the events in a sample space S. The probability Pr(A) that an event A will never occur is 0, and the probability Pr(B) that an event B will certainly occur is 1; this makes both events A and B certainties. Furthermore, the sum of probabilities Pr(E₁) + Pr(E₂) + … + Pr(Eₙ) of a finite set of mutually exclusive and exhaustive events in a given sample space S equals 1. Conversely, the difference between two probabilities of events that will certainly occur is 0. This paper first discusses Bayes' rule, the complement of probability, and the difference of probabilities for occurrences of learning events before applying them to the prediction of learning objects in student learning. Given that the probabilities sum to 1, and in order to make a recommendation for student learning, this paper proposes that the difference between argmax Pr(S) and the probability of student performance quantifies the weight of learning objects for students. Using a skill-set dataset, the computational procedure demonstrates: i) the probability of skill-set events that have occurred and that would lead to higher-level learning; ii) the probability of the events that have not occurred and that require subject-matter relearning; iii) the accuracy of the decision tree in predicting student performance into class labels; and iv) the information entropy of the skill-set data and its implications for student cognitive performance and the recommendation of learning.
Keywords: complement of probability, Bayes' rule, prediction, pre-assessments, computational education, information theory
Procedia PDF Downloads 161
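A short worked illustration of these quantities, the complement of a probability, the sum-to-one property, and the Shannon entropy of a skill-set distribution, is given below with hypothetical probabilities.

```python
# Worked illustration of the quantities discussed above: complement of
# probability, the sum-to-one property, and Shannon entropy of a hypothetical
# skill-set distribution.
import math

p_skills = {"prerequisite_passed": 0.55, "needs_relearning": 0.30, "unassessed": 0.15}

assert abs(sum(p_skills.values()) - 1.0) < 1e-9        # probabilities sum to 1

p_pass = p_skills["prerequisite_passed"]
p_not_pass = 1.0 - p_pass                              # complement of probability
weight = max(p_skills.values()) - p_pass               # argmax Pr(S) minus performance

entropy = -sum(p * math.log2(p) for p in p_skills.values())   # Shannon entropy (bits)
print(p_not_pass, weight, round(entropy, 3))
```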
1617 Integrating Wound Location Data with Deep Learning for Improved Wound Classification
Authors: Mouli Banga, Chaya Ravindra
Abstract:
Wound classification is a crucial step in wound diagnosis. An effective classifier can aid wound specialists in identifying wound types with reduced financial and time investments, facilitating the determination of optimal treatment procedures. This study presents a deep-neural-network-based classifier that leverages wound images and their corresponding locations to categorize wounds into various classes, such as diabetic, pressure, surgical, and venous ulcers. By incorporating a developed body map, the process of tagging wound locations is significantly enhanced, providing healthcare specialists with a more efficient tool for wound analysis. We conducted a comparative analysis between two prominent convolutional neural network models, ResNet50 and MobileNetV2, utilizing a dataset of 730 images. Our findings reveal that ResNet50 outperforms MobileNetV2, achieving an accuracy of approximately 90%, compared to MobileNetV2's 83%. This disparity highlights the superior capability of ResNet50 in the context of this dataset. The results underscore the potential of integrating deep learning with spatial data to improve the precision and efficiency of wound diagnosis, ultimately contributing to better patient outcomes and reduced healthcare costs.
Keywords: wound classification, MobileNetV2, ResNet50, multimodel
Procedia PDF Downloads 32
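A sketch of a two-input classifier in this spirit is shown below: a CNN backbone (ResNet50 here, with MobileNetV2 as a drop-in alternative) processes the wound image while a one-hot body-map location vector is concatenated before the classification head; the shapes, class count, location encoding and untrained weights are assumptions, not the study's implementation.

```python
# Sketch of a two-input wound classifier: a CNN backbone for the image plus a
# one-hot body-map location vector, merged before the classification head.
import tensorflow as tf

NUM_CLASSES = 4          # e.g. diabetic, pressure, surgical, venous
NUM_LOCATIONS = 20       # hypothetical number of body-map regions

image_in = tf.keras.Input(shape=(224, 224, 3), name="wound_image")
loc_in = tf.keras.Input(shape=(NUM_LOCATIONS,), name="wound_location")

backbone = tf.keras.applications.ResNet50(include_top=False, weights=None,
                                           pooling="avg")
# backbone = tf.keras.applications.MobileNetV2(include_top=False, weights=None,
#                                              pooling="avg")   # alternative backbone

features = backbone(image_in)
merged = tf.keras.layers.Concatenate()([features, loc_in])
hidden = tf.keras.layers.Dense(128, activation="relu")(merged)
output = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(hidden)

model = tf.keras.Model(inputs=[image_in, loc_in], outputs=output)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```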
1616 Clustering Categorical Data Using the K-Means Algorithm and the Attribute's Relative Frequency
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithm. In this context, two algorithms are commonly used: k-means for clustering numeric datasets and k-modes for categorical datasets. A main problem encountered in data mining applications is the clustering of categorical datasets, which are very common in practice. One way to achieve the clustering of categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of the k-modes. In this paper, an approach based on this idea is proposed and experimented with, transforming the categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that the proposed method outperforms the binary method in all cases.
Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means
Procedia PDF Downloads 259
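A minimal sketch of the proposed transformation follows: each categorical value is replaced by the relative frequency of its modality within the attribute, and standard k-means is then applied; the toy dataset is hypothetical.

```python
# Sketch of the proposed transformation: each categorical value is replaced by
# the relative frequency of that modality within its attribute, after which the
# standard k-means algorithm is applied. The toy dataset is hypothetical.
import pandas as pd
from sklearn.cluster import KMeans

df = pd.DataFrame({
    "color": ["red", "blue", "red", "green", "blue", "red"],
    "shape": ["circle", "circle", "square", "square", "circle", "square"],
})

# Replace each modality by its relative frequency within the attribute.
encoded = df.apply(lambda col: col.map(col.value_counts(normalize=True)))

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(encoded)
print(encoded.assign(cluster=labels))
```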
1615 Superparamagnetic Core Shell Catalysts for the Environmental Production of Fuels from Renewable Lignin
Authors: Cristina Opris, Bogdan Cojocaru, Madalina Tudorache, Simona M. Coman, Vasile I. Parvulescu, Camelia Bala, Bahir Duraki, Jeroen A. Van Bokhoven
Abstract:
The tremendous achievements in the development of society, embodied in ever more sophisticated materials and systems, are largely based on non-renewable resources. Consequently, after more than two centuries of intensive development, we are faced with, among other issues, the decrease of fossil fuel reserves, an increased impact of greenhouse gases on the environment, and economic effects caused by fluctuations in oil and mineral resource prices. The use of biomass may solve part of these problems, and recent analyses demonstrated that, from the perspective of reducing carbon dioxide emissions, its valorization may bring important advantages, provided that genetically modified fast-growing trees or wastes are used as primary sources. In this context, the abundance and complex structure of lignin may offer various possibilities of exploitation. However, its transformation into fuels or chemicals involves complex chemistry, including the cleavage of C-O and C-C bonds and the alteration of functional groups. Chemistry has offered various solutions in this sense; however, despite the intense work, there are still many drawbacks limiting industrial application. Thus, the proposed technologies have mainly considered homogeneous catalysts, meaning expensive noble-metal-based systems that are hard to recover at the end of the reaction. Also, the reactions were carried out in organic solvents, which are not acceptable today from the environmental point of view. To avoid these problems, the concept of this work was to investigate the synthesis of superparamagnetic core-shell catalysts for the fragmentation of lignin directly in the aqueous phase. The magnetic nanoparticles were covered with a nanoshell of an oxide (niobia) with a double role: to protect the magnetic nanoparticles and to generate a proper (acidic) catalytic function; on this composite, cobalt nanoparticles were deposited in order to catalyze the C-C bond splitting. With this purpose, we developed a protocol to prepare multifunctional and magnetically separable nano-composite Co@Nb2O5@Fe3O4 catalysts. We have also established an analytical protocol for the identification and quantification of the fragments resulting from lignin depolymerization in both the liquid and the solid phase. The fragmentation of various lignins occurred on the prepared materials in high yields and with very good selectivity towards the desired fragments. The optimization of the catalyst composition indicated a cobalt loading of 4 wt% as optimal. Working at 180 °C and 10 atm H2, this catalyst allowed a lignin conversion of up to 60%, leading to a mixture containing over 96% C20-C28 and C29-C37 fragments that were then completely fragmented to C12-C16 in a second stage. The investigated catalysts were completely recyclable, and no leaching of the constituent elements was detected by inductively coupled plasma optical emission spectrometry (ICP-OES).
Keywords: superparamagnetic core-shell catalysts, environmental production of fuels, renewable lignin, recyclable catalysts
Procedia PDF Downloads 328
1614 MITOS-RCNN: Mitotic Figure Detection in Breast Cancer Histopathology Images Using Region Based Convolutional Neural Networks
Authors: Siddhant Rao
Abstract:
Studies estimate that there will be 266,120 new cases of invasive breast cancer and 40,920 breast-cancer-induced deaths in the year 2018 alone. Despite the pervasiveness of this affliction, the current process for obtaining an accurate breast cancer prognosis is tedious and time consuming. It usually requires a trained pathologist to manually examine histopathological images and identify the features that characterize the various cancer severity levels. We propose MITOS-RCNN: a region-based convolutional neural network (RCNN) geared towards small object detection, for accurately grading one of the three factors that characterize tumor aggressiveness as described by the Nottingham Grading System: the mitotic count. Other computational approaches to mitotic figure counting and detection do not demonstrate sufficient recall or precision to be clinically viable. Our model outperformed all previous participants in the ICPR 2012 challenge, the AMIDA 2013 challenge and the MITOS-ATYPIA-14 challenge, along with recently published works, achieving an F-measure score of 0.955, a 6.11% improvement over the most accurate of the previously proposed models.
Keywords: breast cancer, mitotic count, machine learning, convolutional neural networks
Procedia PDF Downloads 223
1613 Insights of Interaction Studies between HSP-60, HSP-70 Proteins and HSF-1 in Bubalus bubalis
Authors: Ravinder Singh, C Rajesh, Saroj Badhan, Shailendra Mishra, Ranjit Singh Kataria
Abstract:
Heat shock proteins 60 and 70 are crucial chaperones that guide the appropriate folding of denatured proteins under heat stress conditions, providing assistance in the correct folding of a multitude of denatured proteins. The heat shock factors are a family of transcription factors that regulate the expression of the proteins involved in the folding of damaged or improperly folded proteins under stress conditions. Under normal conditions, heat shock proteins bind to HSF-1 and act as its repressor, as well as helping to maintain HSF-1 in its inactive, monomeric conformation. The experimental structures of these proteins in Bubalus bubalis have not been determined to date. Therefore, a computational approach was explored for the three-dimensional structure analysis of all these proteins. In this study, an extensive in silico analysis has been performed, from sequence comparison among species to comparative modeling of the Bubalus bubalis HSP60, HSP70 and HSF-1 proteins. The stereochemical properties of the models were assessed using several bioinformatics validation tools to ensure model accuracy. Further, a docking approach was used to study the interactions between the heat shock proteins and HSF-1.
Keywords: Bubalus bubalis, comparative modelling, docking, heat shock protein
Procedia PDF Downloads 322
1612 Churn Prediction for Savings Bank Customers: A Machine Learning Approach
Authors: Prashant Verma
Abstract:
Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility and digital modes of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, out of the various machine learning models, Random Forest, which predicts churn with 78% accuracy, has been found to be the most powerful model for this scenario. Customer vintage, customer age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks in order to reduce customer churn, so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. Hence, by providing better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling
Procedia PDF Downloads 143
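A minimal sketch of the modelling approach described in the last abstract above, random under-sampling of the majority class followed by a Random Forest, is given below; the customer features and class balance are random placeholders, not the bank's data.

```python
# Sketch of the churn modelling approach: the majority (non-churn) class is
# randomly under-sampled before fitting a Random Forest. The customer data are
# random placeholders for features such as vintage, age and average balance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 7))                      # 7 hypothetical customer features
y = (rng.uniform(size=5000) < 0.1).astype(int)      # ~10% churners (imbalanced)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

# Random under-sampling of the majority (non-churn) class.
churn_idx = np.where(y_tr == 1)[0]
stay_idx = rng.choice(np.where(y_tr == 0)[0], size=len(churn_idx), replace=False)
keep = np.concatenate([churn_idx, stay_idx])

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_tr[keep], y_tr[keep])
print("held-out accuracy:", clf.score(X_te, y_te))
```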