Search results for: immunoenzyme techniques
5941 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful data from enormous and complicated datasets can be termed "big data." Big data mining and analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, improving sales, etc., because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining, its process, and its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study also presents recently developed data mining, data analysis, and knowledge discovery techniques together with practical application systems. The conclusion lists open issues and difficulties for further research in the area and discusses the main big data and data mining challenges facing management.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 110
5940 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of enterprises is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are used in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining has only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of the production scheduling software system using process mining techniques, since every software system generates event logs for later use in security investigation, auditing, and debugging. This study proposes a process mining approach for validating the goodness of production schedules generated by scheduling software systems.
By using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the workload balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of generated production schedules, the quality of production schedules in manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
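The workstation-utilization criterion described above can be sketched directly from raw event-log data. The record layout, timestamps, and scheduling horizon below are illustrative assumptions, not the schema of any particular scheduling system:

```python
from datetime import datetime

# A toy event log as a scheduling system might record it:
# (job, workstation, start, end).
log = [
    ("job1", "WS1", "2024-01-01 08:00", "2024-01-01 09:00"),
    ("job2", "WS1", "2024-01-01 09:00", "2024-01-01 11:00"),
    ("job1", "WS2", "2024-01-01 09:00", "2024-01-01 09:30"),
]

def utilization(log, horizon_hours):
    """Busy hours per workstation divided by the scheduling horizon."""
    busy = {}
    for _, ws, start, end in log:
        hours = (datetime.fromisoformat(end)
                 - datetime.fromisoformat(start)).total_seconds() / 3600
        busy[ws] = busy.get(ws, 0.0) + hours
    return {ws: h / horizon_hours for ws, h in busy.items()}

util = utilization(log, horizon_hours=8)
```

Over an eight-hour horizon, WS1 is busy for three hours and WS2 for half an hour, so a persistently high ratio flags a bottleneck candidate.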
Procedia PDF Downloads 279
5939 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNN) have demonstrated high performance in image analysis, but oftentimes only structured data are available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to analyze multimodal data, e.g., both structured and unstructured information, offers significant advantages in time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material is a promising way to apply CNNs to multimodal datasets, which often occur in a medical context. Using suitable preprocessing techniques, structured data are transformed into image representations in which the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The resulting image is then analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
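A minimal sketch of the tabular-to-image step can illustrate the idea: min-max scale each feature and lay the record out as a square grayscale grid. This is one common encoding, not the paper's specific preprocessing; the scaling and padding choices are assumptions:

```python
import math

def row_to_image(row, mins, maxs, side=None):
    """Min-max scale one tabular record to [0, 255] grayscale values
    and lay it out as a square pixel grid, zero-padding unused cells."""
    scaled = [
        int(255 * (v - lo) / (hi - lo)) if hi > lo else 0
        for v, lo, hi in zip(row, mins, maxs)
    ]
    n = side or math.ceil(math.sqrt(len(scaled)))
    scaled += [0] * (n * n - len(scaled))
    return [scaled[i * n:(i + 1) * n] for i in range(n)]

# Three features become a 2x2 "image" with one padded pixel.
img = row_to_image([0.5, 2.0, 10.0], mins=[0, 0, 0], maxs=[1, 4, 10])
```

The resulting grid can then be stacked with, or placed alongside, the existing image channels before being fed to the CNN.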
Procedia PDF Downloads 123
5938 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system that can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. A peaberry is not a defective bean, but it is not a normal bean either: it develops as a single, relatively round seed inside a coffee cherry instead of the usual flat-sided pair of beans, and it has its own value and flavor. To improve the taste of the coffee, it is necessary to separate the peaberries and normal beans before the green coffee beans are roasted; otherwise, the tastes mix and the overall quality suffers. During roasting, the beans should be uniform in shape, size, and weight; otherwise, larger beans take more time to roast through. Peaberries have a different size and shape even though they have the same weight as normal beans, and they roast more slowly than normal beans. Therefore, neither size- nor weight-based sorting provides a good option for selecting peaberries. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually by hand. On the other hand, picking out peaberries is very difficult even for trained specialists because the shape and color of a peaberry are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate normal and peaberry beans as part of the sorting system. As the first step, we applied Deep Convolutional Neural Networks (CNN) and Support Vector Machines (SVM) as machine learning techniques to discriminate peaberries from normal beans. As a result, better performance was obtained with CNN than with SVM for the discrimination of peaberries. The network trained in this work on a high-performance CPU and GPU will then be installed on an inexpensive, computationally modest Raspberry Pi system.
We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine
Procedia PDF Downloads 144
5937 Study of Two MPPTs for Photovoltaic Systems Using Controllers Based in Fuzzy Logic and Sliding Mode
Authors: N. Ould cherchali, M. S. Boucherit, L. Barazane, A. Morsli
Abstract:
Photovoltaic power is widely used to supply isolated or unpopulated areas (lighting, pumping, etc.). Its great advantages are that the source is inexhaustible, safe to use, and clean. However, the dynamic models used to describe a photovoltaic system are complicated and nonlinear, and because of the nonlinear I-V and P-V characteristics of photovoltaic generators, a maximum power point tracking (MPPT) technique is required to maximize the output power. In this paper, two online MPPT techniques using robust controllers for photovoltaic systems are proposed: the first uses a fuzzy logic controller (FLC) and the second a sliding mode controller (SMC). Both maximum power point tracking controllers receive the partial derivative of power as input, and the output is the duty cycle corresponding to maximum power. A photovoltaic generator with a boost converter is developed using MATLAB/Simulink to verify the performance of the proposed techniques. The SMC technique provides good tracking speed under fast-changing irradiation, while under slowly changing or constant irradiation, the panel power with the FLC technique is a much smoother signal with fewer fluctuations.
Keywords: fuzzy logic controller, maximum power point, photovoltaic system, tracker, sliding mode controller
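The core idea shared by both controllers, using the sign of the partial derivative of power to steer the duty cycle toward the maximum power point, can be sketched with a crude perturb-style update. This is an illustrative stand-in, not the paper's FLC or SMC law; the step size and the boost-converter sign convention are assumptions:

```python
def mppt_step(duty, p_now, p_prev, v_now, v_prev, step=0.01):
    """One tracking update driven by the sign of dP/dV: left of the MPP
    (dP/dV > 0), raise the panel voltage by lowering the boost-converter
    duty cycle; right of it, do the opposite."""
    dp, dv = p_now - p_prev, v_now - v_prev
    if dv == 0:
        return duty  # no voltage change, keep the operating point
    duty += -step if dp / dv > 0 else step
    return min(max(duty, 0.0), 1.0)  # duty cycle stays in [0, 1]

new_duty = mppt_step(0.5, p_now=52.0, p_prev=50.0, v_now=17.0, v_prev=16.5)
```

Fuzzy logic replaces the fixed step with a rule-based variable step, and sliding mode drives the operating point onto the surface dP/dV = 0, which is why SMC tracks faster under rapidly changing irradiation.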
Procedia PDF Downloads 547
5936 Comparing Image Processing and AI Techniques for Disease Detection in Plants
Authors: Luiz Daniel Garay Trindade, Antonio De Freitas Valle Neto, Fabio Paulo Basso, Elder De Macedo Rodrigues, Maicon Bernardino, Daniel Welfer, Daniel Muller
Abstract:
Agriculture plays an important role in society since it is one of the main sources of food in the world. To help the production and yield of crops, precision agriculture makes use of technologies aimed at improving the productivity and quality of agricultural commodities. One of the problems hampering the quality of agricultural production is disease affecting crops. Failure to detect diseases quickly can result in small or large damage to production, causing financial losses to farmers. In order to provide a map of the contributions devoted to the early detection of plant diseases and a comparison of the accuracy of the selected studies, a systematic literature review was performed, covering techniques for digital image processing and neural networks. We found 35 interesting tool-support alternatives for detecting diseases in 19 plant species. Our comparison of these studies resulted in an overall average accuracy of 87.45%, with two studies coming very close to 100%.
Keywords: pattern recognition, image processing, deep learning, precision agriculture, smart farming, agricultural automation
Procedia PDF Downloads 378
5935 Process for Separating and Recovering Materials from Kerf Slurry Waste
Authors: Tarik Ouslimane, Abdenour Lami, Salaheddine Aoudj, Mouna Hecini, Ouahiba Bouchelaghem, Nadjib Drouiche
Abstract:
Slurry waste is a byproduct generated from the slicing process of multi-crystalline silicon ingots. This waste can be used as a secondary resource to recover high-purity silicon, which has great economic value. From the management perspective, the ever-increasing generation of kerf slurry waste leads to significant challenges for the photovoltaic industry due to the currently low use of slurry waste for silicon recovery. Slurry waste, in most cases, contains silicon, silicon carbide, metal fragments, and a mineral-oil-based or glycol-based slurry vehicle. As a result of the global scarcity of high-purity silicon supply, the high-purity silicon content in slurry has increasingly attracted research interest. This paper presents a critical overview of the current techniques employed for high-purity silicon recovery from kerf slurry waste. Hydrometallurgy remains a continuous matter of study and research; in addition, this review introduces several new techniques for recovering high-purity silicon from slurry waste. The purpose of the information presented is to support the development of a clean and effective process for recovering high-purity silicon from slurry waste.
Keywords: kerf-loss, slurry waste, silicon carbide, silicon recovery, photovoltaic, high purity silicon, polyethylene glycol
Procedia PDF Downloads 310
5934 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution
Authors: Telesphore Tiendrebeogo, Oumarou Sié
Abstract:
Cloud computing technologies have attracted considerable interest in recent years and have become increasingly important for many existing database applications. Cloud computing provides a new mode of using and offering IT resources in general. Such resources can be used "on demand" by anybody who has access to the internet. In particular, a Cloud platform provides an easy-to-use interface between providers and users, allowing providers to develop and deliver software and databases to users across locations. Many Cloud platform providers currently support large-scale database services. However, most of them only support simple keyword-based queries and cannot answer complex queries efficiently because they lack efficient multi-attribute indexing techniques. Existing Cloud platform providers therefore seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. Next, we implement and evaluate our cloud computing indexing structure, based on a hyperbolic tree using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared with other cloud systems, show that our solution ensures consistency and scalability for the Cloud platform.
Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency
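Virtual coordinates in the hyperbolic plane are typically used for greedy routing: each node forwards a lookup to the neighbor closest to the target under the hyperbolic metric. The sketch below uses the standard Poincaré half-plane distance formula as an illustration; it is not this paper's specific embedding or routing protocol:

```python
import math

def hyperbolic_distance(p, q):
    """Distance between two virtual coordinates (x, y) with y > 0 in
    the Poincare half-plane model of the hyperbolic plane."""
    (x1, y1), (x2, y2) = p, q
    return math.acosh(1 + ((x2 - x1) ** 2 + (y2 - y1) ** 2) / (2 * y1 * y2))

def next_hop(neighbors, target):
    """Greedy routing: forward to the neighbor closest to the target."""
    return min(neighbors, key=lambda n: hyperbolic_distance(n, target))

hop = next_hop([(0.0, 1.0), (0.0, 2.0)], target=(0.0, math.e))
```

Because a tree embeds with low distortion in the hyperbolic plane, this greedy choice reliably makes progress toward the target, which is what makes hyperbolic virtual coordinates attractive for DHT-style indexing.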
Procedia PDF Downloads 425
5933 Control Flow around NACA 4415 Airfoil Using Slot and Injection
Authors: Imine Zakaria, Meftah Sidi Mohamed El Amine
Abstract:
One of the most vital aerodynamic organs of a flying machine is the wing, which allows it to fly efficiently. The flow around the wing is very sensitive to changes in the angle of attack. Beyond a certain value, the boundary layer separates on the upper surface, which causes instability and total degradation of aerodynamic performance, called a stall. Controlling the flow around an airfoil to improve its aerodynamic performance has therefore become a concern of researchers in the aeronautics field. There are two techniques for controlling the flow around a wing: passive and active control. Blowing and suction are among the active techniques that control the boundary layer separation around an airfoil. Their objective is to give energy to the air particles in the boundary layer separation zones and to create vortex structures that homogenize the velocity near the wall and allow control. Blowing and suction have long been used as flow control actuators around obstacles; in 1904, Prandtl applied continuous suction to a cylinder to delay boundary layer separation. In the present study, several numerical investigations have been developed to predict the turbulent flow around an aerodynamic profile. A CFD code was used at several angles of attack in order to validate the present work against the literature for the clean profile. The variation of the lift coefficient CL with the momentum coefficient
Keywords: CFD, control flow, lift, slot
Procedia PDF Downloads 197
5932 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve diabetes early detection and management by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Using the Diabetes Health Indicators Dataset from Kaggle as the research data, the phase relation values of each attribute were used to analyze and choose the attributes that might influence the subject's survival probability. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess their performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
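Four of the five evaluation metrics named above can be computed directly from a confusion matrix, as in this sketch (toy labels only; AUC-ROC additionally needs predicted scores rather than hard labels, so it is omitted):

```python
def metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return acc, prec, rec, f1

# Toy labels: 1 = diabetic, 0 = healthy (illustrative data only).
acc, prec, rec, f1 = metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In a screening setting, recall (catching true diabetics) is usually weighted more heavily than precision, which is why reporting all of these metrics rather than accuracy alone matters for an imbalanced health dataset.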
Procedia PDF Downloads 75
5931 A Nucleic Acid Extraction Method for High-Viscosity Floricultural Samples
Authors: Harunori Kawabe, Hideyuki Aoshima, Koji Murakami, Minoru Kawakami, Yuka Nakano, David D. Ordinario, C. W. Crawford, Iri Sato-Baran
Abstract:
With the recent advances in gene editing technologies allowing the rewriting of genetic sequences, additional market growth in the global floriculture market beyond previous trends is anticipated through increasingly sophisticated plant breeding techniques. As a prerequisite for gene editing, the gene sequence of the target plant must first be identified. This necessitates the genetic analysis of plants with unknown gene sequences, the extraction of RNA, and comprehensive expression analysis. Consequently, a technology capable of consistently and effectively extracting high-purity DNA and RNA from plants is of paramount importance. Although model plants, such as Arabidopsis and tobacco, have established methods for DNA and RNA extraction, floricultural species such as roses present unique challenges. Different techniques to extract DNA and RNA from various floricultural species were investigated. Upon sampling and grinding the petals of several floricultural species, it was observed that nucleic acid extraction from the ground petal solutions of low viscosity was straightforward; solutions of high viscosity presented a significant challenge. It is postulated that the presence of substantial quantities of polysaccharides and polyphenols in the plant tissue was responsible for the inhibition of nucleic acid extraction. Consequently, attempts were made to extract high-purity DNA and RNA by improving the CTAB method and combining it with commercially available nucleic acid extraction kits. The quality of the total extracted DNA and RNA was evaluated using standard methods. Finally, the effectiveness of the extraction method was assessed by determining whether it was possible to create a library that could be applied as a suitable template for a next-generation sequencer. In conclusion, a method was developed for consistent and accurate nucleic acid extraction from high-viscosity floricultural samples. 
These results demonstrate improved techniques for DNA and RNA extraction from flowers, help facilitate gene editing of floricultural species, and expand the boundaries of research and commercial opportunities.
Keywords: floriculture, gene editing, next-generation sequencing, nucleic acid extraction
Procedia PDF Downloads 29
5930 Multiobjective Optimization of a Pharmaceutical Formulation Using Regression Method
Authors: J. Satya Eswari, Ch. Venkateswarlu
Abstract:
The formulation of a commercial pharmaceutical product involves several composition factors and response characteristics. When the formulation must satisfy multiple conflicting response characteristics, an efficient multiobjective optimization technique is needed to find an optimal solution. In this work, a regression model is combined with non-dominated sorting differential evolution (NSDE) involving Naïve & Slow and ε-constraint techniques to derive different multiobjective optimization strategies, which are then evaluated by means of a trapidil pharmaceutical formulation. The analysis of the results shows the effectiveness of the strategy that combines the regression model and NSDE with the integration of both the Naïve & Slow and ε-constraint techniques for Pareto optimization of the trapidil formulation. With this strategy, the optimal formulation at pH = 6.8 is obtained with the decision variables of microcrystalline cellulose, hydroxypropyl methylcellulose, and compression pressure. The corresponding response characteristics of rate constant and release order are also reported. The comparison of these results with the experimental data and with those of other multiple-regression-based multiobjective evolutionary optimization strategies signifies the better performance of the proposed strategy for optimal trapidil formulation.
Keywords: pharmaceutical formulation, multiple regression model, response surface method, radial basis function network, differential evolution, multiobjective optimization
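The Pareto-optimization concept underlying non-dominated sorting can be sketched in a few lines: a candidate formulation survives only if no other candidate is at least as good in every objective and strictly better in one. The two objectives and the numbers below are illustrative, not the trapidil responses:

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective and
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep the points that no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Two conflicting objective values per candidate (illustrative numbers).
front = pareto_front([(1, 5), (2, 2), (4, 1), (3, 3)])
```

NSDE repeatedly applies this dominance test to rank a whole evolving population, and the ε-constraint technique instead turns all but one objective into bounds so a single-objective search can sweep out the same front.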
Procedia PDF Downloads 409
5929 Synthesis and Characterization of Functionalized Carbon Nanorods/Polystyrene Nanocomposites
Authors: M. A. Karakassides, M. Baikousi, A. Kouloumpis, D. Gournis
Abstract:
Nanocomposites of carbon nanorods (CNRs) with polystyrene (PS) have been synthesized successfully by means of an in situ polymerization process and characterized. Firstly, carbon nanorods with a graphitic structure were prepared by the standard synthetic procedure of CMK-3, using MCM-41 as template instead of SBA-15 and sucrose as the carbon source. In order to create an organophilic surface on the CNRs, two modification steps were carried out: surface chemical oxidation (CNRs-ox) according to Staudenmaier's method, and the attachment of octadecylamine molecules to the functional groups of CNRs-ox (CNRs-ODA). The nanocomposite materials of polystyrene with CNRs-ODA were prepared by a solution-precipitation method at three nanoadditive-to-polymer loadings (1, 3, and 5 wt. %). The as-derived nanocomposites were studied with a combination of characterization and analytical techniques. In particular, Fourier-transform infrared (FT-IR) and Raman spectroscopies were used for the chemical and structural characterization of the pristine materials and the derived nanocomposites, while the morphology of the nanocomposites and the dispersion of the carbon nanorods were analyzed by atomic force and scanning electron microscopy techniques. Tensile testing and thermogravimetric analysis (TGA), along with differential scanning calorimetry (DSC), were also used to examine the mechanical properties, thermal stability, and glass transition temperature of PS after the incorporation of the CNRs-ODA nanorods. The results showed that the thermal and mechanical properties of the PS/CNRs-ODA nanocomposites gradually improved with increasing CNRs-ODA loading.
Keywords: nanocomposites, polystyrene, carbon, nanorods
Procedia PDF Downloads 352
5928 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection
Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay
Abstract:
With the increase in credit card usage, the volume of credit card misuse has also increased significantly, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified, and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on some static characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged.
By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable, and scalable models of credit card fraud detection.
Keywords: credit card fraud detection, user authentication, behavioral biometrics, machine learning, literature survey
Procedia PDF Downloads 121
5927 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement
Authors: Hadi Ardiny, Amir Mohammad Beigzadeh
Abstract:
Through the combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of them equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through the correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems
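The fusion step can be sketched as correlating each robot's camera-derived track against the measured count-rate series under an inverse-square point-source model. This is an illustrative reconstruction of the idea, not the paper's algorithm; the geometry, track data, and counts are made up:

```python
def correlation(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def find_source(tracks, counts, detector=(0.0, 0.0)):
    """Pick the robot whose track best explains the count-rate series
    under a 1/r^2 point-source model."""
    def expected(path):
        return [1.0 / ((x - detector[0]) ** 2 + (y - detector[1]) ** 2)
                for x, y in path]
    return max(tracks,
               key=lambda r: correlation(expected(tracks[r]), counts))

# Robot A approaches the detector while B recedes; counts rise sharply.
source = find_source(
    {"A": [(4.0, 0.0), (2.0, 0.0), (1.0, 0.0)],
     "B": [(1.0, 0.0), (2.0, 0.0), (4.0, 0.0)]},
    counts=[10, 40, 160],
)
```

In practice attenuation, background, and detector response complicate the model, which is exactly what the MCNPX simulation accounts for before the experiment.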
Procedia PDF Downloads 123
5926 Bacterial Flora of the Anopheles fluviatilis s.l. in an Endemic Malaria Area in Southeastern Iran for Candidate Paratransgenesis Strains
Authors: Seyed Hassan Moosa-kazemi, Jalal Mohammadi Soleimani, Hassan Vatandoost, Mohammad Hassan Shirazi, Sara Hajikhani, Roonak Bakhtiari, Morteza Akbari, Siamak Hydarzadeh
Abstract:
Malaria is an infectious disease and is considered one of the most important health problems in the southeast of Iran. Iran is in the malaria elimination phase, and new tools are needed for vector control. Paratransgenesis is a new way to interrupt the life cycle of the malaria parasite. In this study, the microflora of the surface and gut of various stages of Anopheles fluviatilis James, one of the important malaria vectors, was studied using biochemical and molecular techniques during 2013-2014. Twelve bacterial species were found, including Providencia rettgeri, Morganella morganii, Enterobacter aerogenes, Pseudomonas oryzihabitans, Citrobacter braakii, Citrobacter freundii, Aeromonas hydrophila, Klebsiella oxytoca, Citrobacter koseri, Serratia fonticola, Enterobacter sakazakii, and Yersinia pseudotuberculosis. The species Alcaligenes faecalis, Providencia vermicola, and Enterobacter hormaechei were identified in various stages of the vector and confirmed by biochemical and molecular techniques. We found Providencia rettgeri to be a proper candidate for paratransgenesis.
Keywords: Anopheles fluviatilis, bacteria, malaria, paratransgenesis, southern Iran
Procedia PDF Downloads 491
5925 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms
Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee
Abstract:
Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites and prevent continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures have been developed in the last two decades. These methods, based on analytical approaches, are limited in dealing with complex systems, primarily because of their limitations in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristics and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA), and neural networks (NN), and have promisingly applied these methods to the field of structural identification.
Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems and, as opposed to classical gradient-based optimization techniques, make a global solution search possible. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect the fiber property variation of laminated composite plates from the micromechanical point of view. The finite element model is used to study free vibrations of laminated composite plates under fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only the first mode shapes of the structure as the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences
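The inverse-problem loop, propose a stiffness, run the forward model, compare with the measured frequency, evolve, can be sketched with a toy GA. The square-root frequency model below is a stand-in for the finite element analysis, and all parameters (population size, mutation scale, seed) are illustrative:

```python
import random

random.seed(0)

TARGET_FREQ = 2.0  # "measured" first-mode frequency

def model_freq(stiffness):
    # Stand-in for the FE model: frequency scales with the square root
    # of stiffness, so the true stiffness here is 4.0.
    return stiffness ** 0.5

def fitness(s):
    return abs(model_freq(s) - TARGET_FREQ)

def ga(pop_size=20, gens=60):
    pop = [random.uniform(0.1, 10.0) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)           # rank candidates by mismatch
        parents = pop[: pop_size // 2]  # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # blend crossover plus Gaussian mutation
            children.append((a + b) / 2 + random.gauss(0, 0.1))
        pop = parents + children
    return min(pop, key=fitness)

best = ga()  # converges near the true stiffness of 4.0
```

In the actual study, each fitness evaluation is a full ABAQUS free-vibration run and the unknown is a spatial stiffness distribution rather than a scalar, but the selection-crossover-mutation loop is the same.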
Procedia PDF Downloads 272
5924 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management
Authors: Chokri Slim
Abstract:
The objective of this research is to answer the following question: Is there a significant difference between the regression model and statistical learning models in predicting credit risk management? A Multiple Linear Regression (MLR) model was compared with statistical learning models, namely a Multi-Layer Perceptron (MLP) neural network and Support Vector Regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) market from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of the loan portfolios of banks and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines
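The comparison protocol can be sketched with scikit-learn on synthetic data; the bank-level predictors, the linear data-generating process, and the model settings below are assumptions for illustration, not the study's actual TSE dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(42)
n = 400
# Hypothetical bank-level predictors: capital ratio, GDP growth, interest rate
X = rng.normal(size=(n, 3))
# Synthetic target: a non-performing-loan ratio, mostly linear in the predictors
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "MLR": LinearRegression(),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                      random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(C=1.0)),
}
# Held-out R^2 is the comparison metric here
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, r2 in scores.items():
    print(f"{name}: test R^2 = {r2:.3f}")
```

On data that is mostly linear, as generated here, MLR is hard to beat; on the real loan-portfolio data the nonlinear learners may close or reverse the gap, which is exactly the question the study poses.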
Procedia PDF Downloads 150
5923 A Process FMEA in Aero Fuel Pump Manufacturing and Conduct the Corrective Actions
Authors: Zohre Soleymani, Meisam Amirzadeh
Abstract:
Many products are safety critical, so proactive analysis techniques are vital for them because these techniques try to identify potential failures before the products are produced. Failure Mode and Effects Analysis (FMEA) is an effective tool for identifying probable problems of a product or process, prioritizing them, and planning for their elimination. The paper shows the implementation of the FMEA process to identify and remove potential troubles of the aero fuel pump manufacturing process and improve the reliability of subsystems. The different possible causes of failure and their effects, along with the recommended actions, are discussed. FMEA uses the Risk Priority Number (RPN) to determine the risk level. The RPN value depends on the Severity (S), Occurrence (O), and Detection (D) parameters, so these parameters need to be determined. After calculating the RPN for the identified potential failure modes, corrective actions are defined to reduce the risk level according to the assessment strategy and the determined acceptable risk level. Then the FMEA process is performed again, and the revised RPN is calculated. The results are presented in the format of a case study. These results show the improvement in the manufacturing process and a considerable reduction in the risk level of aero fuel pump production.
Keywords: FMEA, risk priority number, aero pump, corrective action
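A minimal sketch of the RPN step, with made-up failure modes and S/O/D ratings for an aero fuel pump process (the acceptable risk level of 100 is likewise an assumed threshold, not the paper's):

```python
# Hypothetical failure modes; S, O, D are rated on the usual 1-10 scales.
failure_modes = [
    {"mode": "impeller imbalance",  "S": 8, "O": 5, "D": 4},
    {"mode": "seal leakage",        "S": 9, "O": 3, "D": 6},
    {"mode": "porosity in casting", "S": 6, "O": 4, "D": 3},
]

# RPN = Severity x Occurrence x Detection
for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Prioritize: highest RPN first; anything above the acceptable level needs action.
ACCEPTABLE_RPN = 100
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
for fm in ranked:
    flag = "corrective action required" if fm["RPN"] > ACCEPTABLE_RPN else "acceptable"
    print(f'{fm["mode"]}: RPN = {fm["RPN"]} ({flag})')
```

After corrective actions are applied, the same calculation is repeated with the revised S/O/D ratings to obtain the revised RPN.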
Procedia PDF Downloads 285
5922 Fuzzy Neuro Approach for Integrated Water Management System
Authors: Stuti Modi, Aditi Kambli
Abstract:
This paper addresses the need for an intelligent water management and distribution system in smart cities to ensure optimal consumption and distribution of water for drinking and sanitation purposes. Water, being a limited resource in cities, requires an effective system for collection, storage, and distribution. In this paper, applications of two of the most widely used types of data-driven models, namely artificial neural networks (ANN) and fuzzy logic-based models, to modelling in the water resources management field are considered. The objective of this paper is to review the principles of various types and architectures of neural networks and fuzzy adaptive systems and their applications to integrated water resources management. The final goal of the review is to expose and formulate progressive directions for their applicability and for further research on the application of AI-related and data-driven techniques, and to demonstrate the applicability of neural networks, fuzzy systems, and other machine learning techniques to the practical issues of regional water management. Apart from this, the paper deals with water storage, using ANN to find the optimum reservoir level and to predict peak daily demands.
Keywords: artificial neural networks, fuzzy systems, peak daily demand prediction, water management and distribution
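A minimal Mamdani-style fuzzy inference sketch for the water-distribution setting, written in plain Python; the membership functions, rule base, and the level/demand inputs are all illustrative assumptions rather than a calibrated controller:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

R_UNIVERSE = np.linspace(0, 100, 501)  # candidate release rates (%)

def release_rate(level, demand):
    """Fuzzy controller: reservoir level (%) and demand (%) -> release rate (%)."""
    # Fuzzify the inputs
    lvl_low, lvl_med, lvl_high = (tri(level, -1, 0, 50), tri(level, 0, 50, 100),
                                  tri(level, 50, 100, 101))
    dem_low, dem_high = tri(demand, -1, 0, 100), tri(demand, 0, 100, 101)
    # Output fuzzy sets over the release universe
    small = tri(R_UNIVERSE, -1, 0, 50)
    medium = tri(R_UNIVERSE, 0, 50, 100)
    large = tri(R_UNIVERSE, 50, 100, 101)
    # Rule base: min for AND, clip each consequent, aggregate with max
    agg = np.maximum.reduce([
        np.minimum(lvl_low, small),                 # low level -> small release
        np.minimum(lvl_high, large),                # high level -> large release
        np.minimum(min(lvl_med, dem_high), large),  # medium level, high demand -> large
        np.minimum(min(lvl_med, dem_low), medium),  # medium level, low demand -> medium
    ])
    # Centroid defuzzification
    return float(np.sum(R_UNIVERSE * agg) / (np.sum(agg) + 1e-9))

print(release_rate(20, 30), release_rate(90, 80))
```

In a fuzzy-neuro system of the kind reviewed here, the shapes of these membership functions would be tuned from data by a neural network rather than fixed by hand.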
Procedia PDF Downloads 186
5921 Modelling and Simulation of Cascaded H-Bridge Multilevel Single Source Inverter Using PSIM
Authors: Gaddafi Sani Shehu, Tankut Yalcınoz, Abdullahi Bala Kunya
Abstract:
Multilevel inverters such as flying capacitor, diode-clamped, and cascaded H-bridge inverters are very popular, particularly in medium- and high-power applications. This paper focuses on a cascaded H-bridge module using a single direct current (DC) source in order to generate an 11-level output voltage. The novel approach reduces the number of switches and gate drivers in comparison with a conventional method. The proposed topology produces more accurate results with an isolation transformer at high switching frequency. Different modulation techniques can be used for the multilevel inverter, but this work features the modulation technique known as selective harmonic elimination (SHE). This modulation approach reduces the number of carriers, with a reduction in switching losses and Total Harmonic Distortion (THD), thereby increasing Power Quality (PQ). Based on the simulation results obtained, SHE is able to eliminate the selected harmonics by notching the output waveform while controlling the fundamental component. The performance evaluation of the proposed cascaded multilevel inverter is performed using the PSIM simulation package, and a THD of 0.94% is obtained.
Keywords: cascaded H-bridge multilevel inverter, power quality, selective harmonic elimination
Procedia PDF Downloads 418
5920 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze, and propose alternative design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and location of debris in real time. In this paper, an attempt is made to improve the existing methods of flight data recording techniques and to improve upon design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern-day communication and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the attempt made in this paper to develop a failsafe recording technique is feasible. Furthermore, data fusion/data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
5919 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques for human and outer-shape detection. Next, we extract valuable information in terms of cues. We extract two distinct features: fuzzy local binary patterns and sequence representation. Then, we applied a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification, we used a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH multi-view football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% over the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
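The classification stage can be sketched with a random forest on stand-in features; the synthetic dataset below merely plays the role of the extracted fuzzy-LBP/sequence descriptors and is not the AAMAZ or KTH data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in feature matrix: rows are samples, columns play the role of the
# fuzzy-LBP / sequence descriptors; labels are action classes.
X, y = make_classification(n_samples=600, n_features=40, n_informative=12,
                           n_classes=4, n_clusters_per_class=1, random_state=7)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=7)

clf = RandomForestClassifier(n_estimators=200, random_state=7)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"recognition rate: {acc:.3f}")
```

In the paper's pipeline, the feature matrix would come from the GRASP-reduced descriptor set rather than a synthetic generator; the classifier itself is used the same way.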
Procedia PDF Downloads 36
5918 Design of Transmit Beamspace and DOA Estimation in MIMO Radar
Authors: S. Ilakkiya, A. Merline
Abstract:
Multiple-input multiple-output (MIMO) radar systems use modulated waveforms and directive antennas to transmit electromagnetic energy into a specific volume in space to search for targets. This paper deals with the design of the transmit beamspace matrix and DOA estimation for MIMO radar with collocated antennas. The design of the transmit beamspace matrix is based on minimizing the difference between a desired transmit beampattern and the actual one while enforcing the constraint of uniform power distribution across the transmit array elements. The rotational invariance property is established at the transmit array by imposing a specific structure on the beamspace matrix. Semidefinite programming and spatial-division based design (SDD) are also designed separately. In MIMO radar systems, DOA estimation is an essential process to determine the direction of incoming signals and thus to direct the beam of the antenna array towards the estimated direction. This estimation deals with non-adaptive and adaptive spectral estimation techniques. The design of the transmit beamspace matrix and the spectral estimation techniques are studied through simulation.
Keywords: adaptive and non-adaptive spectral estimation, direction of arrival estimation, MIMO radar, rotational invariance property, transmit, receive beamforming
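The rotational invariance property mentioned above is the basis of ESPRIT-style DOA estimation. A hedged numerical sketch for a uniform linear array follows; the array size, element spacing, source angles, and noise level are illustrative assumptions, not the paper's simulation setup:

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 8, 400  # sensors in the ULA, snapshots
d = 0.5        # element spacing in wavelengths
true_doas = np.deg2rad([-20.0, 30.0])

# Array manifold and simulated received data (two uncorrelated sources + noise)
A = np.exp(2j * np.pi * d * np.outer(np.arange(M), np.sin(true_doas)))
S = (rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))) / np.sqrt(2)
noise = 0.05 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))
X = A @ S + noise

# Signal subspace from the sample covariance matrix
R = X @ X.conj().T / N
eigval, eigvec = np.linalg.eigh(R)
Us = eigvec[:, -2:]  # eigenvectors of the two largest eigenvalues

# Rotational invariance between the two overlapping subarrays gives the DOAs
Phi = np.linalg.pinv(Us[:-1]) @ Us[1:]
phases = np.angle(np.linalg.eigvals(Phi))
est_doas = np.sort(np.rad2deg(np.arcsin(phases / (2 * np.pi * d))))
print("estimated DOAs (deg):", est_doas.round(2))
```

The same subspace idea applies at the transmit side once the beamspace matrix is structured to preserve rotational invariance, which is the design constraint the abstract describes.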
Procedia PDF Downloads 519
5917 Combining Corpus Linguistics and Critical Discourse Analysis to Study Power Relations in Hindi Newspapers
Authors: Vandana Mishra, Niladri Sekhar Dash, Jayshree Charkraborty
Abstract:
This paper focuses on the application of corpus linguistics techniques for the critical discourse analysis (CDA) of Hindi newspapers. While corpus linguistics is the study of language as expressed in corpora (samples) of 'real world' text, CDA is an interdisciplinary approach to the study of discourse that views language as a form of social practice. CDA has mainly been studied from a qualitative perspective. However, recent studies have begun combining corpus linguistics with CDA in analyzing large volumes of text for the study of existing power relations in society. The corpus under our study is also of a sizable amount (1 million words of Hindi newspaper texts), and its analysis requires an alternative analytical procedure. So, we have combined the quantitative approach, i.e., the use of corpus techniques, with CDA's traditional qualitative analysis. In this context, we have focused on keyword analysis, sorting concordance lines of the selected keywords, and calculating collocates of the keywords. We have made use of the WordSmith Tools package for all these analyses. The analysis starts with identifying the keywords in the political news corpus when compared with the main news corpus. The keywords are extracted from the corpus based on their keyness, calculated through statistical tests like the chi-squared test and the log-likelihood test on the frequent words of the corpus. Some of the top occurring keywords are मोदी (Modi), भाजपा (BJP), कांग्रेस (Congress), सरकार (Government) and पार्टी (Political party). This is followed by the concordance analysis of these keywords, which generates thousands of lines, of which we select a few and examine them based on our objective. We have also calculated the collocates of the keywords based on their Mutual Information (MI) score. Both concordance and collocation help to identify lexical patterns in the political texts. 
Finally, all these quantitative results derived from the corpus techniques are interpreted qualitatively in accordance with CDA theory to examine the ways in which political news discourse produces social and political inequality, power abuse, or domination.
Keywords: critical discourse analysis, corpus linguistics, Hindi newspapers, power relations
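The log-likelihood keyness computation described above can be sketched directly; the word counts below are made up for illustration, and only the 1-million-word target corpus size comes from the abstract:

```python
import math

def log_likelihood(freq_target, size_target, freq_ref, size_ref):
    """Dunning's log-likelihood keyness of a word in a target corpus
    compared against a reference corpus."""
    expected_t = size_target * (freq_target + freq_ref) / (size_target + size_ref)
    expected_r = size_ref * (freq_target + freq_ref) / (size_target + size_ref)
    ll = 0.0
    if freq_target > 0:
        ll += freq_target * math.log(freq_target / expected_t)
    if freq_ref > 0:
        ll += freq_ref * math.log(freq_ref / expected_r)
    return 2 * ll

# Illustrative counts: a word much more frequent in the 1M-word political
# subcorpus than in a 10M-word reference corpus, versus a word whose relative
# frequency is identical in both (the numbers are invented).
ll_key = log_likelihood(500, 1_000_000, 800, 10_000_000)
ll_even = log_likelihood(100, 1_000_000, 1_000, 10_000_000)
print(f"keyness: {ll_key:.1f} vs {ll_even:.1f}")
# A value above 3.84 is significant at p < 0.05 (chi-squared, 1 d.f.).
```

Words are then ranked by this keyness score, which is the quantity behind the keyword lists that tools such as WordSmith report.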
Procedia PDF Downloads 224
5916 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry with them a large volume of data that is often not used to inform the final customer. Some of these data, if fittingly identified and used, can enhance the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, buying models have changed: the customer is attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability, and food information needs. The Internet of Things (IoT) and Analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies on a traceability path, starting from business model analysis and the related business processes, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived. Studying each process task and leveraging modeling techniques lead to identifying the information held by different actors along the agri-food supply chain. IoT technologies for data collection and Analytics techniques for data processing supply information useful to increase intra-company efficiency and competitiveness in the market. 
All the information recovered can be shown through IT solutions and a mobile application to make it accessible to the company, the entire supply chain, and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
Procedia PDF Downloads 147
5915 Monitoring Blood Pressure Using Regression Techniques
Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim
Abstract:
Blood pressure gives physicians deep insight into the cardiovascular system. The determination of individual blood pressure is a standard clinical procedure for cardiovascular system problems. The conventional techniques to measure blood pressure (e.g., the cuff method) allow only a limited number of readings for a certain period (e.g., every 5-10 minutes). Additionally, these systems cause turbulence to blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or critically ill persons. In this paper, the most important statistical features in the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features that have the highest correlation with blood pressure. The results show that the mean of the stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model representing both features for Systolic Blood Pressure (SBP) and Diastolic Blood Pressure (DBP) was obtained using a statistical regression technique. Surface fitting is used to best fit the series of data, and the results show that the error in estimating the SBP is 4.95% and in estimating the DBP is 3.99%.
Keywords: blood pressure, noninvasive optical system, principal component analysis, PCA, continuous monitoring
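A sketch of the PCA-plus-regression pipeline on synthetic features; the feature distributions, regression coefficients, and noise level are assumptions, with only the idea of 12 features, of which the stiffness-index mean and beat-to-beat HR standard deviation are informative, taken from the abstract:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 200
# Synthetic stand-ins for the 12 PPG features: the first two (stiffness-index
# mean, beat-to-beat HR std) drive the simulated SBP; the rest are low-variance noise.
stiffness = rng.normal(10.0, 2.0, n)
hr_std = rng.normal(5.0, 1.0, n)
nuisance = rng.normal(0.0, 0.5, size=(n, 10))
X = np.column_stack([stiffness, hr_std, nuisance])
sbp = 90.0 + 2.5 * stiffness + 1.5 * hr_std + rng.normal(0.0, 2.0, n)  # mmHg

# PCA keeps the high-variance directions (here, the informative features),
# and a linear model is then fitted on the retained components.
model = make_pipeline(PCA(n_components=4), LinearRegression())
model.fit(X, sbp)
pred = model.predict(X)
mape = float(np.mean(np.abs(pred - sbp) / sbp) * 100)
print(f"SBP estimation error: {mape:.2f}%")
```

The features are deliberately left unstandardized so that the informative, higher-variance features dominate the leading components; the reported percentage error plays the role of the paper's 4.95% SBP figure without reproducing it.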
Procedia PDF Downloads 161
5914 Statistical Discrimination of Blue Ballpoint Pen Inks by Diamond Attenuated Total Reflectance (ATR) FTIR
Authors: Mohamed Izzharif Abdul Halim, Niamh Nic Daeid
Abstract:
Determining the source of pen inks used on a variety of documents is important for forensic document examiners. The examination of inks is often performed to differentiate between inks in order to evaluate the authenticity of a document. A ballpoint pen ink consists of synthetic dyes (acidic and/or basic), pigments (organic and/or inorganic), and a range of additives. Inks of similar color may have different compositions and are frequently the subject of forensic examinations. This study emphasizes blue ballpoint pen inks available in the market, because approximately 80% of questioned-document analyses reportedly involve ballpoint pen ink. Analytical techniques such as thin layer chromatography, high-performance liquid chromatography, UV-vis spectroscopy, luminescence spectroscopy, and infrared spectroscopy have been used in the analysis of ink samples. In this study, the application of Diamond Attenuated Total Reflectance (ATR) FTIR is straightforward and preferable in forensic science, as it requires no sample preparation and minimal analysis time. The data obtained from these techniques were further analyzed using multivariate chemometric methods, which enable the extraction of more information based on the similarities and differences among samples in a dataset. It was found that some pens from the same manufacturer can be similar in composition; however, discrete types can be significantly different.
Keywords: ATR FTIR, ballpoint, multivariate chemometric, PCA
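The multivariate chemometric step can be illustrated with hierarchical clustering of synthetic spectra; the band positions, noise level, and the two hypothetical ink formulations below are invented for the sketch and are not measured ATR-FTIR data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)
wavenumbers = np.linspace(650, 4000, 300)

def spectrum(peaks, noise=0.01):
    """Toy ATR-FTIR spectrum: Gaussian bands at given (center, height) pairs."""
    s = sum(h * np.exp(-((wavenumbers - c) / 30.0) ** 2) for c, h in peaks)
    return s + rng.normal(0, noise, wavenumbers.size)

# Two hypothetical ink formulations, five replicate pens each
brand_a = [spectrum([(1580, 1.0), (1170, 0.6)]) for _ in range(5)]
brand_b = [spectrum([(1720, 0.9), (1365, 0.7)]) for _ in range(5)]
X = np.vstack(brand_a + brand_b)

# Agglomerative (Ward) clustering on the spectral distances
Z = linkage(pdist(X), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster assignments:", labels)
```

With real spectra, the same distance-based grouping (or a PCA scores plot, as in the study's keywords) is what reveals which pens share a composition and which are discriminable.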
Procedia PDF Downloads 457
5913 Electromagnetically-Vibrated Solid-Phase Microextraction for Organic Compounds
Authors: Soo Hyung Park, Seong Beom Kim, Wontae Lee, Jin Chul Joo, Jungmin Lee, Jongsoo Choi
Abstract:
A newly-developed electromagnetically vibrated solid-phase microextraction (SPME) device for extracting nonpolar organic compounds from aqueous matrices was evaluated in terms of sorption equilibrium time, precision, and detection level relative to three other more conventional extraction techniques involving SPME, viz., static, magnetic stirring, and fiber insertion/retraction. Electromagnetic vibration at 300~420 cycles/s was found to be the most efficient extraction technique in terms of reducing the sorption equilibrium time and enhancing both precision and linearity. The increased efficiency of electromagnetic vibration was attributed to a greater reduction in the thickness of the stagnant-water layer, which facilitated more rapid mass transport from the aqueous matrix to the SPME fiber. Electromagnetic vibration below 500 cycles/s also did not detrimentally impact the sustainability of the extraction performance of the SPME fiber. Therefore, electromagnetically vibrated SPME may be a more powerful tool for rapid sampling and solvent-free sample preparation relative to other more conventional extraction techniques used with SPME.
Keywords: electromagnetic vibration, organic compounds, precision, solid-phase microextraction (SPME), sorption equilibrium time
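The stagnant-layer explanation lends itself to a small first-order uptake model; the rate constant and layer thicknesses below are invented for illustration, with the only claim carried over from the abstract being that a thinner boundary layer shortens the equilibration time:

```python
import numpy as np

def equilibration_time(delta_um, k=0.5):
    """t95 for a first-order boundary-layer uptake model,
    m(t) = m_eq * (1 - exp(-t / tau)), with tau proportional to the
    stagnant-layer thickness delta (k is an invented rate constant)."""
    tau = k * delta_um                 # minutes
    return float(-tau * np.log(0.05))  # solve 1 - exp(-t / tau) = 0.95

# Invented stagnant-layer thicknesses for the three agitation regimes
for label, delta in [("static", 100.0), ("stirred", 40.0), ("vibrated", 10.0)]:
    t95 = equilibration_time(delta)
    print(f"{label:8s} layer {delta:5.1f} um -> t95 = {t95:6.1f} min")
```

Under this model the time to 95% of equilibrium scales linearly with the boundary-layer thickness, which is the qualitative mechanism the study invokes for the vibrated fiber's faster equilibration.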
Procedia PDF Downloads 254
5912 Mental Health Diagnosis through Machine Learning Approaches
Authors: Md Rafiqul Islam, Ashir Ahmed, Anwaar Ulhaq, Abu Raihan M. Kamal, Yuan Miao, Hua Wang
Abstract:
The mental health of people is as important as their physical health. Mental health and well-being are influenced not only by individual attributes but also by the social circumstances in which people find themselves and the environment in which they live. Like physical health, there are a number of internal and external factors, such as biological, social, and occupational factors, that can influence the mental health of people. People living in poverty, those suffering from chronic health conditions, minority groups, and those exposed to or displaced by war or conflict are generally more likely to develop mental health conditions. However, to the authors' best knowledge, there is a dearth of knowledge on the impact of the workplace (especially the highly stressed IT/tech workplace) on the mental health of its workers. This study attempts to examine the factors influencing the mental health of tech workers. A publicly available dataset containing more than 65,000 cells and 100 attributes is examined for this purpose. A number of machine learning techniques, such as Decision Tree, K-Nearest Neighbor, Support Vector Machine, and Ensemble methods, are then applied to the selected dataset to draw the findings. It is anticipated that the analysis reported in this study will contribute useful insights into the attributes contributing to the mental health of tech workers using the relevant machine learning techniques.
Keywords: mental disorder, diagnosis, occupational stress, IT workplace
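The model comparison described can be sketched with scikit-learn; the synthetic dataset stands in for the survey data (which is not reproduced here), and the hyperparameters are illustrative defaults:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the survey data: 20 encoded attributes and a binary
# label (e.g., "sought treatment"); the real dataset is not reproduced here.
X, y = make_classification(n_samples=800, n_features=20, n_informative=6,
                           random_state=11)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=11),
    "K nearest neighbor": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "Support Vector Machine": make_pipeline(StandardScaler(), SVC()),
    "Ensemble (random forest)": RandomForestClassifier(n_estimators=200,
                                                       random_state=11),
}
cv_acc = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in models.items()}
for name, acc in cv_acc.items():
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```

Cross-validated accuracy is one reasonable comparison metric; on the real, partly categorical survey attributes, an encoding step would precede the pipelines above.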
Procedia PDF Downloads 288