Search results for: intelligent techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7413

6153 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets is commonly termed "big data." Big data mining and analysis are extremely helpful for business activities such as decision making, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data raises special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations, and these unique problems call for new computational and statistical paradigms. This paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organisations face several difficulties when undertaking data mining, which has an impact on their decision making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analysed. The paper presents the ideas of data mining, data analysis, and recently developed knowledge discovery techniques, together with practical application systems. The conclusion also lists open issues and difficulties for further research in the area, and the paper discusses the main big data and data mining challenges facing management.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 110
6152 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of enterprises is changing rapidly, driven mainly by global competition, cost reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are used in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining has only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable useful analyses of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of the production scheduling software system with process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally the goodness of the production schedule is evaluated. With the proposed approach, the quality of production schedules of manufacturing enterprises can be improved.
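A minimal sketch of the event-log analysis idea described above: given operation records exported by a scheduling system, compute workstation utilization and flag the most heavily loaded workstation as a bottleneck candidate. The log fields and timestamps are illustrative assumptions, not the paper's actual data model, and richer process mining steps (discovery, conformance checking) are not shown.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event log: one record per operation, with the executing workstation.
event_log = [
    {"order": "O-001", "workstation": "WS-1", "start": "2024-03-01 08:00", "end": "2024-03-01 09:30"},
    {"order": "O-002", "workstation": "WS-1", "start": "2024-03-01 09:45", "end": "2024-03-01 11:00"},
    {"order": "O-001", "workstation": "WS-2", "start": "2024-03-01 09:40", "end": "2024-03-01 12:00"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

horizon_start = min(parse(e["start"]) for e in event_log)
horizon_end = max(parse(e["end"]) for e in event_log)
horizon_h = (horizon_end - horizon_start).total_seconds() / 3600.0

busy = defaultdict(float)                     # processing hours per workstation
for e in event_log:
    busy[e["workstation"]] += (parse(e["end"]) - parse(e["start"])).total_seconds() / 3600.0

utilization = {ws: hours / horizon_h for ws, hours in busy.items()}
bottleneck = max(utilization, key=utilization.get)   # highest-loaded workstation

print("Utilization:", utilization)
print("Candidate bottleneck:", bottleneck)
```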

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 279
6151 Reproduction of New Media Art Village around NTUT: Heterotopia of Visual Culture Art Education

Authors: Yu Cheng-Yu

Abstract:

‘Heterotopia’, ‘visual culture art education’, and ‘new media’ seem at first to be three unrelated subjects; in fact, there is synchronicity and intertextuality among them. Beyond visual culture, art education gives students the ability to reflect on popular culture images through visual culture teaching strategies in school. We should get involved in the community to construct a learning environment that conveys visual culture art. This thesis probes the heterogeneity of space and value through Michel Foucault and investigates a sustainable development strategy for the heterogeneity of a ‘New Media Art Village’ through Jean Baudrillard, Marshall McLuhan's media culture theory, and social construction ideology. It is possible to find a new media group that can convey visual culture art education around the National Taipei University of Technology in this commercial district, which combines intelligent technology, fashion, media, entertainment, art education, and marketing networks. With the engagement of big data and digital media, the imagination and innovation of the ‘New Media Art Village’ can become implementable as a new media heterotopia of inter-subjectivity. Visual culture art education will also bring aesthetics into the community through the New Media Art Village.

Keywords: social construction, heterogeneity, new media, big data, visual culture art education

Procedia PDF Downloads 248
6150 Improving the Security of Internet of Things Using Encryption Algorithms

Authors: Amirhossein Safi

Abstract:

The Internet of Things (IoT) is an advanced information technology that has drawn society's attention. Sensors and actuators are usually recognized as the smart devices of our environment. At the same time, IoT security raises new issues: Internet connectivity and the possibility of interacting with smart devices cause those devices to become more involved in human life, so safety is a fundamental requirement in designing IoT. IoT has three remarkable features: overall perception, reliable transmission, and intelligent processing. Because of the span of IoT, securing the data it conveys is an essential factor in system security. Hybrid encryption is a model that can be used in IoT: it provides strong security with low computation. In this paper, we propose a hybrid encryption algorithm designed to reduce security risks while increasing encryption speed and lowering computational complexity. The purpose of this hybrid algorithm is to provide information integrity, confidentiality, and non-repudiation in data exchange for IoT. Finally, the suggested encryption algorithm was simulated in MATLAB, and its speed and security efficiency were evaluated in comparison with a conventional encryption algorithm.
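For reference, a generic hybrid-encryption sketch in Python (RSA key wrapping plus AES-GCM payload encryption, which also provides integrity via authentication). This is not the authors' MATLAB algorithm, only an illustration of the hybrid idea the abstract describes; key sizes and the sample payload are assumptions.

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's long-term asymmetric key pair
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt(plaintext: bytes):
    session_key = AESGCM.generate_key(bit_length=128)     # fast symmetric part
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
    wrapped_key = public_key.encrypt(session_key, oaep)   # slow asymmetric part
    return wrapped_key, nonce, ciphertext

def decrypt(wrapped_key, nonce, ciphertext):
    session_key = private_key.decrypt(wrapped_key, oaep)
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

wrapped, nonce, ct = encrypt(b"temperature=21.7C")        # sample IoT reading
assert decrypt(wrapped, nonce, ct) == b"temperature=21.7C"
```

Non-repudiation would additionally require digital signatures, which are not shown here.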

Keywords: internet of things, security, hybrid algorithm, privacy

Procedia PDF Downloads 467
6149 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks

Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam

Abstract:

In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data are available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to analyze multimodal data, i.e., both structured and unstructured information, offers significant advantages in terms of time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material offers a promising way to apply CNNs to multimodal datasets, as they often occur in a medical context. Using suitable preprocessing techniques, structured data are transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information, and the resulting image is analyzed using a CNN.
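A minimal sketch of the tabular-to-image idea: scale a feature vector, tile it into a 2D "feature image", and stack it as an extra channel next to an existing grayscale image before feeding a CNN. The image size, tiling scheme, and random placeholder data are assumptions, not the paper's preprocessing.

```python
import numpy as np

def tabular_to_channel(features: np.ndarray, size: int = 32) -> np.ndarray:
    # Min-max scale, then repeat the feature vector to fill a size x size grid
    span = features.max() - features.min() + 1e-9
    f = (features - features.min()) / span
    reps = np.tile(f, size * size // len(f) + 1)[: size * size]
    return reps.reshape(size, size).astype(np.float32)

rng = np.random.default_rng(0)
image = rng.random((32, 32), dtype=np.float32)       # existing visual modality
tabular = rng.random(10).astype(np.float32)          # structured record

fused = np.stack([image, tabular_to_channel(tabular)], axis=-1)   # shape (32, 32, 2)
print(fused.shape)   # ready as multi-channel CNN input
```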

Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion

Procedia PDF Downloads 123
6148 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine

Authors: Hira Lal Gope, Hidekazu Fukai

Abstract:

The aim of this study is to develop a system that can identify and sort peaberries automatically at low cost for coffee producers in developing countries. The focus of this paper is the classification of peaberries and normal coffee beans using image processing and machine learning techniques. A peaberry is not a defective bean, but it is not a normal bean either: it is a single, relatively round seed that develops alone inside a coffee cherry instead of the usual flat-sided pair of beans, and it has a different value and flavor. To improve the taste of the coffee, peaberries and normal beans must be separated before the green beans are roasted; otherwise, the flavors mix and the overall taste suffers. During roasting, the beans should be uniform in shape, size, and weight, or larger beans will take longer to roast through. Peaberries differ in size and shape even though they have the same weight as normal beans, and they roast more slowly, so neither size- nor weight-based sorting provides a good way to select them. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually, but picking out peaberries is very difficult even for trained specialists because their shape and color are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate normal beans and peaberries as part of a sorting system. As a first step, we applied deep convolutional neural networks (CNN) and support vector machines (SVM) to discriminate peaberries from normal beans. Better performance was obtained with the CNN than with the SVM. The artificial neural network trained on a high-performance CPU and GPU in this work will then be installed on an inexpensive, computationally limited Raspberry Pi system, which we assume will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
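A sketch of the two classifiers the abstract compares, run on placeholder arrays (real green-bean images and labels would replace the random data). The layer sizes and image resolution are illustrative assumptions, not the authors' architecture.

```python
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 64, 64, 1), dtype=np.float32)   # grayscale bean images
y = rng.integers(0, 2, 200)                          # 0 = normal, 1 = peaberry

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# SVM baseline on flattened pixel intensities
svm = SVC(kernel="rbf").fit(X_tr.reshape(len(X_tr), -1), y_tr)
print("SVM accuracy:", svm.score(X_te.reshape(len(X_te), -1), y_te))

# Small CNN
cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation="softmax"),
])
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
cnn.fit(X_tr, y_tr, epochs=3, verbose=0)
print("CNN accuracy:", cnn.evaluate(X_te, y_te, verbose=0)[1])
```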

Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine

Procedia PDF Downloads 144
6147 Study of Two MPPTs for Photovoltaic Systems Using Controllers Based in Fuzzy Logic and Sliding Mode

Authors: N. Ould cherchali, M. S. Boucherit, L. Barazane, A. Morsli

Abstract:

Photovoltaic power is widely used to supply isolated or unpopulated areas (lighting, pumping, etc.). Its great advantages are that the source is inexhaustible, safe to use, and clean. However, the dynamic models used to describe a photovoltaic system are complicated and nonlinear, and because of the nonlinear I-V and P-V characteristics of photovoltaic generators, a maximum power point tracking (MPPT) technique is required to maximize the output power. In this paper, two online maximum power point tracking techniques using robust controllers are proposed for photovoltaic systems: the first uses a fuzzy logic controller (FLC) and the second a sliding mode controller (SMC). Both MPPT controllers receive the partial derivative of power as input, and their output is the duty cycle corresponding to maximum power. A photovoltaic generator with a boost converter is developed in MATLAB/Simulink to verify the performance of the proposed techniques. The SMC technique provides good tracking speed under rapidly changing irradiation, while when the irradiation changes slowly or is constant, the panel power obtained with the FLC technique is a much smoother signal with fewer fluctuations.
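A highly simplified sketch of an MPPT duty-cycle update driven by the sign of dP/dV, the quantity both controllers receive as input. A real FLC or SMC shapes this correction with fuzzy rules or a sliding surface; the toy panel model, converter relation, and gain below are illustrative assumptions only.

```python
def pv_power(v, irradiance=1000.0):
    # Toy single-peak P-V curve (placeholder for a real panel model)
    return irradiance * 0.03 * v * max(0.0, 1.0 - (v / 36.0) ** 4)

duty, step = 0.5, 0.01
v_prev, p_prev = 20.0, pv_power(20.0)
for _ in range(100):
    v = 36.0 * (1.0 - duty)                  # crude boost-converter voltage relation
    p = pv_power(v)
    dv, dp = v - v_prev, p - p_prev
    slope = dp / dv if dv != 0 else 0.0      # the dP/dV input of the FLC / SMC
    duty += -step if slope > 0 else step     # raise voltage while left of the MPP
    duty = min(max(duty, 0.05), 0.95)
    v_prev, p_prev = v, p
print(f"duty ~ {duty:.2f}, power ~ {p_prev:.1f} W")
```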

Keywords: fuzzy logic controller, maximum power point, photovoltaic system, tracker, sliding mode controller

Procedia PDF Downloads 547
6146 Comparing Image Processing and AI Techniques for Disease Detection in Plants

Authors: Luiz Daniel Garay Trindade, Antonio De Freitas Valle Neto, Fabio Paulo Basso, Elder De Macedo Rodrigues, Maicon Bernardino, Daniel Welfer, Daniel Muller

Abstract:

Agriculture plays an important role in society since it is one of the main sources of food in the world. To support the production and yield of crops, precision agriculture makes use of technologies aimed at improving the productivity and quality of agricultural commodities. One of the problems hampering the quality of agricultural production is disease affecting crops; failure to detect diseases within a short period of time can result in small or large losses in production, causing financial damage to farmers. In order to map the contributions devoted to the early detection of plant diseases and compare the accuracy of the selected studies, a systematic literature review was performed, covering techniques based on digital image processing and neural networks. We found 35 relevant tool-support alternatives for detecting disease in 19 plants. Our comparison of these studies resulted in an overall average accuracy of 87.45%, with two studies coming very close to 100%.

Keywords: pattern recognition, image processing, deep learning, precision agriculture, smart farming, agricultural automation

Procedia PDF Downloads 379
6145 A NoSQL Based Approach for Real-Time Managing of Robotics's Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data and the new data management solutions that have emerged to handle it: NoSQL databases. They have spread across several areas such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication, and fraud detection. The adoption of these database management systems is increasing: they store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. Intelligent machines, for example in the e-learning sector, thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. Implementing NoSQL for robotics wrestles all the data robots acquire into usable form, because with ordinary data management for robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated through experimental studies and a running example used as a use case.
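A minimal sketch of storing and querying robot telemetry in a document store with pymongo. The database and collection names, document fields, and index are assumptions for illustration; any NoSQL store could play this role.

```python
from datetime import datetime, timezone
from pymongo import MongoClient, DESCENDING

client = MongoClient("mongodb://localhost:27017")
telemetry = client["robotics"]["telemetry"]
telemetry.create_index([("robot_id", 1), ("ts", DESCENDING)])  # fast real-time lookups

# Schemaless insert: each robot can report a different sensor payload
telemetry.insert_one({
    "robot_id": "arm-07",
    "ts": datetime.now(timezone.utc),
    "sensors": {"joint_temp_c": 41.2, "torque_nm": 3.8},
})

# Latest reading for one robot, retrieved without joins or a fixed schema
latest = telemetry.find_one({"robot_id": "arm-07"}, sort=[("ts", DESCENDING)])
print(latest["sensors"])
```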

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 354
6144 Process for Separating and Recovering Materials from Kerf Slurry Waste

Authors: Tarik Ouslimane, Abdenour Lami, Salaheddine Aoudj, Mouna Hecini, Ouahiba Bouchelaghem, Nadjib Drouiche

Abstract:

Slurry waste is a byproduct generated from the slicing of multi-crystalline silicon ingots. This waste can be used as a secondary resource to recover high-purity silicon, which has great economic value. From a management perspective, the ever-increasing generation of kerf slurry waste leads to significant challenges for the photovoltaic industry, due to the currently low use of slurry waste for silicon recovery. Slurry waste, in most cases, contains silicon, silicon carbide, metal fragments, and a mineral-oil-based or glycol-based slurry vehicle. As a result of the global scarcity of high-purity silicon supply, the high-purity silicon content in slurry has increasingly attracted research interest. This paper presents a critical overview of the current techniques employed for high-purity silicon recovery from kerf slurry waste. Hydrometallurgy remains a continuing subject of study and research; however, this review also introduces several new techniques for the recovery of high-purity silicon from slurry waste. The purpose of the information presented is to support the development of a clean and effective recovery process for high-purity silicon from slurry waste.

Keywords: kerf-loss, slurry waste, silicon carbide, silicon recovery, photovoltaic, high purity silicon, polyethylene glycol

Procedia PDF Downloads 310
6143 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution

Authors: Telesphore Tiendrebeogo, Oumarou Sié

Abstract:

Cloud computing technologies have attracted considerable interest in recent years and have become more important for many existing database applications. They provide a new mode of use and of provision of IT resources in general; such resources can be used "on demand" by anybody who has access to the Internet. In particular, cloud platforms provide an easy-to-use interface between providers and users, allowing providers to develop and deliver software and databases to users across locations. Currently, many cloud platform providers support large-scale database services; however, most of them only support simple keyword-based queries and cannot answer complex queries efficiently due to the lack of efficient multi-attribute indexing techniques. Existing cloud platform providers therefore seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. We then implement and evaluate our cloud indexing structure, which is based on a hyperbolic tree using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared with other cloud systems, show that our solution ensures consistency and scalability for the cloud platform.
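A sketch of greedy routing with virtual coordinates in the Poincaré disk, the kind of hyperbolic-plane addressing the abstract refers to: each node forwards a request to the neighbor closest (in hyperbolic distance) to the key's coordinate. The topology, coordinates, and ownership rule are made up for illustration and are not the paper's system.

```python
import math

def hyperbolic_distance(u, v):
    # Poincare disk metric: d = arcosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))
    du = (u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2
    nu, nv = u[0] ** 2 + u[1] ** 2, v[0] ** 2 + v[1] ** 2
    return math.acosh(1 + 2 * du / ((1 - nu) * (1 - nv)))

coords = {"A": (0.0, 0.0), "B": (0.5, 0.1), "C": (0.7, 0.5), "D": (0.2, 0.8)}
neighbors = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["A", "C"]}

def greedy_route(start, key_coord):
    path, current = [start], start
    while True:
        best = min(neighbors[current],
                   key=lambda n: hyperbolic_distance(coords[n], key_coord))
        if hyperbolic_distance(coords[best], key_coord) >= \
           hyperbolic_distance(coords[current], key_coord):
            return path                  # local minimum: current node owns the key
        path.append(best)
        current = best

print(greedy_route("A", coords["C"]))    # e.g. ['A', 'B', 'C']
```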

Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency

Procedia PDF Downloads 425
6142 Control Flow around NACA 4415 Airfoil Using Slot and Injection

Authors: Imine Zakaria, Meftah Sidi Mohamed El Amine

Abstract:

One of the most vital aerodynamic organs of a flying machine is the wing, which allows it to fly efficiently. The flow around the wing is very sensitive to changes in the angle of attack: beyond a certain value, the boundary layer separates on the upper surface, causing instability and a total degradation of aerodynamic performance known as stall. Controlling the flow around an airfoil has therefore become a research concern in the aeronautics field. There are two families of techniques for controlling the flow around a wing to improve its aerodynamic performance: passive and active control. Blowing and suction are among the active techniques that control boundary layer separation around an airfoil; their objective is to add energy to the air particles in the separation zones and to create vortex structures that homogenize the velocity near the wall and enable control. Blowing and suction have long been used as flow control actuators around obstacles; in 1904, Prandtl applied continuous blowing to a cylinder to delay boundary layer separation. In the present study, several numerical investigations have been developed to predict the turbulent flow around an aerodynamic profile. A CFD code was used at several angles of attack in order to validate the present work against the literature in the case of a clean profile. The variation of the lift coefficient CL with the momentum coefficient
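Since the abstract breaks off before defining the momentum coefficient, a commonly used two-dimensional convention for slot blowing (an assumption about standard usage, not taken from this abstract) is:

```latex
C_\mu \;=\; \frac{\dot{m}\, U_j}{\tfrac{1}{2}\,\rho_\infty U_\infty^{2}\, c}
      \;=\; \frac{\rho_j\, U_j^{2}\, h}{\tfrac{1}{2}\,\rho_\infty U_\infty^{2}\, c},
\qquad
C_L \;=\; \frac{L'}{\tfrac{1}{2}\,\rho_\infty U_\infty^{2}\, c}
```

where the jet mass flow rate per unit span, jet exit velocity, slot height, freestream density and velocity, chord, and lift per unit span appear in the usual places.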

Keywords: CFD, control flow, lift, slot

Procedia PDF Downloads 197
6141 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research will ascertain the major risk factors for diabetes and design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Using the Diabetes Health Indicators Dataset from Kaggle as the research data, the phase relation values of each attribute were used to analyze and select the attributes that might influence an individual's diabetes risk. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
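A sketch of the evaluation protocol described above: stratified k-fold cross-validation of one candidate model, scored with the five listed metrics. The file name, label column, and choice of random forest are placeholders, not the study's actual setup.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_validate

df = pd.read_csv("diabetes_health_indicators.csv")      # assumed local copy of the dataset
X = df.drop(columns=["Diabetes_binary"])                # assumed label column name
y = df["Diabetes_binary"]

scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)

scores = cross_validate(RandomForestClassifier(random_state=42), X, y,
                        cv=cv, scoring=scoring)
for metric in scoring:
    vals = scores[f"test_{metric}"]
    print(f"{metric}: {vals.mean():.3f} +/- {vals.std():.3f}")
```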

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 75
6140 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor

Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric

Abstract:

Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in a reasonably short time, which requires, first of all, a powerful dynamic car detection model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach computes block-wise HOG features from foreground blobs using a configurable search window and pathway, in order to overcome the HOG descriptor's shortcoming in computing time and to improve its performance in dynamic applications. Indeed, we show in this paper that the block-wise HOG descriptor combined with motion parameters is a very suitable car detector that reaches a satisfactory recognition rate in record time in dynamic outdoor areas and outperforms several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
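A sketch of the pipeline described above: foreground blobs from background subtraction, then a block-wise HOG descriptor computed only inside each blob's bounding box (rather than over the whole frame) to save time. Thresholds, window size, and the downstream classifier are assumptions; a linear SVM trained offline would consume the yielded features.

```python
import cv2
import numpy as np
from skimage.feature import hog

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def candidate_descriptors(frame_gray):
    mask = subtractor.apply(frame_gray)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 400:                      # discard tiny blobs (noise)
            continue
        roi = cv2.resize(frame_gray[y:y + h, x:x + w], (64, 64))
        feat = hog(roi, orientations=9, pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2), block_norm="L2-Hys")
        yield (x, y, w, h), feat             # feed feat to a pre-trained classifier
```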

Keywords: car-detector, HOG, motion, computing time

Procedia PDF Downloads 323
6139 Loss Minimization by Distributed Generation Allocation in Radial Distribution System Using Crow Search Algorithm

Authors: M. Nageswara Rao, V. S. N. K. Chaitanya, K. Amarendranath

Abstract:

This paper presents the optimal allocation and sizing of Distributed Generation (DG) in a Radial Distribution Network (RDN) to minimize total power loss and enhance the voltage profile of the system. The study has two main parts: first, finding the optimal allocation, and second, determining the optimum size of the DG. The locations of the DGs are identified by analytical expressions, and the crow search algorithm (CSA) is employed to determine the optimum size of the DG. In this study, the DG has been placed in single and multiple allocations. CSA is a meta-heuristic algorithm inspired by the intelligent behavior of crows: crows store their excess food in different locations and memorize those locations to retrieve it when needed, and they follow each other and steal to obtain better food sources. The analysis is tested on the IEEE 33-bus and IEEE 69-bus systems in the MATLAB environment, and the results are compared with existing methods.
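A generic crow search algorithm sketch whose update rule mirrors the behaviour described above: a crow either follows a randomly chosen crow's memorized position or, with awareness probability AP, flies to a random place. The objective below is a stand-in for the radial-network power-loss evaluation, which is not reproduced here; parameter values are illustrative.

```python
import numpy as np

def csa(objective, dim, bounds, n_crows=20, iters=200, fl=2.0, ap=0.1, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_crows, dim))          # positions (e.g. DG sizes)
    mem = x.copy()                                   # each crow's best-known position
    mem_fit = np.array([objective(p) for p in mem])
    for _ in range(iters):
        targets = rng.integers(0, n_crows, n_crows)  # crow i follows crow targets[i]
        for i in range(n_crows):
            if rng.random() >= ap:                   # followed crow is unaware
                x[i] = x[i] + rng.random() * fl * (mem[targets[i]] - x[i])
            else:                                    # aware: fly to a random place
                x[i] = rng.uniform(lo, hi, dim)
            x[i] = np.clip(x[i], lo, hi)
            f = objective(x[i])
            if f < mem_fit[i]:                       # update memory
                mem[i], mem_fit[i] = x[i].copy(), f
    best = np.argmin(mem_fit)
    return mem[best], mem_fit[best]

# Toy objective standing in for total power loss as a function of DG sizes
best_x, best_f = csa(lambda s: np.sum((s - 1.5) ** 2), dim=2, bounds=(0.0, 5.0))
print(best_x, best_f)
```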

Keywords: analytical expression, distributed generation, crow search algorithm, power loss, voltage profile

Procedia PDF Downloads 235
6138 A Nucleic Acid Extraction Method for High-Viscosity Floricultural Samples

Authors: Harunori Kawabe, Hideyuki Aoshima, Koji Murakami, Minoru Kawakami, Yuka Nakano, David D. Ordinario, C. W. Crawford, Iri Sato-Baran

Abstract:

With the recent advances in gene editing technologies allowing the rewriting of genetic sequences, additional market growth in the global floriculture market beyond previous trends is anticipated through increasingly sophisticated plant breeding techniques. As a prerequisite for gene editing, the gene sequence of the target plant must first be identified. This necessitates the genetic analysis of plants with unknown gene sequences, the extraction of RNA, and comprehensive expression analysis. Consequently, a technology capable of consistently and effectively extracting high-purity DNA and RNA from plants is of paramount importance. Although model plants, such as Arabidopsis and tobacco, have established methods for DNA and RNA extraction, floricultural species such as roses present unique challenges. Different techniques to extract DNA and RNA from various floricultural species were investigated. Upon sampling and grinding the petals of several floricultural species, it was observed that nucleic acid extraction from the ground petal solutions of low viscosity was straightforward; solutions of high viscosity presented a significant challenge. It is postulated that the presence of substantial quantities of polysaccharides and polyphenols in the plant tissue was responsible for the inhibition of nucleic acid extraction. Consequently, attempts were made to extract high-purity DNA and RNA by improving the CTAB method and combining it with commercially available nucleic acid extraction kits. The quality of the total extracted DNA and RNA was evaluated using standard methods. Finally, the effectiveness of the extraction method was assessed by determining whether it was possible to create a library that could be applied as a suitable template for a next-generation sequencer. In conclusion, a method was developed for consistent and accurate nucleic acid extraction from high-viscosity floricultural samples. These results demonstrate improved techniques for DNA and RNA extraction from flowers, help facilitate gene editing of floricultural species and expand the boundaries of research and commercial opportunities.

Keywords: floriculture, gene editing, next-generation sequencing, nucleic acid extraction

Procedia PDF Downloads 29
6137 Machine Vision System for Measuring the Quality of Bulk Sun-dried Organic Raisins

Authors: Navab Karimi, Tohid Alizadeh

Abstract:

An intelligent vision-based system was designed to measure the quality and purity of raisins. A machine vision setup was used to capture images of bulk raisins with mixtures of pure and impure berries in the range of 5-50%. The textural features of the bulk raisins were extracted using grey-level histograms, the co-occurrence matrix, and local binary patterns (a total of 108 features). A genetic algorithm and neural network regression were used for selecting and ranking the best features (21 features). The GLCM feature set was found to have the highest accuracy (92.4%) among the sets. Subsequently, multiple feature combinations from the previous stage were fed into a second, linear regression to increase accuracy, where a combination of 16 features was found to be optimal. Finally, a support vector machine (SVM) classifier was used to differentiate the mixtures, producing the best efficiency and accuracy of 96.2% and 97.35%, respectively.
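A sketch of the GLCM-feature plus SVM stage on placeholder image patches (scikit-image 0.19+ spells these functions graycomatrix/graycoprops). The feature choice, patch data, and train/test split are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(patch_u8):
    glcm = graycomatrix(patch_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(0)
patches = rng.integers(0, 256, (60, 64, 64), dtype=np.uint8)  # bulk-raisin patches
labels = rng.integers(0, 2, 60)                               # 0 = pure, 1 = impure mix

X = np.array([glcm_features(p) for p in patches])
clf = SVC(kernel="rbf").fit(X[:40], labels[:40])
print("held-out accuracy:", clf.score(X[40:], labels[40:]))
```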

Keywords: sun-dried organic raisin, genetic algorithm, feature extraction, ANN regression, linear regression, support vector machine, South Azerbaijan

Procedia PDF Downloads 73
6136 Multiobjective Optimization of a Pharmaceutical Formulation Using Regression Method

Authors: J. Satya Eswari, Ch. Venkateswarlu

Abstract:

The formulation of a commercial pharmaceutical product involves several composition factors and response characteristics. When the formulation must satisfy multiple conflicting response characteristics, an efficient multiobjective optimization technique is needed to find an optimal solution. In this work, a regression model is combined with non-dominated sorting differential evolution (NSDE) involving Naïve & Slow and ε-constraint techniques to derive different multiobjective optimization strategies, which are then evaluated by means of a trapidil pharmaceutical formulation. The analysis of the results shows the effectiveness of the strategy that combines the regression model and NSDE with the integration of both Naïve & Slow and ε-constraint techniques for Pareto optimization of the trapidil formulation. With this strategy, the optimal formulation at pH = 6.8 is obtained with microcrystalline cellulose, hydroxypropyl methylcellulose, and compression pressure as decision variables, and the corresponding response characteristics of rate constant and release order are recorded. The comparison of these results with the experimental data and with those of other multiple-regression-based multiobjective evolutionary optimization strategies signifies the better performance of the proposed strategy for optimal trapidil formulation.
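A minimal sketch of the non-dominated sorting step at the heart of NSDE: given objective vectors for candidate formulations (all to be minimized), extract the Pareto front. The DE variation operators and the regression response model are not reproduced here; the toy objective values are made up.

```python
import numpy as np

def pareto_front(objs):
    """objs: (n_candidates, n_objectives) array, smaller is better."""
    n = len(objs)
    dominated = np.zeros(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(objs[j] <= objs[i]) and np.any(objs[j] < objs[i]):
                dominated[i] = True      # candidate j is at least as good everywhere
                break
    return np.where(~dominated)[0]

# Toy example: two conflicting responses (e.g. rate-constant error vs. release-order error)
objectives = np.array([[0.2, 0.9], [0.4, 0.4], [0.9, 0.1], [0.5, 0.5], [0.3, 0.8]])
print("Pareto-optimal candidates:", pareto_front(objectives))
```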

Keywords: pharmaceutical formulation, multiple regression model, response surface method, radial basis function network, differential evolution, multiobjective optimization

Procedia PDF Downloads 409
6135 Synthesis and Characterization of Functionalized Carbon Nanorods/Polystyrene Nanocomposites

Authors: M. A. Karakassides, M. Baikousi, A. Kouloumpis, D. Gournis

Abstract:

Nanocomposites of carbon nanorods (CNRs) with polystyrene (PS) have been synthesized successfully by means of an in situ polymerization process and characterized. Firstly, carbon nanorods with a graphitic structure were prepared by the standard synthetic procedure for CMK-3, using MCM-41 as the template instead of SBA-15 and sucrose as the carbon source. In order to create an organophilic surface on the CNRs, two modification steps were carried out: surface chemical oxidation (CNRs-ox) according to Staudenmaier's method, and the attachment of octadecylamine molecules to the functional groups of CNRs-ox (CNRs-ODA). The nanocomposite materials of polystyrene with CNRs-ODA were prepared by a solution-precipitation method at three nanoadditive-to-polymer loadings (1, 3 and 5 wt. %). The derived nanocomposites were studied with a combination of characterization and analytical techniques. In particular, Fourier-transform infrared (FT-IR) and Raman spectroscopies were used for the chemical and structural characterization of the pristine materials and the derived nanocomposites, while the morphology of the nanocomposites and the dispersion of the carbon nanorods were analyzed by atomic force and scanning electron microscopy. Tensile testing and thermogravimetric analysis (TGA), along with differential scanning calorimetry (DSC), were also used to examine the mechanical properties and the thermal stability and glass transition temperature of PS after the incorporation of CNRs-ODA nanorods. The results showed that the thermal and mechanical properties of the PS/CNRs-ODA nanocomposites gradually improved with increasing CNRs-ODA loading.

Keywords: nanocomposites, polystyrene, carbon, nanorods

Procedia PDF Downloads 352
6134 Diagnosis of Rotavirus Infection among Egyptian Children by Using Different Laboratory Techniques

Authors: Mohamed A. Alhammad, Hadia A. Abou-Donia, Mona H. Hashish, Mohamed N. Massoud

Abstract:

Background: Rotavirus is the leading etiologic agent of severe diarrheal disease in infants and young children worldwide. The present study aimed (1) to detect rotavirus infection as a cause of diarrhea among children under 5 years of age using two serological methods (ELISA and LA) and the PCR technique, and (2) to evaluate the three methodologies used for human RV detection in stool samples. Materials and Methods: This study was carried out on 247 children less than 5 years old, clinically diagnosed with acute gastroenteritis and attending Alexandria University Children Hospital at EL-Shatby. Rotavirus antigen was screened by ELISA and LA tests in all stool samples, whereas only 100 samples were subjected to the RT-PCR method for detection of rotavirus RNA. Results: Of the 247 studied cases with diarrhea, rotavirus antigen was detected in 83 (33.6%) by ELISA and 73 (29.6%) by LA, while 44% of the 100 cases tested by RT-PCR had rotavirus RNA. Rotavirus diarrhea showed a marked seasonal peak during autumn and winter (61.4%). Conclusion: The present study confirms the huge burden of rotavirus as a major cause of acute diarrhea in Egyptian infants and young children. It was concluded that LA is equal in sensitivity to ELISA, ELISA is more specific than LA, and RT-PCR is more specific than both ELISA and LA in the diagnosis of rotavirus infection.

Keywords: rotavirus, diarrhea, immunoenzyme techniques, latex fixation tests, RT-PCR

Procedia PDF Downloads 370
6133 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection

Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay

Abstract:

With the increase in credit card usage, the volume of credit card misuse has also increased significantly, which may cause appreciable financial losses both for credit card holders and for the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified, and credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of the current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on static characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced user authentication techniques, which use behavioral biometrics to identify an individual based on his/her unique behavior while interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable, and scalable models of credit card fraud detection.

Keywords: Credit Card Fraud Detection, User Authentication, Behavioral Biometrics, Machine Learning, Literature Survey

Procedia PDF Downloads 121
6132 Digital Twin Platform for BDS-3 Satellite Navigation Using Digital Twin Intelligent Visualization Technology

Authors: Rundong Li, Peng Wu, Junfeng Zhang, Zhipeng Ren, Chen Yang, Jiahui Gan, Lu Feng, Haibo Tong, Xuemei Xiao, Yuying Chen

Abstract:

Research on BeiDou-3 (BDS-3) satellite navigation is on the rise, but in practice it is hard to avoid insecure satellite data, inefficient research and development, and the inability to deal with failures in advance. Digital twin technology has clear advantages for simulating life cycle models of aerospace satellite navigation products. In order to meet the increasing demand, this paper builds a BeiDou-3 satellite navigation digital twin platform (BDSDTP). The basic establishment of the BDSDTP was completed by building a digital twin counterpart, a comprehensive BeiDou-3 digital twin design, a predictive maintenance (PdM) mathematical model, and a visual interaction design. Finally, the paper provides a timing application case for the platform, which offers a reference for applying the BDSDTP in various fields of navigation and clearly helps extend the full life cycle of BeiDou-3 satellite navigation.

Keywords: BDS-3, digital twin, visualization, PdM

Procedia PDF Downloads 142
6131 Specified Human Motion Recognition and Unknown Hand-Held Object Tracking

Authors: Jinsiang Shaw, Pik-Hoe Chen

Abstract:

This paper aims to integrate human recognition, motion recognition, and object tracking technologies without requiring a pre-training database model for motion recognition or the unknown object itself. Furthermore, it can simultaneously track multiple users and multiple objects. Unlike other existing human motion recognition methods, our approach employs a rule-based condition method to determine if a user hand is approaching or departing an object. It uses a background subtraction method to separate the human and object from the background, and employs behavior features to effectively interpret human object-grabbing actions. With an object’s histogram characteristics, we are able to isolate and track it using back projection. Hence, a moving object trajectory can be recorded and the object itself can be located. This particular technique can be used in a camera surveillance system in a shopping area to perform real-time intelligent surveillance, thus preventing theft. Experimental results verify the validity of the developed surveillance algorithm with an accuracy of 83% for shoplifting detection.
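A sketch of histogram back projection for tracking a hand-held object once its region of interest has been isolated, as described above. The video source, ROI placement, and use of mean shift for window updates are assumptions for illustration, not the paper's exact implementation.

```python
import cv2

cap = cv2.VideoCapture("surveillance.mp4")            # assumed input clip
ok, frame = cap.read()
x, y, w, h = 300, 200, 80, 80                         # assumed ROI of the grabbed object
roi_hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([roi_hsv], [0], None, [180], [0, 180])   # hue histogram
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    ok, track_window = cv2.meanShift(back_proj, track_window, term_crit)
    # track_window now holds the object's current bounding box (its trajectory point)
```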

Keywords: Automatic Tracking, Back Projection, Motion Recognition, Shoplifting

Procedia PDF Downloads 333
6130 Bacterial Flora of the Anopheles Fluviatilis S. L. in an Endemic Malaria Area in Southeastern Iran for Candidate Paraterasgenesis Strains

Authors: Seyed Hassan Moosa-kazemi, Jalal Mohammadi Soleimani, Hassan Vatandoost, Mohammad Hassan Shirazi, Sara Hajikhani, Roonak Bakhtiari, Morteza Akbari, Siamak Hydarzadeh

Abstract:

Malaria is an infectious disease and is considered one of the most important health problems in southeastern Iran. Iran is in the malaria elimination phase, and new tools are needed for vector control. Paratransgenesis is a new way to interrupt the life cycle of the malaria parasite. In this study, the microflora of the surface and gut of various stages of Anopheles fluviatilis James, one of the important malaria vectors, was studied using biochemical and molecular techniques during 2013-2014. Twelve bacterial species were found, including Providencia rettgeri, Morganella morganii, Enterobacter aerogenes, Pseudomonas oryzihabitans, Citrobacter braakii, Citrobacter freundii, Aeromonas hydrophila, Klebsiella oxytoca, Citrobacter koseri, Serratia fonticola, Enterobacter sakazakii, and Yersinia pseudotuberculosis. The species Alcaligenes faecalis, Providencia vermicola, and Enterobacter hormaechei were identified in various stages of the vector and confirmed by biochemical and molecular techniques. We found Providencia rettgeri to be a proper candidate for paratransgenesis.

Keywords: Anopheles fluviatilis, bacteria, malaria, paratransgenesis, southern Iran

Procedia PDF Downloads 491
6129 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms

Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee

Abstract:

Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques are required to assess their condition and prevent the continual growth of fiber damage. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from the static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures has been developed over the last two decades. These methods, based on analytical approaches, have limited capability in dealing with complex systems, primarily because of their difficulty in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristic techniques and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA), and neural networks (NN), and have promisingly applied these methods to the field of structural identification. Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems and can make a global solution search possible, as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to describe degraded stiffness characteristics. In addition, this study presents a method to detect the variation of fiber properties in laminated composite plates from the micromechanical point of view. A finite element model is used to study the free vibrations of laminated composite plates with fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only the first mode shapes of the structure as the measured data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
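A sketch of how a bivariate Gaussian can represent a local stiffness degradation field over a plate, in the spirit of the description above: fiber stiffness at each point is reduced by a peak amount d0 centered at (x0, y0). The parameter values, grid, and intact modulus are illustrative assumptions; in the paper's workflow, a GA would search for the parameters that best reproduce the measured first mode.

```python
import numpy as np

def degraded_stiffness(E0, x, y, d0, x0, y0, sx, sy):
    """E0: intact fiber modulus; d0 in [0, 1): peak degradation ratio."""
    bump = np.exp(-(((x - x0) ** 2) / (2 * sx ** 2) +
                    ((y - y0) ** 2) / (2 * sy ** 2)))
    return E0 * (1.0 - d0 * bump)

# Evaluate the field on a unit plate discretized into a 50 x 50 grid
x, y = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
E = degraded_stiffness(E0=38e9, x=x, y=y, d0=0.4, x0=0.6, y0=0.3, sx=0.1, sy=0.15)
print(E.min() / 38e9)   # about 0.6 of the intact stiffness at the damage center
```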

Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences

Procedia PDF Downloads 272
6128 Big Data-Driven Smart Policing: Big Data-Based Patrol Car Dispatching in Abu Dhabi, UAE

Authors: Oualid Walid Ben Ali

Abstract:

Big Data has become one of today's buzzwords. The recent explosion of digital data has led organizations, whether private or public, into a new era of more efficient decision making. At some point, businesses decided to use the concept to learn what makes their clients tick, with phrases like 'sales funnel' analysis, 'actionable insights', and 'positive business impact', so it stands to reason that Big Data was viewed through green (read: money) colored lenses. Somewhere along the line, however, someone realized that collecting and processing data does not have to serve business purposes only; it can also assist law enforcement, improve policing, or improve road safety. This paper briefly presents how Big Data has been used in the field of policing to improve decision making in the daily operations of the police. As an example, we present a big-data-driven system which is used to accurately dispatch patrol cars in a geographic environment. The system is also used to allocate, in real time, the nearest patrol car to the location of an incident. This system has been implemented and applied in the Emirate of Abu Dhabi in the UAE.
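A minimal sketch of the core dispatching rule described above: assign the incident to the nearest available patrol car by great-circle (haversine) distance. The car positions and incident location are hypothetical, and a production system would use live GPS feeds and road-network travel times rather than straight-line distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

patrol_cars = {                       # hypothetical live positions (lat, lon)
    "car-12": (24.4539, 54.3773),
    "car-31": (24.4667, 54.3667),
    "car-07": (24.4205, 54.4500),
}
incident = (24.4600, 54.3700)

nearest = min(patrol_cars,
              key=lambda c: haversine_km(*patrol_cars[c], *incident))
print("dispatch:", nearest)
```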

Keywords: big data, big data analytics, patrol car allocation, dispatching, GIS, intelligent, Abu Dhabi, police, UAE

Procedia PDF Downloads 490
6127 Gamification Using Stochastic Processes: Engage Children to Have Healthy Habits

Authors: Andre M. Carvalho, Pedro Sebastiao

Abstract:

This article is based on a dissertation that analyzes and intelligently models algorithms based on stochastic processes for a gamification application applied to marketing. Gamification is used in our daily lives to engage us in performing certain actions in order to achieve goals and gain rewards; this strategy is an increasingly adopted way to encourage and retain customers through game elements. The application of gamification here aims to encourage children between 6 and 10 years of age to adopt healthy habits, and it is intended to serve as a model for use in marketing. The application was developed in Unity; we implemented intelligent algorithms based on stochastic processes, web services to respond to all requests of the application, a back-office website to manage the application, and the database. A behavioral analysis of the use of game elements and stochastic processes in children's motivation was carried out. Applying algorithms based on stochastic processes to game elements is very important to promote cooperation and to ensure fair and friendly competition between users, which in turn stimulates users' interest and their involvement in the application and the organization.

Keywords: engage, games, gamification, randomness, stochastic processes

Procedia PDF Downloads 330
6126 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management

Authors: Chokri Slim

Abstract:

The objective of this research is to answer the following question: Is there a significant difference between a regression model and statistical learning models in predicting credit risk management? A multiple linear regression (MLR) model was compared with neural networks, namely a multi-layer perceptron (MLP), and with support vector regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of banks' loan portfolios and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
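A sketch of the model comparison described above on placeholder data: multiple linear regression vs. an MLP and an SVR, scored with cross-validated RMSE. The synthetic features and target stand in for the banks' loan-portfolio and governance indicators, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))                              # e.g. financial / governance ratios
y = X @ rng.normal(size=6) + 0.3 * rng.normal(size=300)    # proxy credit-risk target

models = {
    "MLR": LinearRegression(),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name}: RMSE = {rmse.mean():.3f} +/- {rmse.std():.3f}")
```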

Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines

Procedia PDF Downloads 150
6125 A Process FMEA in Aero Fuel Pump Manufacturing and Conduct the Corrective Actions

Authors: Zohre Soleymani, Meisam Amirzadeh

Abstract:

Many products are safety critical, so proactive analysis techniques are vital for them because these techniques try to identify potential failures before the products are produced. Failure Mode and Effects Analysis (FMEA) is an effective tool for identifying the probable problems of a product or process, prioritizing them, and planning for their elimination. This paper shows the implementation of the FMEA process to identify and remove potential problems in the aero fuel pump manufacturing process and to improve the reliability of its subsystems. The different possible causes of failure and their effects, along with the recommended actions, are discussed. FMEA uses the Risk Priority Number (RPN) to determine the risk level; the RPN value depends on the Severity (S), Occurrence (O), and Detection (D) parameters, so these parameters need to be determined. After calculating the RPN for the identified potential failure modes, corrective actions are defined to reduce the risk level according to the assessment strategy and the determined acceptable risk level. Then the FMEA process is performed again and a revised RPN is calculated. The results are presented in the form of a case study and show improvement in the manufacturing process and a considerable reduction in the risk level of aero fuel pump production.
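A minimal sketch of the RPN bookkeeping described above: RPN = S x O x D, failure modes sorted by risk, and a revised RPN after a corrective action. The failure modes and ratings below are illustrative, not the paper's data.

```python
failure_modes = [
    {"mode": "impeller porosity", "S": 8, "O": 4, "D": 5},
    {"mode": "seal misalignment", "S": 6, "O": 5, "D": 3},
    {"mode": "thread damage",     "S": 4, "O": 3, "D": 2},
]

def rpn(fm):
    return fm["S"] * fm["O"] * fm["D"]

for fm in failure_modes:
    fm["RPN"] = rpn(fm)

# Prioritize: highest RPN first
for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
    print(f'{fm["mode"]:20s} RPN = {fm["RPN"]}')

# After a corrective action that improves detection of the top-ranked mode:
failure_modes[0]["D"] = 2
failure_modes[0]["RPN"] = rpn(failure_modes[0])
print("revised RPN:", failure_modes[0]["RPN"])
```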

Keywords: FMEA, risk priority number, aero pump, corrective action

Procedia PDF Downloads 286
6124 Competitive Intelligence within the Maritime Security Intelligence

Authors: Dicky R. Munaf, Ayu Bulan Tisna

Abstract:

Competitive intelligence (business intelligence) is the process of observing the external environment, often conducted by organizations to obtain relevant information that will be used to shape organizational policy. Security intelligence, by contrast, relates to the function of officers whose duty is to protect the country and its people from criminal actions that might harm national and individual security. The intelligence dimension of maritime security is therefore associated with all intelligence activities, including the subjects and objects connected to maritime issues. The concept of business intelligence from the maritime security perspective is the effort to protect maritime security using the analysis of economic movements as the basic strategic plan. Clearly, weak maritime security causes high operational costs for all economic activities that use the sea as their medium, and thus affects a country's competitiveness compared to countries that are able to maintain maritime law enforcement and secure their marine territory. Business intelligence within security intelligence is therefore important as the starting point for identifying opponent strategies that might unfold in the present or in the future. Thereby, scenarios of the potential impact of illegal maritime activities, as well as strategies for preventing opponent maneuvers, can be developed.

Keywords: competitive intelligence, maritime security intelligence, intelligent systems, information technology

Procedia PDF Downloads 500