Search results for: Neural Net Works
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1809

249 Aspect-Level Sentiment Analysis with Multi-Channel and Graph Convolutional Networks

Authors: Jiajun Wang, Xiaoge Li

Abstract:

The purpose of the aspect-level sentiment analysis task is to identify the sentiment polarity of aspects in a sentence. Currently, most methods focus on using neural networks and attention mechanisms to model the relationship between aspects and context, but they ignore the dependence of words over different ranges in the sentence, which leads to deviations when assigning relationship weights to words other than the aspect words. To solve these problems, we propose an aspect-level sentiment analysis model that combines a multi-channel convolutional network with a graph convolutional network (GCN). First, the context and the degree of association between words are characterized by a Long Short-Term Memory (LSTM) network and a self-attention mechanism. In addition, a multi-channel convolutional network is used to extract the features of words over different ranges. Finally, a graph convolutional network is used to associate the node information of the dependency tree structure. We conduct experiments on four benchmark datasets and compare the results with those of other models; the comparison shows that our model is more effective.

Keywords: Aspect-level sentiment analysis, attention, multi-channel convolution network, graph convolution network, dependency tree.
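
The graph-convolution step over a dependency tree described above can be illustrated with a minimal sketch; the adjacency matrix, feature dimensions, and weights below are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): one GCN propagation step over a
# dependency-tree adjacency matrix, using the common normalized form
#   H' = ReLU( D^-1/2 (A + I) D^-1/2  H  W )
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution step; H: node features, A: adjacency, W: weights."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)  # ReLU

# Toy example: 4 words, 8-dim contextual features (e.g., from an LSTM),
# dependency edges 0-1, 1-2, 1-3.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (1, 3)]:
    A[i, j] = A[j, i] = 1.0
W = rng.normal(size=(8, 8))
print(gcn_layer(H, A, W).shape)               # (4, 8)
```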

248 Experimental Investigation on Residual Stresses in Welded Medium-Walled I-shaped Sections Fabricated from Q460GJ Structural Steel Plates

Authors: Qian Zhu, Shidong Nie, Bo Yang, Gang Xiong, Guoxin Dai

Abstract:

GJ steel is a new type of high-performance structural steel that has been increasingly adopted in practical engineering. Q460GJ structural steel has a nominal yield strength of 460 MPa, which, unlike that of normal structural steel, does not decrease significantly as plate thickness increases. Thus, Q460GJ structural steel is normally used in medium-walled welded sections. However, research on the residual stress in GJ steel members is scarce, even though residual stress is one of the vital factors affecting member and structural behavior. This article investigates the residual stresses in welded I-shaped sections fabricated from Q460GJ structural steel plates through experimental tests. A total of four full-scale welded medium-walled I-shaped sections were tested by the sectioning method. Both the circular curve correction method and the straightening measurement method were adopted to obtain the final magnitude and distribution of the longitudinal residual stresses. In addition, this paper explores the interaction between flanges and webs. Based on a statistical evaluation of the experimental data, a multilayer residual stress model is proposed.

Keywords: Q460GJ structural steel, residual stresses, sectioning method, welded medium-walled I-shaped sections.
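
As a numerical aside, the sectioning method infers the longitudinal residual stress from the strain released when a strip is cut free; a minimal sketch of that conversion (with assumed values, not the paper's data) is:

```python
# Hedged sketch: residual stress recovered from released strain in the
# sectioning method, sigma_res ~ -E * eps_released (illustrative values only).
E = 206e3                 # Young's modulus of steel, MPa
eps_released = -350e-6    # strain released after cutting (tension relaxes -> negative)
sigma_res = -E * eps_released
print(f"estimated residual stress: {sigma_res:.1f} MPa")  # ~ +72 MPa (tensile)
```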

247 Evolutionary Algorithms for Learning Primitive Fuzzy Behaviors and Behavior Coordination in Multi-Objective Optimization Problems

Authors: Li Shoutao, Gordon Lee

Abstract:

Evolutionary robotics is concerned with the design of intelligent systems with life-like properties by means of simulated evolution. Approaches in evolutionary robotics can be categorized according to the control structures that represent the behavior and the parameters of the controller that undergo adaptation. The basic idea is to automatically synthesize behaviors that enable the robot to perform useful tasks in complex environments. The evolutionary algorithm searches through the space of parameterized controllers that map sensory perceptions to control actions, thus realizing a specific robotic behavior. Further, the evolutionary algorithm maintains and improves a population of candidate behaviors by means of selection, recombination and mutation. A fitness function evaluates the performance of the resulting behavior according to the robot's task or mission. In this paper, the focus is on the use of genetic algorithms to solve a multi-objective optimization problem representing robot behaviors; in particular, the A-Compander Law is employed in selecting the weight of each objective during the optimization process. Results using an adaptive fitness function show that this approach can efficiently react to complex tasks under variable environments.

Keywords: Adaptive fuzzy neural inference, evolutionary tuning.
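
A minimal, generic sketch of the kind of genetic-algorithm loop described here, with a weighted sum of objectives standing in for the fitness; the objective functions, weights, and GA settings are illustrative and do not reproduce the authors' A-Compander weighting.

```python
# Hedged sketch of a genetic algorithm over controller parameters with a
# weighted multi-objective fitness (illustrative objectives, not the paper's).
import numpy as np

rng = np.random.default_rng(1)

def objectives(x):
    # Two toy objectives to minimize, e.g., tracking error and control effort.
    return np.array([np.sum((x - 0.5) ** 2), np.sum(np.abs(x))])

def fitness(x, w=np.array([0.7, 0.3])):
    return -w @ objectives(x)                          # higher is better

pop = rng.uniform(0, 1, size=(30, 5))                  # 30 candidates, 5 parameters
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]            # selection (top 10)
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(5) < 0.5                     # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(0, 0.05, size=5)           # mutation
        children.append(np.clip(child, 0, 1))
    pop = np.array(children)

best = max(pop, key=fitness)
print("best parameters:", np.round(best, 3))
```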

246 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis

Authors: V. Venkatachalam, S. Selvan

Abstract:

The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using the LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with that of other classification techniques using the 5 classes of the KDDCup99 data. The LAMSTAR IDS gives better performance, but at the cost of high computational complexity, training time and testing time, compared to the other classification techniques (Binary Tree classifier, RBF classifier, Gaussian Mixture classifier). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time with almost the same performance.

Keywords: Binary Tree Classifier, Gaussian Mixture, Intrusion Detection System, LAMSTAR, Radial Basis Function.
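
A minimal sketch of the dimensionality-reduction step described above, using PCA before a classifier; scikit-learn and synthetic data stand in for the KDDCup99 preprocessing and the LAMSTAR network, which are not reproduced here.

```python
# Hedged sketch: PCA to reduce feature dimension before classification,
# illustrating the computational-cost reduction idea (synthetic data, and a
# generic classifier instead of LAMSTAR).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=41, n_informative=12,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=12).fit(X_tr)        # keep 12 principal components
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(pca.transform(X_tr), y_tr)
print("accuracy on reduced features:", clf.score(pca.transform(X_te), y_te))
```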

245 A Review of Lortie’s Schoolteacher

Authors: Tsai-Hsiu Lin

Abstract:

Dan C. Lortie’s Schoolteacher: A Sociological Study is one of the best works on the sociology of teaching since W. Waller’s classic study, and it is a book worthy of review. Following the tradition of the symbolic interactionists, Lortie demonstrated their qualities in studying the occupation of teaching. Using several methods to gather effective data, Lortie portrayed the ethos of the teaching profession. The work is therefore an important book on the teaching profession and teacher culture. Though outstanding, Lortie’s work is also flawed in that his perspectives and methodology were adopted largely from symbolic interactionism. First, Lortie analyzed many points regarding teacher culture; for example, he was interested in exploring “sentiment,” “cathexis,” and “ethos.” Thus, he was more a psychologist than a sociologist. Second, symbolic interactionism led him to examine teacher culture from a micro view, thereby missing its structural aspects. For example, he did not fully discuss the issue of gender, and he ignored the issue of race. Finally, following the qualitative sociological tradition, Lortie employed many qualitative methods to gather data but focused only on obtaining and presenting interview data. Moreover, the measurement methods he used were too simplistic to analyze the quantitative data fully.

Keywords: Lortie’s Schoolteacher, symbolic interactionism, teacher culture, teaching profession.

244 Types of Epilepsies and EEG-LORETA Findings about Epilepsy

Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi

Abstract:

Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only the brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution, whereas deep electrode measurements yield more reliable information about the source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electro-Magnetic Tomography (LORETA) is solved for realistic geometry based on two forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review EEG-LORETA findings about epilepsy.

Keywords: Epilepsy, EEG, EEG-LORETA, LORETA analysis.
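
As context for how LORETA-type source imaging works, here is a minimal regularized minimum-norm inverse sketch (LORETA additionally applies a spatial-Laplacian smoothness weighting, omitted here); the lead-field matrix and the scalp data are random placeholders, not real recordings.

```python
# Hedged sketch: regularized minimum-norm estimate
#   J = L^T (L L^T + lambda*I)^-1 phi
# the basic inverse step underlying LORETA-style source localization.
import numpy as np

rng = np.random.default_rng(0)
n_electrodes, n_sources = 32, 500
L = rng.normal(size=(n_electrodes, n_sources))   # placeholder lead-field matrix
phi = rng.normal(size=n_electrodes)              # placeholder scalp potentials
lam = 1e-2                                       # regularization parameter

J = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_electrodes), phi)
print("strongest source index:", int(np.argmax(np.abs(J))))
```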

243 The Effect of Polypropylene Fiber in the Stabilization of Expansive Soils

Authors: A. S. Soğancı

Abstract:

Expansive soils are often encountered in many parts of the world, especially in arid and semi-arid regions. Such soils, which generally contain active clay minerals at low water content, increase in volume by absorbing water through the surface and cause great harm to light structures such as channel coatings, roads and airports. Expansive soils were encountered along the path of the Apa-Hotamış conveyance channel belonging to the State Hydraulic Works in the region of Konya. In the research done in this area, it was predicted that the soil has a swelling nature and that the ground should be excavated to a depth of 50-60 cm and backfilled with suitable granular material. In this study, with the aim of assisting other research to be done in the same area, it is proposed that instead of replacing the swelling soil with granular material, stabilizing it with polypropylene fiber and reusing it in place would decrease the swelling percent and thereby decrease the cost. Therefore, laboratory tests were conducted to study the effects of polypropylene fiber on the swelling characteristics of expansive soil. Test results indicated that the inclusion of fiber reduced the swell percent of the expansive soil. As the fiber content increased, the unconfined compressive strength increased. Finally, it can be said that stabilization of expansive soils with polypropylene fiber is an effective method.

Keywords: Expansive soils, polypropylene fiber, stabilization, swelling percent.

242 Influence of Fermentation Conditions on Humic Acids Production by Trichoderma viride Using an Oil Palm Empty Fruit Bunch as the Substrate

Authors: F. L. Motta, M. H. A. Santana

Abstract:

Humic acids (HA) were produced by a Trichoderma viride strain under submerged fermentation in a medium based on oil palm empty fruit bunch (EFB), and the main variables of the process were optimized using response surface methodology. A temperature of 40°C and concentrations of 50 g/L EFB, 5.7 g/L potato peptone and 0.11 g/L (NH4)2SO4 were the optimum levels of the variables that maximize HA production within the physicochemical and biological limits of the process. The optimized conditions led to an experimental HA concentration of 428.4±17.5 mg/L, which validated the prediction of 412.0 mg/L from the statistical model. This optimization increased HA production about 7-fold compared with values previously reported in the literature. Additionally, the time profiles of HA production and fungal growth confirmed our previous findings that HA production preferably occurs during fungal sporulation. The present study demonstrated that T. viride successfully produced HA via the submerged fermentation of EFB and that the process parameters can be successfully optimized using a statistics-based response surface model. To the best of our knowledge, the present work is the first report on the optimization of HA production from EFB by a biotechnological process, whose feasibility was only pointed out in previous works.

Keywords: Empty fruit bunch, humic acids, submerged fermentation, Trichoderma viride.
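
A minimal sketch of the response-surface step: fit a second-order polynomial model to designed experiments and search it for the maximizing factor settings. The data below are synthetic, not the fermentation results, and the three coded factors merely stand in for the EFB, peptone and (NH4)2SO4 levels.

```python
# Hedged sketch: second-order response-surface fit and optimum search
# (synthetic data; illustrative only).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 3))                               # 3 coded factors
y = 400 - 50 * np.sum((X - 0.2) ** 2, axis=1) + rng.normal(0, 5, 30)  # toy response

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Search the fitted surface on a coarse grid for the predicted maximum.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = model.predict(poly.transform(grid))
best = grid[np.argmax(pred)]
print("predicted optimum (coded units):", best, "->", round(pred.max(), 1))
```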

241 Cantilever Shoring Piles with Prestressing Strands: An Experimental Approach

Authors: Hani Mekdash, Lina Jaber, Yehia Temsah

Abstract:

Underground space is becoming a necessity nowadays, especially in highly congested urban areas. Retaining underground excavations using shoring systems is essential in order to protect adjoining structures from potential damage or collapse. Reinforced Concrete Piles (RCP) supported by multiple rows of tie-back anchors are a commonly used type of shoring system in deep excavations. However, executing anchors can sometimes be challenging because they might illegally trespass on neighboring properties or be obstructed by infrastructure and other underground facilities. A technique is proposed in this paper that involves the addition of eccentric high-strength steel strands to the RCP section through ducts, without providing the pile with lateral supports. The strands are then vertically stressed externally on the pile cap using a hydraulic jack, creating a compressive strengthening force in the concrete section. An experimental study of the behavior of a shoring wall made of prestressed piles is presented, carried out during the execution of an open excavation in an urban area (Beirut), and followed by numerical analysis using finite element software. Based on the experimental results, this technique is shown to be cost-effective and to provide flexible and sustainable construction of shoring works.

Keywords: Excavation, inclinometer, prestressing, shoring system.

240 Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain-Computer Interface Methods

Authors: Bayar Shahab

Abstract:

The fast development of technology that has advanced neuroscience and human interaction with computers has enabled solutions to various problems and issues of this new era. The Brain-Computer Interface (BCI) has opened the door to several new research areas and has been able to provide solutions to critical and vital issues such as supporting a paralyzed patient in interacting with the outside world, controlling a robot arm, playing games in VR with the brain, and driving a wheelchair. This review presents the state-of-the-art methods and improvements of canonical correlation analysis (CCA), an SSVEP-based BCI method. These are the methods used to extract EEG signal features or, put differently, the features of interest that we are looking for in the EEG analysis. Each of the methods, from oldest to newest, is discussed while comparing its advantages and disadvantages. This provides context and helps researchers understand the most state-of-the-art methods available in this field, their pros and cons, and their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar recently published works by providing the following: (1) most of the main methods used in this field, stated in a hierarchical way; (2) the pros and cons of each method and their performance; (3) the gaps that remain at the end of each method, which can improve understanding and open doors to new research or improvements.

Keywords: BCI, CCA, SSVEP, EEG.
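
To make the core CCA-SSVEP idea concrete, here is a minimal sketch: correlate a multichannel EEG segment against sine/cosine references at each candidate stimulation frequency and pick the frequency with the highest canonical correlation. The signal here is simulated; this is the standard baseline, not any specific improved method from the review.

```python
# Hedged sketch of standard CCA-based SSVEP detection: the detected frequency is
# the one whose sine/cosine reference set has the largest canonical correlation
# with the EEG segment (simulated 8-channel data).
import numpy as np
from sklearn.cross_decomposition import CCA

fs, T = 250, 2.0
t = np.arange(0, T, 1 / fs)
rng = np.random.default_rng(0)

# Simulated EEG: a 12 Hz SSVEP buried in noise on 8 channels.
eeg = 0.5 * np.sin(2 * np.pi * 12 * t)[:, None] + rng.normal(0, 1, (t.size, 8))

def reference(f, harmonics=2):
    cols = []
    for h in range(1, harmonics + 1):
        cols += [np.sin(2 * np.pi * h * f * t), np.cos(2 * np.pi * h * f * t)]
    return np.column_stack(cols)

def cca_corr(X, Y):
    u, v = CCA(n_components=1).fit_transform(X, Y)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

candidates = [8, 10, 12, 15]
scores = {f: cca_corr(eeg, reference(f)) for f in candidates}
print(scores, "-> detected:", max(scores, key=scores.get))
```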

239 Basic Research on Applying Temporary Work Engineering at the Design Phase

Authors: Jin Woong Lee, Kyuman Cho, Taehoon Kim

Abstract:

The application of constructability is increasingly required not only in the construction phase but throughout the whole project. In particular, the proper application of construction experience and knowledge during the design phase enables the minimization of inefficiencies such as design changes, and improves constructability during the construction phase. In order to apply this knowledge effectively, engineering technology efforts should be implemented as the design progresses. Among many engineering technologies, engineering for temporary works, including facilities, equipment, and other related construction methods, is important for improving constructability. Therefore, as basic research, this study investigates the applicability of temporary work engineering during the design phase in the building construction industry. The results show that the application of temporary work engineering has a substantial impact on construction cost reduction and constructability improvement. In contrast to the existing design-bid-build method, the turn-key and CM (construction management) procurement methods currently being implemented in Korea are expected to have a significant influence on the direction of temporary work engineering. To introduce temporary work engineering, training of expert and professional organizations is required first, and the current lack of client awareness should be addressed as a priority. The results of this study are expected to be useful as reference material for the development of more effective temporary work engineering tasks and work processes in the future.

Keywords: Temporary work engineering, design phase, constructability, building construction.

238 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates

Authors: Abeer Amayri, Akif A. Bulgak

Abstract:

Due to today’s globalization and the outsourcing practices of companies, Supply Chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed locations, where there is more chance for disruptions. One such disruption arises from the quality and delivery uncertainties of outsourcing. These uncertainties could make products unsafe and, as a number of recent examples show, companies may end up recalling their products. As a result of these problems, there is a need to develop a methodology for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, as well as a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, our target is the study of global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers’ local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers to illustrate how the proposed model works in practice.

Keywords: Global supply chains, quality, stochastic programming, supplier selection.
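
A much-reduced sketch of the scenario idea: pick the supplier subset that minimizes expected cost across defectiveness/lateness scenarios, by brute force over a small instance. All numbers are invented; the actual model is a two-stage stochastic program with price discounts, multiple sites, and currencies, none of which is reproduced here.

```python
# Hedged toy sketch of scenario-based supplier selection: enumerate supplier
# subsets and score each by expected cost over quality/delivery scenarios.
from itertools import combinations

suppliers = ["S1", "S2", "S3", "S4"]
unit_price = {"S1": 10.0, "S2": 9.0, "S3": 11.0, "S4": 8.5}
# Each scenario: (probability, defect rate per supplier, lateness penalty per supplier)
scenarios = [
    (0.6, {"S1": 0.02, "S2": 0.05, "S3": 0.01, "S4": 0.08},
          {"S1": 0.5, "S2": 1.0, "S3": 0.2, "S4": 2.0}),
    (0.4, {"S1": 0.04, "S2": 0.10, "S3": 0.02, "S4": 0.15},
          {"S1": 1.0, "S2": 2.5, "S3": 0.5, "S4": 4.0}),
]
demand, defect_cost = 100, 50.0

def expected_cost(subset):
    share = demand / len(subset)          # split demand evenly (simplification)
    total = 0.0
    for prob, defect, late in scenarios:
        c = sum(share * (unit_price[s] + defect[s] * defect_cost) + late[s]
                for s in subset)
        total += prob * c
    return total

best = min((c for r in (1, 2) for c in combinations(suppliers, r)), key=expected_cost)
print("best supplier set:", best, "expected cost:", round(expected_cost(best), 2))
```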

237 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials

Authors: Sajjad Farashi

Abstract:

Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem that is based on nonlinear modeling of spikes using an exponential autoregressive model. A genetic algorithm is utilized for model parameter estimation, and selected model coefficients are used as features for sorting purposes. For the optimal selection of model coefficients, a self-organizing feature map is used. The results show that modeling spikes with the nonlinear autoregressive model outperforms its linear counterpart. Moreover, the features extracted from the coefficients of the exponential autoregressive model are better than wavelet-based features and yield more compact and well-separated clusters. In the case of spikes that differ in small-scale structures, where principal component analysis fails to produce separated clouds in the feature space, the proposed method can obtain well-separated clusters, which removes the need to apply complex classifiers.

Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.
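
A minimal sketch of the exponential autoregressive (ExpAR) idea: for a fixed nonlinearity scale g, the amplitude-dependent coefficients are linear in the parameters and can be estimated by ordinary least squares, and the fitted coefficient vector can serve as a spike-shape feature. This is only illustrative; the paper estimates the parameters with a genetic algorithm and selects coefficients with a self-organizing map.

```python
# Hedged sketch of an ExpAR(p) model
#   x[t] = sum_i (phi_i + pi_i * exp(-g * x[t-1]^2)) * x[t-i] + e[t]
# For fixed g the model is linear in (phi, pi), so OLS recovers the coefficients,
# which can then be used as a waveform feature vector.
import numpy as np

def expar_features(x, p=3, g=1.0):
    X, y = [], []
    for t in range(p, len(x)):
        lags = x[t - p:t][::-1]                       # x[t-1], ..., x[t-p]
        w = np.exp(-g * x[t - 1] ** 2)
        X.append(np.concatenate([lags, w * lags]))    # [phi terms, pi terms]
        y.append(x[t])
    coef, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
    return coef                                       # length 2*p feature vector

# Toy spike waveform (positive peak followed by a shallow trough)
t = np.linspace(0, 1, 64)
spike = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.4 * np.exp(-((t - 0.45) / 0.1) ** 2)
print(expar_features(spike, p=3).round(3))
```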

236 Organization Model of Semantic Document Repository and Search Techniques for Studying Information Technology

Authors: Nhon Do, Thuong Huynh, An Pham

Abstract:

Nowadays, organizing a repository of documents and resources for learning in a specialized field such as Information Technology (IT), together with search techniques based on domain knowledge or document content, is an urgent need in the practice of teaching, learning and research. There have been several works on methods of organization and search by content; however, the results are still limited and insufficient to meet users' demand for semantic document retrieval. This paper presents a solution for organizing a repository that supports semantic representation and processing in search. The proposed solution is a model that integrates components such as an ontology describing domain knowledge, a database for the document repository, a semantic representation for documents, and a file system, together with semantic processing techniques and advanced search techniques based on measuring semantic similarity. The solution is applied to build an IT learning materials management system for a university, with a semantic search function serving students, teachers, and managers alike. The application has been implemented and tested at the University of Information Technology, Ho Chi Minh City, Vietnam, and has achieved good results.

Keywords: document retrieval system, knowledge representation, document representation, semantic search, ontology.
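
The semantic-search idea can be sketched with a tiny example: represent documents and a query as weight vectors over ontology concepts and rank by a similarity measure such as cosine similarity. The concepts and weights below are invented; the actual system uses a far richer knowledge representation and similarity measure.

```python
# Hedged sketch: ranking documents by cosine similarity between concept-weight
# vectors derived from a domain ontology (toy concepts and weights).
import numpy as np

concepts = ["database", "network", "algorithm", "security"]
docs = {
    "doc1": np.array([0.9, 0.0, 0.3, 0.1]),
    "doc2": np.array([0.1, 0.8, 0.2, 0.6]),
    "doc3": np.array([0.4, 0.1, 0.9, 0.0]),
}
query = np.array([0.7, 0.0, 0.5, 0.0])   # e.g., "indexing algorithms for databases"

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

ranking = sorted(docs, key=lambda d: cosine(docs[d], query), reverse=True)
print(ranking)   # documents ordered by semantic relevance to the query
```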

235 Implication of the Exchange-Correlation on Electromagnetic Wave Propagation in Single-Wall Carbon Nanotubes

Authors: A. Abdikian

Abstract:

Using the linearized quantum hydrodynamic model (QHD) and considering the role of the quantum parameter (Bohm's potential) and the electron exchange-correlation potential in conjunction with Maxwell's equations, electromagnetic wave propagation in single-walled carbon nanotubes was studied and the electronic excitations are described. By solving the mentioned equations with appropriate boundary conditions and assuming low-frequency electromagnetic waves, two general dispersion relations are derived for the transverse magnetic (TM) and transverse electric (TE) modes, respectively. The dispersion relations are analyzed numerically, and it was found that the dependence of the dispersion curves on the exchange-correlation effects (which have been ignored in previous works) is limited at low frequencies. Moreover, it was found that the asymptotic behaviors of the TE and TM modes are similar in single-wall carbon nanotubes (SWCNTs). The results show that including the electron exchange-correlation potential extends the validity range of the QHD model. These results can be important in the study of collective phenomena in nanostructures.

Keywords: Transverse magnetic, transverse electric, quantum hydrodynamic model, electron exchange-correlation potential, single-wall carbon nanotubes.

234 M2LGP: Mining Multiple Level Gradual Patterns

Authors: Yogi Satrya Aryadinata, Anne Laurent, Michel Sala

Abstract:

Gradual patterns have been studied for many years as they contain precious information. They have been integrated into many expert systems and rule-based systems, for instance to reason on knowledge such as “the greater the number of turns, the greater the number of car crashes”. In many cases, this knowledge has been considered as a rule “the greater the number of turns → the greater the number of car crashes”. Historically, works have thus been focused on the representation of such rules, studying how implication could be defined, especially fuzzy implication. These rules were defined by experts who were in charge of describing the systems they were working on in order to make them operate automatically. More recently, approaches have been proposed for mining databases in order to automatically discover such knowledge. Several approaches have been studied, the main scientific topics being: how to determine what a relevant gradual pattern is, and how to discover such patterns as efficiently as possible (in terms of both memory and CPU usage). However, in some cases, end-users are not interested in knowledge at the raw level of granularity and are rather interested in trends. Moreover, it may be the case that no relevant pattern can be discovered at a low level of granularity (e.g., city), whereas some can be discovered at a higher level (e.g., county). In this paper, we thus extend gradual pattern approaches in order to consider multiple-level gradual patterns. For this purpose, we consider two aggregation policies, namely horizontal and vertical.

Keywords: Gradual Pattern.
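
A common way to quantify a gradual pattern such as "the greater A, the greater B" is to count the fraction of object pairs ordered consistently on both attributes; the tiny sketch below illustrates that support measure. It is one standard definition from the gradual-pattern literature, not necessarily the exact one used by M2LGP.

```python
# Hedged sketch: support of the gradual pattern "the greater A, the greater B"
# as the fraction of object pairs that are concordant on both attributes.
from itertools import combinations

def gradual_support(rows, a, b):
    pairs = list(combinations(rows, 2))
    concordant = sum(1 for x, y in pairs if (x[a] - y[a]) * (x[b] - y[b]) > 0)
    return concordant / len(pairs)

data = [  # e.g., road segments: number of turns vs. number of crashes
    {"turns": 2, "crashes": 1}, {"turns": 5, "crashes": 4},
    {"turns": 3, "crashes": 2}, {"turns": 8, "crashes": 7},
]
print(gradual_support(data, "turns", "crashes"))   # 1.0 -> fully supported
```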

233 A Budget and Deadline Constrained Fault Tolerant Load Balanced Scheduling Algorithm for Computational Grids

Authors: P. Keerthika, P. Suresh

Abstract:

A grid is an environment with millions of resources which are dynamic and heterogeneous in nature. A computational grid is one in which the resources are computing nodes and which is meant for applications that involve larger computations. A scheduling algorithm is said to be efficient if and only if it performs effective resource allocation even in the case of resource failure. Resource allocation is a challenging issue since it has to consider several requirements such as system load, processing cost and time, the user's deadline and resource failure. This work designs a resource allocation algorithm which is cost-effective and also targets load balancing, fault tolerance and user satisfaction by considering the above requirements. The proposed Budget Constrained Load Balancing Fault Tolerant algorithm with user satisfaction (BLBFT) reduces the schedule makespan, schedule cost and task failure rate and improves resource utilization. The proposed BLBFT algorithm is evaluated using the GridSim toolkit, and the results are compared with algorithms that concentrate on these factors separately. The comparison results confirm that the proposed algorithm works better than its counterparts.

Keywords: Grid Scheduling, Load Balancing, fault tolerance, makespan, cost, resource utilization.
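
The flavor of the scheduling problem can be conveyed with a small greedy sketch that assigns each task to the least-loaded resource whose cost and completion time respect the budget and deadline. This is not the BLBFT algorithm itself, only an illustration of the constraints it balances; all numbers are invented.

```python
# Hedged sketch: greedy assignment of tasks to grid resources under budget and
# deadline constraints, balancing load by picking the least-loaded feasible
# resource (illustration only, not the BLBFT algorithm).
resources = [  # (name, cost per unit work, speed in work units/s, reliability)
    ("R1", 2.0, 4.0, 0.98), ("R2", 1.0, 2.0, 0.90), ("R3", 3.0, 6.0, 0.99),
]
tasks = [10, 25, 5, 40, 15]          # work units per task
budget, deadline = 200.0, 30.0       # total budget, per-task completion deadline (s)

load = {name: 0.0 for name, *_ in resources}
spent, schedule = 0.0, {}
for i, work in enumerate(tasks):
    feasible = []
    for name, cost_rate, speed, rel in resources:
        cost = work * cost_rate
        finish = (load[name] + work) / speed
        if spent + cost <= budget and finish <= deadline:
            feasible.append((load[name], cost, name))
    if not feasible:
        schedule[i] = None           # would be rescheduled or rejected
        continue
    _, cost, name = min(feasible)    # least-loaded feasible resource (cost breaks ties)
    schedule[i] = name
    load[name] += work
    spent += cost
print(schedule, "total cost:", spent)
```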

232 Adaptive Neuro-Fuzzy Inference System for Financial Trading using Intraday Seasonality Observation Model

Authors: A. Kablan

Abstract:

The prediction of financial time series is a very complicated process. If the efficient market hypothesis holds, then the predictability of most financial time series would be a rather controversial issue, due to the fact that the current price already contains all available information in the market. This paper extends the Adaptive Neuro-Fuzzy Inference System for High Frequency Trading, an expert system capable of combining fuzzy reasoning with the pattern recognition capability of neural networks for financial forecasting and high-frequency trading. However, in order to eliminate unnecessary input in the training phase, a new event-based volatility model is proposed. Taking volatility and the scaling laws of financial time series into consideration has led to the development of the Intraday Seasonality Observation Model. This new model allows the observation of specific events and seasonalities in the data and subsequently removes any unnecessary data. The event-based volatility model provides the ANFIS system with more accurate input and has increased the overall performance of the system.

Keywords: Adaptive Neuro-fuzzy Inference system, High Frequency Trading, Intraday Seasonality Observation Model.
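
The event-based filtering idea can be sketched as follows: compute a rolling volatility of returns and keep only the observations whose activity exceeds a threshold, discarding quiet intraday periods before training. The data, window, and threshold are synthetic and illustrative; this is not the paper's exact Intraday Seasonality Observation Model.

```python
# Hedged sketch of an event/volatility filter: keep only observations whose
# rolling volatility exceeds a threshold (synthetic price series).
import numpy as np

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.001, 2000)))
returns = np.diff(np.log(prices))

window = 50
vol = np.array([returns[max(0, i - window):i].std() for i in range(1, len(returns) + 1)])
threshold = np.median(vol)            # keep the more active half of the data
active = vol > threshold

print(f"kept {active.sum()} of {len(returns)} observations for training")
```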

231 Power System with PSS and FACTS Controller: Modelling, Simulation and Simultaneous Tuning Employing Genetic Algorithm

Authors: Sidhartha Panda, Narayana Prasad Padhy

Abstract:

This paper presents a systematic procedure for the modelling and simulation of a power system installed with a power system stabilizer (PSS) and a flexible AC transmission system (FACTS)-based controller. For the design, the model of an example power system, a single-machine infinite-bus power system installed with the proposed controllers, is developed in MATLAB/SIMULINK. In the developed model, the synchronous generator is represented by Model 1.1, which includes both the generator main field winding and the damper winding on the q-axis, so as to evaluate the impact of the PSS and the FACTS-based controller on power system stability. The model can be used for teaching power system stability phenomena, and also for research, especially for developing generator controllers using advanced technologies. Further, to avoid adverse interactions, the PSS and the FACTS-based controller are simultaneously designed employing a genetic algorithm (GA). Non-linear simulation results are presented for the example power system under various disturbance conditions to validate the effectiveness of the proposed modelling and simultaneous design approach.

Keywords: Genetic algorithm, modelling and simulation, MATLAB/SIMULINK, power system stabilizer, thyristor controlled series compensator, simultaneous design, power system stability.

230 A Reusability Evaluation Model for OO-Based Software Components

Authors: Parvinder S. Sandhu, Hardeep Singh

Abstract:

The requirement to improve software productivity has promoted research on software metric technology. There are metrics for identifying the quality of reusable components, but the function that uses these metrics to determine the reusability of software components is still not clear. If identified in the design phase, or even in the coding phase, these metrics can help to reduce rework by improving the quality of reuse of the component and hence improve productivity due to a probabilistic increase in the reuse level. The CK metric suite is the most widely used set of metrics for object-oriented (OO) software; we critically analyzed the CK metrics, tried to remove the inconsistencies, and devised a framework of metrics to obtain a structural analysis of OO-based software components. A neural network can learn new relationships from new input data and can be used to refine fuzzy rules to create a fuzzy adaptive system. Hence, a neuro-fuzzy inference engine can be used to evaluate the reusability of an OO-based component using its structural attributes as inputs. In this paper, an algorithm is proposed in which tuned WMC, DIT, NOC, CBO and LCOM values of the OO software component are given as inputs to the neuro-fuzzy system, and the output is obtained in terms of reusability. The developed reusability model has produced high-precision results, as expected by the human experts.

Keywords: CK-Metric, ID3, Neuro-fuzzy, Reusability.
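
To illustrate the kind of rule-based evaluation described, here is a tiny sketch that maps CK metric values through triangular fuzzy memberships and combines them into a single reusability score. The membership ranges, weights, and aggregation are invented for illustration; they are not the tuned neuro-fuzzy system of the paper.

```python
# Hedged sketch: fuzzy "good" memberships for CK metrics combined into a
# reusability score (illustrative ranges and weights only).
def tri(x, a, b, c):
    """Triangular membership peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def reusability(wmc, dit, noc, cbo, lcom):
    good = {
        "WMC":  tri(wmc,  0, 10, 40),
        "DIT":  tri(dit,  0, 3, 8),
        "NOC":  tri(noc,  0, 2, 10),
        "CBO":  tri(cbo,  0, 5, 20),
        "LCOM": tri(lcom, 0, 0.2, 1.0),
    }
    weights = {"WMC": 0.25, "DIT": 0.15, "NOC": 0.15, "CBO": 0.25, "LCOM": 0.20}
    return sum(weights[m] * good[m] for m in good)

print(round(reusability(wmc=12, dit=2, noc=3, cbo=6, lcom=0.3), 3))
```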

229 A Decision Support System Based on Leprosy Scales

Authors: Dennys Robson Girardi, Hugo Bulegon, Claudia Maria Moro Barra

Abstract:

Leprosy is an infectious disease caused by Mycobacterium leprae that generally compromises neural fibers, leading to the development of disability. Disabilities are changes that limit the daily activities or social life of a normal individual. In leprosy, the study of disability considers functional limitation (physical disability), limitation of activity, and social participation, which are measured respectively by the EHF, SALSA and Participation scales. The objective of this work is to propose on-line monitoring of leprosy patients based on information from the EHF, SALSA and Participation scales. The proposed system is expected to be applied in monitoring the patient during treatment and after healing therapy of the disease. The correlations that the system establishes between the scales create a variety of information, presenting the state of the patient along with any changes or reductions in disability. The system provides reports with information from each of the scales and the relationships that exist between them. In this way, health professionals with access to patient information can intervene with techniques for the prevention of disability. Through the automated scales, the system shows the level of the patient and allows the patient, or the caregiver, to take preventive measures. With an online system, it is possible to carry out the assessments and monitor patients from anywhere.

Keywords: Leprosy, Medical Informatics, Decision Support System, Disability.
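
As a small illustration of the first of those scales, the EHF (eye-hand-foot) score sums the WHO disability grade (0-2) over both eyes, both hands and both feet, giving a total from 0 to 12; the sketch below encodes that sum together with a purely illustrative alert threshold, which is not part of the authors' system.

```python
# Hedged sketch: EHF score = sum of WHO disability grades (0-2) for each eye,
# hand and foot (range 0-12); the alert threshold below is illustrative only.
def ehf_score(grades):
    """grades: dict with six keys like 'right_eye', each graded 0, 1 or 2."""
    assert len(grades) == 6 and all(g in (0, 1, 2) for g in grades.values())
    return sum(grades.values())

patient = {"right_eye": 0, "left_eye": 0, "right_hand": 1,
           "left_hand": 2, "right_foot": 1, "left_foot": 0}
score = ehf_score(patient)
print(score, "-> flag for professional review" if score >= 3 else "-> keep monitoring")
```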

228 Enhanced Clustering Analysis and Visualization Using Kohonen's Self-Organizing Feature Map Networks

Authors: Kasthurirangan Gopalakrishnan, Siddhartha Khaitan, Anshu Manik

Abstract:

Cluster analysis is the name given to a diverse collection of techniques that can be used to classify objects (e.g. individuals, quadrats, species, etc.). While Kohonen's Self-Organizing Feature Map (SOFM) or Self-Organizing Map (SOM) networks have been successfully applied as a classification tool to various problem domains, including speech recognition, image data compression, image or character recognition, robot control and medical diagnosis, their potential as a robust substitute for clustering analysis remains relatively unresearched. SOM networks combine competitive learning with dimensionality reduction by smoothing the clusters with respect to an a priori grid and provide a powerful tool for data visualization. In this paper, SOM is used to create a toroidal mapping of a two-dimensional lattice to perform cluster analysis on the results of a chemical analysis of wines produced in the same region in Italy but derived from three different cultivars, referred to as the “wine recognition data” located in the University of California-Irvine database. The results are encouraging, and it is believed that SOM would make an appealing and powerful decision-support tool for clustering tasks and for data visualization.

Keywords: Artificial neural networks, cluster analysis, Kohonen maps, wine recognition.
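
For readers who want the flavor of the approach, here is a minimal self-organizing map trained on the same UCI wine recognition data (available through scikit-learn); the map size, learning schedule, and plain rectangular (non-toroidal) grid are simplifications of the setup described above, not a reproduction of it.

```python
# Hedged sketch: a minimal rectangular SOM trained on the UCI wine data
# (the paper uses a toroidal map; this simplified version is illustrative).
import numpy as np
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler

X = StandardScaler().fit_transform(load_wine().data)
rng = np.random.default_rng(0)

rows, cols, dim = 6, 6, X.shape[1]
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 3000, 0.5, 3.0
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), (rows, cols))   # best matching unit
    lr = lr0 * np.exp(-t / n_iter)
    sigma = sigma0 * np.exp(-t / n_iter)
    # Gaussian neighborhood around the BMU on the map grid
    h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    weights += lr * h[..., None] * (x - weights)

# Map each wine sample to its BMU to inspect how the cultivars cluster on the map.
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)), (rows, cols))
        for x in X]
print("first 10 sample positions on the map:", bmus[:10])
```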

227 Value Engineering and Its Effect in Reduction of Industrial Organization Energy Expenses

Authors: Habibollah Najafi, Amir Abbas Yazdani, Hosseinali Nahavandi

Abstract:

A review of the state of energy consumption in Iran shows that, unfortunately, the optimization and conservation of energy in the country's active industries lacks a practical and effective method, and that in most factories energy consumption is higher than in similar industries in industrialized countries. The increasing demand for electrical energy and the overheads it imposes on the organization force companies to search for suitable approaches to optimize energy consumption and demand management. The application of value engineering techniques is among these approaches. Value engineering is considered a powerful tool for improving profitability. These tools are used for reducing expenses, increasing profits, improving quality, increasing market share, performing work in shorter durations, more efficient utilization of resources, and so on. In this article, we review value engineering and its capabilities for creating effective transformations in industrial organizations in order to reduce energy costs. The effects of the performed tasks on the optimization of energy consumption were investigated and described in a case study of the Mazandaran wood and paper industries, the biggest consumer of energy in the north of Iran.

Keywords: Value Engineering (VE), Expense, Energy, Industrial

226 Domain Driven Design vs Soft Domain Driven Design Frameworks

Authors: Mohammed Salahat, Steve Wade

Abstract:

This paper presents and compares the Systematic Soft Domain Driven Design framework (SSDDD) with the Domain Driven Design framework (DDD) as a soft systems approach to information systems development. The framework uses SSM as a guiding methodology, within which we have embedded a sequence of design tasks based on UML, leading to the implementation of a software system using the Naked Objects framework. This framework has been used in action research projects that involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous works; a comparison between SSDDD and DDD is presented in this paper to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.

Keywords: SSM, UML, domain-driven design, soft domain-driven design, naked objects, soft language, information retrieval, multimethodology.

225 A New Design of Mobile Thermoelectric Power Generation System

Authors: Hsin-Hung Chang, Jin-Lung Guan, Ming-Ta Yang

Abstract:

This paper presents a compact thermoelectric power generation system based on the temperature difference across the thermoelectric elements. The system converts combustion heat directly into electric energy. The proposed system consists of a thermoelectric generator and a power control box. The generator contains 4 thermoelectric modules (TEMs), each of which uses 2 thermoelectric chips (TEs) and 2 cold sinks, together with 1 thermal absorber and 1 thermal conduction flat board. The power control box contains 1 energy storage device, 1 converter, and 1 inverter. The total net generating power is about 11 W. The system uses commercial portable gas stoves, or burns timber or coal, as the heat source, which is easily obtained, and adopts solid-state thermoelectric chips as the heat-to-electricity conversion elements. The system has the advantages of being lightweight, quiet, and mobile, requiring no maintenance, and having an easily supplied heat source, and it can be used as long as burning is allowed. The system works well in highly mobile outdoor situations by providing power for illumination, entertainment equipment or wireless equipment at a refuge. During heavy storms such as typhoons, when solar panels become ineffective and wind-powered machines malfunction, the thermoelectric power generator can continue providing vital power.

Keywords: Thermoelectric chip, Seebeck effect, thermoelectric power generator.
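
As a back-of-the-envelope illustration of the Seebeck-effect generation described above, the open-circuit voltage is V = S·ΔT and, for a matched load, the delivered power is V²/(4R); the module parameters below are assumed for illustration and are not the paper's measured values.

```python
# Hedged sketch: Seebeck-effect estimate of thermoelectric module output
# (assumed module parameters, chosen only to land near the ~11 W scale reported).
S = 0.04       # effective Seebeck coefficient of one module, V/K
R = 1.5        # module internal resistance, ohm
dT = 100.0     # hot-to-cold side temperature difference, K

V_oc = S * dT                      # open-circuit voltage
P_matched = V_oc ** 2 / (4 * R)    # power delivered to a matched load
print(f"V_oc = {V_oc:.1f} V, P per module ~ {P_matched:.2f} W, "
      f"4 modules ~ {4 * P_matched:.1f} W")
```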

224 Analysis of Event-related Response in Human Visual Cortex with fMRI

Authors: Ayesha Zaman, Tanvir Atahary, Shahida Rafiq

Abstract:

Functional Magnetic Resonance Imaging (fMRI) is a noninvasive imaging technique that measures the hemodynamic response related to neural activity in the human brain. Event-related functional magnetic resonance imaging (efMRI) is a form of fMRI in which a series of fMRI images are time-locked to a stimulus presentation and averaged together over many trials. An event-related potential (ERP), in turn, is a measured brain response that is directly the result of a thought or perception. Here, the neuronal response of the human visual cortex in normal healthy subjects has been studied. The subjects were asked to perform a visual three-choice reaction task, and from the response of each subject the corresponding neuronal activity in the visual cortex was imaged. The average number of neurons in each hemisphere of the adult human primary visual cortex has been estimated at around 140 million. Statistical analysis of this experiment was done with SPM5 (Statistical Parametric Mapping, version 5) software. The results show a robust design for imaging the neuronal activity of the human visual cortex.

Keywords: Echo Planar Imaging, Event-related Response, General Linear Model, Visual Neuronal Response.
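
The core of an event-related analysis of this kind is a general linear model fit per voxel; a stripped-down sketch is shown below (simulated time series, a simple gamma-shaped HRF regressor, ordinary least squares). It is illustrative only and is not SPM5 itself, which uses the canonical double-gamma HRF plus full inference machinery.

```python
# Hedged sketch of a single-voxel GLM fit for an event-related design:
# build a stimulus regressor, convolve with a simple HRF, estimate beta by OLS.
import numpy as np

TR, n_scans = 2.0, 120
onsets = np.array([10, 40, 70, 100, 130, 160, 190])     # stimulus onsets, seconds
stim = np.zeros(n_scans)
stim[(onsets / TR).astype(int)] = 1.0

# Simple gamma-like HRF sampled at the TR (SPM's canonical HRF is a double gamma)
ht = np.arange(0, 30, TR)
hrf = (ht ** 5) * np.exp(-ht)
hrf /= hrf.max()
regressor = np.convolve(stim, hrf)[:n_scans]

X = np.column_stack([regressor, np.ones(n_scans)])      # design matrix
rng = np.random.default_rng(0)
y = 3.0 * regressor + 100 + rng.normal(0, 1, n_scans)   # simulated voxel signal

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated effect size (beta):", round(beta[0], 2))   # ~3.0
```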

223 A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine

Authors: Xiaodong Li, Peng Gao, Chao-Jung Huang, Shiying Hao, Xuefeng B. Ling, Yongxia Han, Yaqi Zhang, Le Zheng, Chengyin Ye, Modi Liu, Minjie Xia, Changlin Fu, Bo Jin, Karl G. Sylvester, Eric Widen

Abstract:

Predicting the risk of pancreatic adenocarcinoma (PA) in advance can benefit the quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new incident PA as early as 3 months before the onset of PA in a statewide, general population in Maine. The PA prediction model was developed using deep neural networks, a deep learning algorithm, with a 2-year electronic-health-record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true-alert rate of 67.62%. The risk assessment tool has attained an improved discriminative ability. It can be immediately deployed in the health system to provide automatic early warnings to adults at risk of PA, and it has the potential to identify personalized risk factors to facilitate customized PA interventions.

Keywords: Cancer prediction, deep learning, electronic health records, pancreatic adenocarcinoma.
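
A highly simplified sketch of the modelling step: a small feed-forward network trained on synthetic tabular features with a rare positive class, standing in for the EHR-derived features and the production deep-learning pipeline, which are not reproduced here.

```python
# Hedged sketch: a small feed-forward classifier on synthetic imbalanced data,
# as a stand-in for the paper's deep neural network trained on EHR features.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=5000, n_features=60, n_informative=15,
                           weights=[0.97, 0.03], random_state=0)   # rare outcome
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(X_tr, y_tr)
print("AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
```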

222 Adaptive Block State Update Method for Separating Background

Authors: Youngsuck Ji, Youngjoon Han, Hernsoo Hahn

Abstract:

In this paper, we propose a robust moving-object detection method that handles lighting effects in night street images, based on block-wise updating of a reference background model using block-state analysis. The experimental images are acquired as a color video sequence from a stationary camera. When artificial illumination such as street lights or sign lights suddenly appears, the reference background model is updated with this information. Natural illumination generally changes gradually over time, whereas artificial illumination appears suddenly. Therefore, to detect artificial illumination exactly, a two-stage process is used. The first stage compares the difference between the current image and the reference background on a block basis, which identifies the changed blocks. The second stage compares the edge map of the current image with the edge map of the reference background image, which makes it possible to estimate the illumination in any block. This information makes it possible to detect objects and artificial illumination exactly and to generate a cleaner reference background. Each block is classified by block-state analysis into one of four states: transient, stationary, background, or artificial illumination. Fig. 1 shows the characteristics of each block state [1]. Experimental results show that the presented approach works well in the presence of illumination variance.

Keywords: Block-state, Edge component, Reference background, Artificial illumination.
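
A minimal sketch of the first stage described above: split the frame into blocks and flag the blocks whose mean intensity differs from the reference background by more than a threshold. The frames are synthetic and the threshold is illustrative; the edge-map stage and the four-state classification are omitted.

```python
# Hedged sketch of the block-wise difference test: flag blocks of the current
# frame that deviate from the reference background (synthetic grayscale images).
import numpy as np

rng = np.random.default_rng(0)
H, W, B = 240, 320, 16                          # frame size and block size
background = rng.integers(0, 50, (H, W)).astype(float)
current = background + rng.normal(0, 2, (H, W))
current[64:128, 160:240] += 80                  # simulate a bright object or new light

diff = np.abs(current - background)
blocks = diff.reshape(H // B, B, W // B, B).mean(axis=(1, 3))   # mean diff per block
changed = blocks > 20.0                         # illustrative threshold
print("changed blocks:", int(changed.sum()), "of", changed.size)
```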

221 SIFT Accordion: A Space-Time Descriptor Applied to Human Action Recognition

Authors: Olfa Ben Ahmed, Mahmoud Mejdoub, Chokri Ben Amar

Abstract:

Recognizing human actions from videos is an active field of research in computer vision and pattern recognition. Human activity recognition has many potential applications such as video surveillance, human-machine interaction, sports video retrieval and robot navigation. Currently, local descriptors and bag-of-visual-words models achieve state-of-the-art performance for human action recognition. The main challenge in feature description is how to represent the local motion information efficiently. Most previous works focus on extending 2D local descriptors to 3D ones in order to describe the local information around every interest point. In this paper, we propose a new spatio-temporal descriptor based on a space-time description of moving points. Our description is focused on an Accordion representation of the video, which is well suited to recognizing human actions from 2D local descriptors without the need for 3D extensions. We use the bag-of-words approach to represent videos. We quantize 2D local descriptors describing both temporal and spatial features with a good compromise between computational complexity and action recognition rates. We have reached impressive results on a publicly available action dataset.

Keywords: Accordion, Bag of Features, Human action, Motion, Moving point, Space-Time Descriptor, SIFT, Video.
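
The bag-of-words step used here can be sketched generically: cluster local descriptors into a visual vocabulary with k-means and represent each video by the normalized histogram of its descriptors' cluster assignments. Random vectors stand in for the SIFT descriptors extracted from the Accordion representation; the vocabulary size and data are illustrative.

```python
# Hedged sketch of the bag-of-words encoding: k-means vocabulary over local
# descriptors, then a normalized histogram per video (random 128-d vectors
# stand in for SIFT descriptors).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
train_descriptors = rng.normal(size=(5000, 128))      # pooled SIFT-like descriptors
vocab = KMeans(n_clusters=64, n_init=4, random_state=0).fit(train_descriptors)

def bow_histogram(descriptors):
    words = vocab.predict(descriptors)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()

video_descriptors = rng.normal(size=(300, 128))       # descriptors from one video
print(bow_histogram(video_descriptors)[:8])           # first bins of the signature
```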

220 Analysis of Incidences of Collapsed Buildings in the City of Douala, Cameroon from 2011-2020

Authors: T. G. L. J. Bikoko, J. C. Tchamba, S. Amziane

Abstract:

This study focuses on the problem of collapsed buildings within the city of Douala over the past ten years, more precisely within the period from 2011 to 2020. It was carried out in a bid to ascertain the real causes of this phenomenon, which has become recurrent in the leading economic city of Cameroon. To achieve this, it was first necessary to review some works dealing with construction materials and technology as well as some case histories of structural collapse within the city. Thereafter, a statistical study was carried out on the results obtained. It was found that the causes of building collapses in the city of Douala are: neglect of administrative procedures, use of poor-quality materials, poor composition and preparation of concrete, lack of geotechnical studies, lack of structural analysis and design, corrosion of the reinforcement bars, poor maintenance of buildings, and other causes. Out of the 46 cases of failure and collapse of buildings within the city of Douala, 7 were identified as having had no geotechnical study carried out, giving a percentage of 15.22%. It was also observed that, out of the 46 cases of structural failure, 6 resulted from a lack of proper structural analysis and design, giving a percentage of 13.04%. Subsequently, recommendations and suggestions are made, placing particular emphasis on the choice of materials, the manufacture and casting of concrete, and the placement of the required reinforcement, all of which help guarantee the stability of a building.

Keywords: Collapsed buildings, Douala, structural collapse, Cameroon.
