Search results for: Generalized matrix approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6169

3829 Experimental and Theoretical Study on Hygrothermal Aging Effect on Mechanical Behavior of Fiber Reinforced Plastic Laminates

Authors: S. Larbi, R. Bensaada, S. Djebali, A. Bilek

Abstract:

The manufacture of composite parts is a major issue in many industrial domains. Polymer composite materials are ideal for structural applications where high strength-to-weight and stiffness-to-weight ratios are required. However, exposure to extreme environmental conditions (temperature, humidity) affects the mechanical properties of organic composite materials and leads to undesirable degradation. Aging mechanisms in organic matrices are very diverse and vary according to the polymer and the aging conditions, such as temperature and humidity. This paper studies the hygrothermal aging effect on the mechanical properties of fiber reinforced plastic laminates at 40 °C under different environmental exposures. Two composite materials are used in the study, carbon fiber/epoxy and glass fiber/vinyl ester, each with two stratifications, [904/04] and [454/04]. The experimental procedure includes a mechanical characterization of the materials in the virgin state and exposure of specimens to two environments (seawater and demineralized water). Absorption kinetics are determined for the two materials and both stratifications. Three-point bending tests are performed on the aged materials in order to determine the hygrothermal effect on their mechanical properties.
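
For readers who want to reproduce the absorption-kinetics step numerically, the sketch below fits a one-dimensional Fickian uptake curve to gravimetric immersion data; the plate thickness, immersion times and weight gains are invented placeholders rather than the paper's measurements.

```python
# Hedged sketch: fit Fickian moisture uptake M(t) to invented gravimetric data
# to estimate the diffusion coefficient D and the saturation level M_inf.
import numpy as np
from scipy.optimize import curve_fit

def fickian_uptake(t, m_inf, d, h=2e-3, n_terms=50):
    """Moisture gain [%] of a plate of thickness h [m] after immersion time t [s]."""
    n = np.arange(n_terms)
    series = np.array([np.sum(np.exp(-(2 * n + 1)**2 * np.pi**2 * d * ti / h**2)
                              / (2 * n + 1)**2) for ti in np.atleast_1d(t)])
    return m_inf * (1.0 - 8.0 / np.pi**2 * series)

# Invented example: immersion times [days -> s] and weight gains [%] for one laminate.
t_data = np.array([0.5, 1, 2, 4, 8, 16, 32, 64]) * 86400.0
m_data = np.array([0.12, 0.18, 0.26, 0.35, 0.48, 0.61, 0.70, 0.72])

popt, _ = curve_fit(fickian_uptake, t_data, m_data, p0=[0.8, 1e-12], bounds=(0, np.inf))
print(f"M_inf = {popt[0]:.2f} %,  D = {popt[1]:.2e} m^2/s")
```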

Keywords: FRP laminates, hygrothermal aging, mechanical properties, theory of laminates.

Downloads: 1232
3828 Multi-Disciplinary Optimisation Methodology for Aircraft Load Prediction

Authors: Sudhir Kumar Tiwari

Abstract:

The paper demonstrates a methodology that can be used at an early design stage of any conventional aircraft. This research activity assesses the feasibility of deriving a methodology for aircraft load estimation during the various design phases of a transport category aircraft by exploiting commercial finite element analysis software, which may yield significant time savings. The early design phase has limited data, and quickly changing configurations result in a large number of load cases to handle. It is useful to idealize the aircraft as a connection of beams, which can be modelled very accurately using finite element analysis (beam elements). This research explores the correct approach to idealizing an aircraft using beam elements. FEM techniques such as inertia relief were studied for implementation during the course of the work. The correct boundary condition technique is envisaged for the generation of shear force, bending moment and torque diagrams for the aircraft. Possible applications of this approach within the aircraft design process have been investigated.
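
As a minimal illustration of the beam (stick-model) idealization, the sketch below builds shear-force and bending-moment distributions from discrete spanwise panel loads; the stations and loads are invented, and the paper's FEM, inertia-relief and boundary-condition treatment are not reproduced here.

```python
# Toy stick model: a wing treated as a cantilever loaded by discrete panel loads.
import numpy as np

y = np.array([2.0, 5.0, 9.0, 14.0, 20.0])               # spanwise stations [m]
panel_load = np.array([40e3, 35e3, 30e3, 22e3, 12e3])    # net upward load per station [N]

# Shear at a station = sum of loads outboard of it; bending moment = sum of
# outboard loads times their moment arms about that station.
shear = np.array([panel_load[y >= yi].sum() for yi in y])
moment = np.array([(panel_load[y >= yi] * (y[y >= yi] - yi)).sum() for yi in y])

for yi, v, m in zip(y, shear, moment):
    print(f"y = {yi:5.1f} m   SF = {v/1e3:7.1f} kN   BM = {m/1e6:6.3f} MN·m")
```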

Keywords: Multi-disciplinary optimization, aircraft load, finite element analysis, Stick Model.

Downloads: 1130
3827 A Black-box Approach for Response Quality Evaluation of Conversational Agent Systems

Authors: Ong Sing Goh, C. Ardil, Wilson Wong, Chun Che Fung

Abstract:

The evaluation of conversational agent or chatterbot question answering systems is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, to achieve high quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and a call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of different nature, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems, AnswerBus, START and AINI.
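
The snippet below is a generic illustration of the observation/classification/scoring idea, not the authors' actual scheme: the response categories, weights and counts are invented, and the score is simply a weighted fraction of classified answers.

```python
# Rank QA systems by a weighted score over a response-quality classification.
criteria_weights = {"correct": 1.0, "partially_correct": 0.5,
                    "related": 0.25, "wrong_or_no_answer": 0.0}

# Hypothetical counts of answers per category for each system.
observations = {
    "AnswerBus": {"correct": 42, "partially_correct": 18, "related": 15, "wrong_or_no_answer": 25},
    "START":     {"correct": 55, "partially_correct": 10, "related": 5,  "wrong_or_no_answer": 30},
    "AINI":      {"correct": 48, "partially_correct": 22, "related": 12, "wrong_or_no_answer": 18},
}

def quality_score(counts):
    total = sum(counts.values())
    return sum(criteria_weights[c] * n for c, n in counts.items()) / total

ranking = sorted(observations, key=lambda s: quality_score(observations[s]), reverse=True)
for system in ranking:
    print(f"{system:10s} score = {quality_score(observations[system]):.3f}")
```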

Keywords: Evaluation, conversational agents, Response Quality, chatterbots

Downloads: 1927
3826 Design of Static Synchronous Series Compensator Based Damping Controller Employing Real Coded Genetic Algorithm

Authors: S.C.Swain, A.K.Balirsingh, S. Mahapatra, S. Panda

Abstract:

This paper presents a systematic approach for designing Static Synchronous Series Compensator (SSSC) based supplementary damping controllers for damping low frequency oscillations in a single-machine infinite-bus power system. The design problem of the proposed controller is formulated as an optimization problem, and a real coded genetic algorithm (RCGA) is employed to search for the optimal controller parameters. By minimizing a time-domain objective function involving the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. Simulation results are presented and compared with a conventional method of tuning the damping controller parameters to show the effectiveness and robustness of the proposed design approach.
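
A conceptual sketch of the optimization loop is given below: a real-coded genetic algorithm with arithmetic crossover and Gaussian mutation minimizes an ITAE-style objective on a toy second-order oscillation model. The plant, controller parameterization and parameter ranges are invented stand-ins, not the authors' SMIB model with an SSSC.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, t_end = 0.01, 10.0
t = np.arange(0.0, t_end, dt)

def objective(params):
    """ITAE-style cost: integral of time * |rotor speed deviation|."""
    k, tw = params                        # controller gain and washout constant
    m, ps, d0 = 6.0, 1.2, 0.5             # toy inertia, synchronizing and damping terms
    delta, dw = 0.1, 0.0                  # initial rotor-angle disturbance
    itae = 0.0
    for ti in t:
        damping = d0 + k / (1.0 + tw)     # crude stand-in for the damping loop
        ddw = (-ps * delta - damping * dw) / m
        dw += ddw * dt
        delta += dw * dt
        itae += ti * abs(dw) * dt
    return itae

bounds = np.array([[1.0, 100.0], [0.1, 10.0]])            # search ranges for [K, Tw]
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(30, 2))

for gen in range(40):
    fitness = np.array([objective(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]               # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        p1, p2 = parents[rng.integers(10, size=2)]
        a = rng.uniform()                                  # arithmetic crossover
        child = a * p1 + (1.0 - a) * p2
        child += rng.normal(0.0, 0.05, size=2) * (bounds[:, 1] - bounds[:, 0])
        children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.vstack([parents, children])

best = pop[np.argmin([objective(ind) for ind in pop])]
print("best [K, Tw]:", best, " ITAE:", objective(best))
```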

Keywords: Low frequency Oscillations, Phase Compensation Technique, Real Coded Genetic Algorithm, Single-machine Infinite-Bus Power System, Static Synchronous Series Compensator.

Downloads: 2502
3825 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents

Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei

Abstract:

With the recent advance of the deep neural network, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by deep neural networks for processing business documents. However, creating a real-world document processing system needs to integrate several NLP and CV tasks, rather than treating them separately. There is a need to have a unified approach for processing documents containing textual and graphical elements with rich formats, diverse layout arrangement, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by various tasks and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build a system for diverse business scenarios, such as contract monitoring and reviewing.
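
A speculative sketch of the kind of unified representation model such a framework might define is shown below; the class and field names are invented for illustration and are not DocPro's actual schema.

```python
# Hypothetical representation model: each page element carries its layout
# geometry plus the semantics that different NLP/CV tasks attach to it.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Element:
    kind: str                                # "paragraph", "table", "figure", ...
    bbox: Tuple[float, float, float, float]  # layout info: x0, y0, x1, y1
    text: str = ""
    semantics: Dict[str, object] = field(default_factory=dict)  # task outputs

@dataclass
class Document:
    pages: List[List[Element]] = field(default_factory=list)

doc = Document(pages=[[
    Element("title", (72, 40, 520, 70), "Service Agreement"),
    Element("clause", (72, 90, 520, 160), "The term of this Agreement is ...",
            semantics={"ner": ["DATE"], "clause_type": "term"}),
]])
print(len(doc.pages[0]), "elements on page 1")
```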

Keywords: Document processing, framework, formal definition, machine learning.

Downloads: 639
3824 The Future of Hospitals: A Systematic Review in the Field of Architectural Design with a Disruptive Research and Development Approach

Authors: María Araya Léon, Ainoa Abella, Aura Murillo, Ricardo Guasch, Laura Clèries

Abstract:

This article aims to examine scientific theory framed within the term hospitals of the future from a multidisciplinary and cross-sectional perspective, to understand the connection that the various cross-sectional areas we studied have with architectural spaces, and to determine the future outlook of the works examined and how they can be classified into the categories of need/solution, evolution/revolution, collective/individual, and preventive/corrective. The changes currently taking place within the context of healthcare demonstrate how important these projects are and the need for companies to face future changes. A systematic review has been carried out, focused on what the hospitals of the future will be like in relation to the elements that form part of their use, design, and architectural space experience, using the WOS database from 2016 to 2019. The large number of works about sensing & big data and the scarce amount related to the area of materials is worth highlighting. Furthermore, no growth concerning future issues is envisaged over time. Regarding the classifications, the articles we reviewed address evolutionary and collective solutions more, while preventive and corrective solutions were found at a similar level. Although our research focused on the future of hospitals, there is little evidence representing this approach. We also detected that, given the relevance of research on how the built environment influences human health and well-being, these studies should be promoted within the context of healthcare. This article makes it possible to find evidence on the future perspective from within the domain of hospital architecture, in order to create bridges between the productive sector of architecture and scientific theory. This will make it possible to detect R&D opportunities in each analyzed cross-section.

Keywords: Hospitals, trends, architectural space, disruptive approach.

Downloads: 299
3823 Effective Communication with the Czech Customers 50+ in the Financial Market

Authors: K. Matušínská, H. Starzyczná, M. Stoklasa

Abstract:

The paper deals with identifying and describing effective marketing communication forms relating to the segment 50+ in the financial market in the Czech Republic. The segment 50+ can be seen as a great marketing potential for the future, but unfortunately Czech financial institutions have not yet reacted sufficiently to this fact and have not prepared appropriate marketing programs for this customer segment. Demographic aging is a fundamental characteristic of current European population evolution, but the prospect of further population aging is more noticeable in the Czech Republic. This paper is based on data from one part of a primary marketing research study. The paper determines the basic problem areas, defines marketing communication in the financial market, and states the primary research problem, the hypothesis and the primary research methodology. Finally, a suitable marketing communication approach to the selected sub-segment aged 50-60 years is proposed according to the marketing research findings.

Keywords: Population aging in the Czech Republic, segment 50+, financial services, marketing communication, marketing research, marketing communication approach.

Downloads: 1222
3822 VLSI Design of 2-D Discrete Wavelet Transform for Area-Efficient and High-Speed Image Computing

Authors: Mountassar Maamoun, Mehdi Neggazi, Abdelhamid Meraghni, Daoud Berkani

Abstract:

This paper presents a VLSI design approach for high-speed, real-time 2-D Discrete Wavelet Transform computing. The proposed architecture, based on a new and fast convolution approach, reduces the hardware complexity in addition to reducing the critical path to the multiplier delay. Furthermore, an advanced two-dimensional (2-D) discrete wavelet transform (DWT) implementation, with an efficient memory area, is designed to produce one output in every clock cycle. As a result, a very high speed is attained. The system is verified, using JPEG2000 coefficient filters, on a Xilinx Virtex-II Field Programmable Gate Array (FPGA) device without accessing any external memory. The resulting computing rate is up to 270 M samples/s, and the (9,7) 2-D wavelet filter uses only 18 kb of memory (16 kb of first-in-first-out memory) for a 256×256 image size. In this way, the developed design requires reduced memory and provides very high-speed processing as well as high PSNR quality.
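
As a software reference for the transform itself (not the VLSI architecture), the sketch below computes a one-level 2-D DWT of a 256×256 image with the CDF 9/7 filter pair used in JPEG2000, assuming PyWavelets exposes that pair under the name 'bior4.4'.

```python
import numpy as np
import pywt

image = np.random.rand(256, 256).astype(np.float32)   # stand-in for real image data

# dwt2 applies the separable row/column filtering and downsampling.
LL, (LH, HL, HH) = pywt.dwt2(image, 'bior4.4')
print("subband shapes:", LL.shape, LH.shape, HL.shape, HH.shape)

# Perfect-reconstruction check with the inverse transform.
reconstructed = pywt.idwt2((LL, (LH, HL, HH)), 'bior4.4')
print("max reconstruction error:", np.abs(reconstructed - image).max())
```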

Keywords: Discrete Wavelet Transform (DWT), Fast Convolution, FPGA, VLSI.

Downloads: 1966
3821 Semi-automatic Construction of Ontology-based CBR System for Knowledge Integration

Authors: Junjie Gao, Guishi Deng

Abstract:

In order to integrate knowledge in heterogeneous case-based reasoning (CBR) systems, ontology-based CBR systems have become a hot topic. To solve the problems facing ontology-based CBR systems, for example nonstandard architecture, deficient reuse of knowledge in legacy CBR systems and difficult ontology construction, we propose a novel approach for semi-automatically constructing an ontology-based CBR system whose architecture is based on a two-layer ontology. Domain knowledge implied in legacy case bases can be mapped automatically from relational database schemas and knowledge items to a relevant OWL local ontology by a mapping algorithm with low time complexity. By concept clustering based on formal concept analysis, and by computing concept equivalence and concept inclusion measures, suggestions about enriching or amending the concept hierarchy of the OWL local ontologies are made automatically, which can help designers achieve semi-automatic construction of the OWL domain ontology. Validation of the approach is done through an application example.

Keywords: OWL ontology, Case-based Reasoning, Formal Concept Analysis, Knowledge Integration

Downloads: 2012
3820 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot

Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan

Abstract:

With the aging of the world population and continuous growth in technology, service robots are increasingly explored nowadays as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with the human companion, receiving commands, navigating through the environment, either known or unknown, and recognizing objects. This paper proposes an approach for object recognition based on the use of depth information and color images for a service robot. We present a study of two of the most used methods for object detection, where 3D data are used to detect the position of the objects to be classified that lie on horizontal surfaces. Since most of the objects of interest accessible to service robots lie on such surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach for object recognition is based on color histograms, while the second is based on the use of the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
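
The sketch below illustrates the two recognition cues in software terms, using OpenCV; it assumes opencv-contrib-python 4.4 or newer, uses SIFT rather than SURF (which is patented and often unavailable), and the file names are hypothetical.

```python
import cv2
import numpy as np

def color_histogram(img_bgr, bins=32):
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [bins, bins], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def histogram_score(model_img, candidate_img):
    # Correlation metric: 1.0 means identical colour distributions.
    return cv2.compareHist(color_histogram(model_img),
                           color_histogram(candidate_img),
                           cv2.HISTCMP_CORREL)

def feature_matches(model_img, candidate_img, ratio=0.75):
    sift = cv2.SIFT_create()
    _, d1 = sift.detectAndCompute(model_img, None)
    _, d2 = sift.detectAndCompute(candidate_img, None)
    if d1 is None or d2 is None:
        return 0
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
    # Lowe's ratio test keeps only distinctive matches.
    return sum(1 for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance)

model = cv2.imread("model_object.png")            # hypothetical file names
candidate = cv2.imread("segmented_candidate.png")
print("histogram correlation:", histogram_score(model, candidate))
print("good SIFT matches:", feature_matches(model, candidate))
```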

Keywords: Service Robot, Object Recognition, 3D Sensors, Plane Segmentation.

Downloads: 1674
3819 Enhancing Rural Agricultural Value Chains through Electric Mobility Services in Ethiopia

Authors: Clemens Pizzinini, Philipp Rosner, David Ziegler, Markus Lienkamp

Abstract:

Transportation is a constituent part of most supply and value chains in modern economies. Smallholder farmers in rural Ethiopia face severe challenges along their supply and value chains. In particular, suitable, affordable, and available transport services are in high demand. To develop context-specific technical solutions, a problem-to-solution methodology based on interaction with technology is developed. With this approach, we fill the gap between proven transportation assessment frameworks and general user-centered techniques. Central to our approach is an electric test vehicle that is implemented in rural supply and value chains for research, development, and testing. Based on our objective and the derived methodological requirements, a set of existing methods is selected. Local partners are integrated in an organizational framework that executes major parts of this research endeavour in Arsi Zone, Oromia Region, Ethiopia.

Keywords: Agricultural value chain, participatory methods, agile methods, sub-Saharan Africa, Ethiopia, electric vehicle, transport service.

Downloads: 160
3818 Robot Movement Using the Trust Region Policy Optimization

Authors: Romisaa Ali

Abstract:

The policy gradient approach is a subset of Deep Reinforcement Learning (DRL), which combines Deep Neural Networks (DNNs) with Reinforcement Learning (RL). This approach finds the optimal policy of robot movement based on the experience it gains from interaction with its environment. Unlike previous policy gradient algorithms, which were unable to handle the two types of error, variance and bias, introduced by the DNN model due to over- or underestimation, this algorithm is capable of handling both types of error. This article discusses the state-of-the-art (SOTA) policy gradient technique, Trust Region Policy Optimization (TRPO), by applying the method in various environments and comparing it with another policy gradient method, Proximal Policy Optimization (PPO), to explain their robust optimization. The SOTA method is used to gather experience data during various training phases after observing the impact of hyper-parameters on neural network performance.
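
A minimal comparison harness in the spirit of the experiments, assuming the third-party packages gymnasium, stable-baselines3 and sb3-contrib are installed; the environment, step budget and hyper-parameters are arbitrary choices, not the author's setup.

```python
import gymnasium as gym
from stable_baselines3 import PPO
from sb3_contrib import TRPO
from stable_baselines3.common.evaluation import evaluate_policy

def train_and_evaluate(algo_cls, env_id="Pendulum-v1", steps=50_000):
    env = gym.make(env_id)
    model = algo_cls("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=steps)
    mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10)
    env.close()
    return mean_reward, std_reward

for algo in (TRPO, PPO):
    mean_r, std_r = train_and_evaluate(algo)
    print(f"{algo.__name__}: {mean_r:.1f} +/- {std_r:.1f}")
```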

Keywords: Deep neural networks, deep reinforcement learning, Proximal Policy Optimization, state-of-the-art, trust region policy optimization.

Downloads: 184
3817 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings to create a meaningful numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life data sets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
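
Steps (i) and (ii) can be prototyped as below with gensim's Word2Vec (4.x API assumed): reads are split into k-mers, k-mer embeddings are learned, and each read is represented by the mean of its k-mer vectors. The reads and hyper-parameters are toy values, and steps (iii)-(iv) are omitted.

```python
import numpy as np
from gensim.models import Word2Vec

def kmerize(read, k=6):
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = [
    "ATGCGTACGTTAGCCGTATCGATCGGATCCGTA",
    "TTGACGGATCCGTATCGATCGTACGTTAGCCGT",
    "CGATCGGATCCGTAATGCGTACGTTAGCCGTAT",
]
corpus = [kmerize(r) for r in reads]

model = Word2Vec(sentences=corpus, vector_size=32, window=5,
                 min_count=1, sg=1, epochs=50, seed=1)

def read_embedding(read, k=6):
    vecs = [model.wv[kmer] for kmer in kmerize(read, k) if kmer in model.wv]
    return np.mean(vecs, axis=0)

X = np.stack([read_embedding(r) for r in reads])
print("read-embedding matrix:", X.shape)   # would feed the downstream MIL classifier
```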

Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.

Downloads: 910
3816 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition

Authors: L. Hamsaveni, Navya Prakash, Suresha

Abstract:

Document Image Analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis has been adopted in this paper. The technique involves document imaging methods such as Image Fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images and obtain an original document with complete information. In case the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. A special image storage format known as YCbCr is used as a tool to convert the grayscale image to the RGB image format. The presented algorithm is tested on various types of degraded documents such as printed documents, handwritten documents, old script documents and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents from the same source.
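
A toy illustration of the fusion step alone is sketched below (invented file names, scans assumed already deskewed and registered); it keeps the darkest value per pixel so that text lost in one degraded scan can be recovered from another. The SURF-based region extraction is not shown.

```python
import cv2
import numpy as np

paths = ["scan_degraded_1.png", "scan_degraded_2.png", "scan_degraded_3.png"]
images = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in paths]

# Crop all scans to a common size (assumes they are already registered).
h = min(img.shape[0] for img in images)
w = min(img.shape[1] for img in images)
stack = np.stack([img[:h, :w] for img in images])

fused = stack.min(axis=0)                 # darkest pixel wins, since ink is dark
cv2.imwrite("reconstructed_document.png", fused)
```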

Keywords: Grayscale image format, image fusing, SURF detection, YCbCr image format.

Downloads: 1155
3815 Retrospective Synthetic Focusing with Correlation Weighting for Very High Frame Rate Ultrasound

Authors: Chang-Lin Hu, Yao-You Cheng, Meng-Lin Li

Abstract:

The need for high frame-rate imaging has been triggered by new applications of ultrasound imaging in transient elastography and real-time 3D ultrasound. Using plane wave excitation (PWE) is one of the methods to achieve very high frame-rate imaging, since an image can be formed with a single insonification. However, due to the lack of transmit focusing, the image quality with PWE is lower compared with that obtained using conventional focused transmission. To solve this problem, we propose a filter-retrieved transmit focusing (FRF) technique combined with cross-correlation weighting (FRF+CC weighting) for high frame-rate imaging with PWE. A retrospective focusing filter is designed to simultaneously minimize the predefined sidelobe energy associated with single PWE and the filter energy related to the signal-to-noise ratio (SNR). This filter attempts to maintain the mainlobe signals and to reduce the sidelobe ones, which gives similar mainlobe signals and different sidelobes between the original PWE and the FRF baseband data. The normalized cross-correlation coefficient at zero lag is calculated to quantify the degree of similarity at each imaging point and used as a weighting matrix on the FRF baseband data to further suppress sidelobes, thus improving the filter-retrieved focusing quality.
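
The correlation-weighting idea can be prototyped as follows on synthetic data: a zero-lag normalized cross-correlation map between the PWE and FRF images is computed over small local windows and used to weight the FRF data. This is a conceptual sketch, not the authors' beamforming chain.

```python
import numpy as np

def zero_lag_ncc(a, b, win=5):
    """Local normalized cross-correlation at zero lag, computed per pixel."""
    pad = win // 2
    a_p = np.pad(a, pad, mode="reflect")
    b_p = np.pad(b, pad, mode="reflect")
    ncc = np.zeros_like(a, dtype=float)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            wa = a_p[i:i + win, j:j + win].ravel()
            wb = b_p[i:i + win, j:j + win].ravel()
            wa = wa - wa.mean()
            wb = wb - wb.mean()
            denom = np.linalg.norm(wa) * np.linalg.norm(wb)
            ncc[i, j] = (wa @ wb) / denom if denom > 0 else 0.0
    return ncc

rng = np.random.default_rng(3)
pwe_img = rng.normal(size=(64, 64))             # stand-ins for baseband data
frf_img = pwe_img + 0.2 * rng.normal(size=(64, 64))

weight = np.clip(zero_lag_ncc(pwe_img, frf_img), 0.0, 1.0)
weighted_frf = weight * frf_img                  # correlation-weighted output
print("mean correlation weight:", weight.mean())
```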

Keywords: retrospective synthetic focusing, high frame rate, correlation weighting.

Downloads: 1852
3814 Analytical Study of Applying the Account Aggregation Approach in E-Banking Services

Authors: A. Al Drees, A. Alahmari, R. Almuwayshir

Abstract:

Advanced information technology is becoming an important factor in the development of the financial services industry, especially the banking industry. It has introduced new ways of delivering banking to the customer, such as Internet banking. Banks began to look at electronic banking (e-banking) as a means to replace some of their traditional branch functions, using the Internet as a new distribution channel. Some consumers have more than one account, often across banks, and access these accounts using e-banking services. To look at their current net worth position, customers have to log in to each of their accounts, get the details, and work on consolidation. This not only takes ample time but is also a repetitive activity at a specified frequency. To address this point, the account aggregation concept is added as a solution. E-banking account aggregation, as one of the e-banking types, appeared in order to build a stronger relationship with customers. An account aggregation service generally refers to a service that allows customers to manage their bank accounts maintained in different institutions through a common Internet banking platform, with a high concern for security and privacy. This paper presents an overview of the e-banking account aggregation approach as a new service in the e-banking field.

Keywords: E-banking, security, account aggregation, enterprise application development.

Downloads: 1549
3813 Localization of Anatomical Landmarks in Head CT Images for Image to Patient Registration

Authors: M. Ovinis, D. Kerr, K. Bouazza-Marouf, M. Vloeberghs

Abstract:

The use of anatomical landmarks as a basis for image to patient registration is appealing because the registration may be performed retrospectively. We have previously proposed the use of two anatomical soft tissue landmarks of the head, the canthus (corner of the eye) and the tragus (a small, pointed, cartilaginous flap of the ear), as a registration basis for an automated CT image to patient registration system, and described their localization in patient space using close range photogrammetry. In this paper, the automatic localization of these landmarks in CT images, based on their curvature saliency and using a rule based system that incorporates prior knowledge of their characteristics, is described. Existing approaches to landmark localization in CT images are predominantly semi-automatic and primarily for localizing internal landmarks. To validate our approach, the positions of the landmarks localized automatically and manually in near isotropic CT images of 102 patients were compared. The average difference was 1.2mm (std = 0.9mm, max = 4.5mm) for the medial canthus and 0.8mm (std = 0.6mm, max = 2.6mm) for the tragus. The medial canthus and tragus can be automatically localized in CT images, with performance comparable to manual localization, based on the approach presented.

Keywords: Anatomical Landmarks, CT, Localization.

Downloads: 3325
3812 Application of Novel Conserving Immersed Boundary Method to Moving Boundary Problem

Authors: S. N. Hosseini, S. M. H. Karimian

Abstract:

A new conserving approach in the context of the Immersed Boundary Method (IBM) is presented to simulate one-dimensional, incompressible flow in a moving boundary problem. The method employs a control volume scheme to simulate the flow field. The concept of a ghost node is used at the boundaries to conserve the mass and momentum equations. The present method implements the conservation laws in all cells, including boundary control volumes. Application of the method is studied in a test case with a moving boundary. Comparison between the results of this new method and a sharp-interface (Image Point Method) IBM algorithm shows a well-distinguished improvement in both the pressure and velocity fields of the present method. Fluctuations in the pressure field are fully resolved in this proposed method. This approach expands the IBM capability to simulate the flow field for a variety of problems by implementing conservation laws on a fully Cartesian grid, compared to other conserving methods.
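
A minimal illustration of the ghost-node idea in a control-volume setting is given below for a 1-D diffusion problem with a stationary immersed wall held at a fixed value; it is a generic sketch, not the authors' conserving moving-boundary formulation.

```python
import numpy as np

nx, L = 100, 1.0
dx = L / nx
x = (np.arange(nx) + 0.5) * dx            # cell centres of the Cartesian grid
alpha = 1e-3                              # diffusivity
dt = 0.2 * dx**2 / alpha                  # stable explicit time step

u = np.zeros(nx)                          # conserved field variable
u_wall, x_wall = 1.0, 0.373               # immersed wall value and position
iw = int(x_wall / dx)                     # index of the cell containing the wall

for step in range(2000):
    # Ghost value: chosen so that linear interpolation between the last fluid
    # cell centre and the ghost centre gives exactly u_wall at x_wall.
    theta = (x_wall - x[iw - 1]) / dx
    u_ghost = (u_wall - (1.0 - theta) * u[iw - 1]) / theta

    flux = np.zeros(nx + 1)               # diffusive flux at every cell face
    flux[1:-1] = -alpha * (u[1:] - u[:-1]) / dx
    flux[iw] = -alpha * (u_ghost - u[iw - 1]) / dx   # face next to the wall
    flux[0] = flux[-1] = 0.0              # zero-flux outer boundaries

    u -= dt * (flux[1:] - flux[:-1]) / dx # conservative control-volume update
    u[iw:] = 0.0                          # cells on the solid side stay inactive

print("value just left of the wall:", u[iw - 1])
```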

Keywords: Immersed Boundary Method, conservation of mass and momentum laws, moving boundary, boundary condition.

Downloads: 1990
3811 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow

Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat

Abstract:

Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, human observation or reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
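
The modelling step can be reproduced schematically as below with scikit-learn, using synthetic features in place of the eye-gaze, EEG, pose and interaction data: a random forest evaluated with leave-one-out cross-validation and compared against two simpler learners.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(120, 9))                 # nine features, as in the study
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=120) > 0).astype(int)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "naive Bayes": GaussianNB(),
}

loo = LeaveOneOut()
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=loo)   # one held-out sample per fold
    print(f"{name:14s} LOOCV accuracy = {scores.mean():.3f}")
```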

Keywords: Affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, Signal Detection Theory, student engagement.

Downloads: 1262
3810 The Modification of the Mixed Flow Pump with Respect to Stability of the Head Curve

Authors: Roman Klas, František Pochylý, Pavel Rudolf

Abstract:

This paper is focused on the CFD simulation of a radiaxial pump (i.e. mixed flow pump) with the aim of detecting the reasons for the instability of the Y-Q characteristic. The main causes of pressure pulsations were detected by means of an analysis of the velocity and pressure fields within the pump, combined with a theoretical approach. Consequently, modifications of the spiral case and the pump suction area were made based on the knowledge of the flow conditions and the shape of the dissipation function. The primary design of the pump geometry was created as the base model serving for the comparison of the influences of the individual modifications. Basic experimental data are available for this geometry. This approach replaced the more complicated calculation of compressible liquid flow, which is also more difficult with respect to the convergence of all computational tasks. The modification of the primary pump consisted of inserting three types of fins. Subsequently, the evaluation of pressure pulsations, specific energy curves and the visualization of velocity fields were chosen as the criteria for a successful design.

Keywords: CFD, radiaxial pump, spiral case, stability

Downloads: 1573
3809 A Characterized and Optimized Approach for End-to-End Delay Constrained QoS Routing

Authors: P.S.Prakash, S.Selvan

Abstract:

QoS routing aims to find paths between senders and receivers that satisfy the QoS requirements of the application while using network resources efficiently; the underlying routing algorithm must be able to find low-cost paths that satisfy the given QoS constraints. The problem of finding least-cost routing is known to be NP-hard or NP-complete, and some algorithms have been proposed to find a near-optimal solution. However, these heuristics or algorithms either impose relationships among the link metrics to reduce the complexity of the problem, which may limit the general applicability of the heuristic, or are too costly in terms of execution time to be applicable to large networks. In this paper, we analyze two algorithms, namely Characterized Delay Constrained Routing (CDCR) and Optimized Delay Constrained Routing (ODCR). The CDCR algorithm provides an approach for delay-constrained routing that captures the trade-off between cost minimization and the risk level regarding the delay constraint. ODCR uses an adaptive path weight function together with an additional constraint imposed on the path cost to restrict the search space, and hence finds a near-optimal solution in much less time.
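
For orientation, the sketch below shows a generic label-setting search for the delay-constrained least-cost path problem; it is neither CDCR nor ODCR, and the topology, costs and delays are invented.

```python
import heapq

def delay_constrained_path(graph, src, dst, delay_bound):
    """graph: {u: [(v, cost, delay), ...]}. Returns (cost, delay, path) or None."""
    # Labels are expanded in increasing cost order, so the first label that
    # reaches dst is the least-cost feasible path.
    labels = [(0.0, 0.0, src, [src])]
    settled = {}                            # node -> list of non-dominated (cost, delay)
    while labels:
        cost, delay, node, path = heapq.heappop(labels)
        if node == dst:
            return cost, delay, path
        if any(c <= cost and d <= delay for c, d in settled.get(node, [])):
            continue                        # dominated by an earlier label
        settled.setdefault(node, []).append((cost, delay))
        for nxt, edge_cost, edge_delay in graph.get(node, []):
            new_delay = delay + edge_delay
            if new_delay <= delay_bound and nxt not in path:
                heapq.heappush(labels, (cost + edge_cost, new_delay, nxt, path + [nxt]))
    return None

# Small invented topology: (neighbor, cost, delay) per link.
net = {
    "s": [("a", 1, 5), ("b", 3, 1)],
    "a": [("t", 1, 5)],
    "b": [("t", 2, 2)],
}
# -> feasible path via b; the cheaper s-a-t route violates the delay bound.
print(delay_constrained_path(net, "s", "t", delay_bound=6))
```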

Keywords: QoS, Delay, Routing, Optimization

Downloads: 1274
3808 Adaptive Network Intrusion Detection Learning: Attribute Selection and Classification

Authors: Dewan Md. Farid, Jerome Darmont, Nouria Harbi, Nguyen Huu Hoa, Mohammad Zahidur Rahman

Abstract:

In this paper, a new learning approach for network intrusion detection using a naïve Bayesian classifier and the ID3 algorithm is presented, which identifies effective attributes from the training dataset, calculates the conditional probabilities for the best attribute values, and then correctly classifies the examples of the training and testing datasets. Most current intrusion detection datasets are dynamic, complex and contain a large number of attributes. Some of the attributes may be redundant or contribute little to detection. It has been successfully shown that significant attribute selection is important for designing a real-world intrusion detection system (IDS). The purpose of this study is to identify effective attributes from the training dataset to build a classifier for network intrusion detection using data mining algorithms. The experimental results on the KDD99 benchmark intrusion detection dataset demonstrate that this new approach achieves high classification rates and reduces false positives using limited computational resources.
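
A small worked example of the attribute-selection idea follows: information gain is computed over categorical attributes and the selected attributes feed a naive Bayes classifier. The toy records merely stand in for KDD99-style data.

```python
import numpy as np
import pandas as pd
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(df, attribute, target="label"):
    total = entropy(df[target])
    weighted = sum(len(sub) / len(df) * entropy(sub[target])
                   for _, sub in df.groupby(attribute))
    return total - weighted

data = pd.DataFrame({
    "protocol": ["tcp", "udp", "tcp", "icmp", "tcp", "udp", "icmp", "tcp"],
    "service":  ["http", "dns", "ftp", "echo", "http", "dns", "echo", "ftp"],
    "flag":     ["SF", "SF", "REJ", "SF", "REJ", "SF", "REJ", "SF"],
    "label":    ["normal", "normal", "attack", "normal",
                 "attack", "normal", "attack", "normal"],
})

gains = {a: information_gain(data, a) for a in ["protocol", "service", "flag"]}
selected = [a for a, g in sorted(gains.items(), key=lambda kv: -kv[1]) if g > 0]
print("information gain:", gains, "-> selected:", selected)

enc = OrdinalEncoder()
X = enc.fit_transform(data[selected])
clf = CategoricalNB().fit(X, data["label"])
print("training accuracy:", clf.score(X, data["label"]))
```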

Keywords: Attributes selection, Conditional probabilities, information gain, network intrusion detection.

Downloads: 2698
3807 The Advantages of Integration for Social Systems – Evidence from the Automobile Industry

Authors: Waldemiro Francisco Sorte Junior

Abstract:

The Japanese integrative approach to social systems can be observed in supply chain management as well as in the relationship between the public and private sectors. Both the Lean Production System and the Developmental State Model are characterized by efforts towards the achievement of mutual goals, resulting in initiatives for capacity building which emphasize the system level. In Brazil, although organizations undertake efforts to build capabilities at the individual and organizational levels, the system level is being neglected. Fieldwork data confirmed the findings of other studies in terms of the lack of integration in supply chain management in the Brazilian automobile industry. Moreover, due to the absence of an active role of the Brazilian state in its relationship with the private sector, automakers are not fully exploiting the opportunities in the domestic and regional markets. To promote a higher level of economic growth and to increase the degree of spill-over of technologies and techniques, a more integrative approach is needed.

Keywords: Integration, Lean Production System, Developmental State Model, Brazilian automobile industry.

Downloads: 1528
3806 Nonlinear Dynamic Analysis of Base-Isolated Structures Using a Partitioned Solution Approach and an Exponential Model

Authors: Nicolò Vaiana, Filip C. Filippou, Giorgio Serino

Abstract:

The solution of the nonlinear dynamic equilibrium equations of base-isolated structures adopting a conventional monolithic solution approach, i.e. an implicit single-step time integration method employed with an iteration procedure, and the use of existing nonlinear analytical models, such as differential equation models, to simulate the dynamic behavior of seismic isolators can require a significant computational effort. In order to reduce numerical computations, a partitioned solution method and a one-dimensional nonlinear analytical model are presented in this paper. A partitioned solution approach can be easily applied to base-isolated structures in which the base isolation system is much more flexible than the superstructure. Thus, in this work, the explicit, conditionally stable central difference method is used to evaluate the nonlinear response of the base isolation system, and the implicit, unconditionally stable Newmark constant average acceleration method is adopted to predict the linear response of the superstructure, with the benefit of avoiding iterations in each time step of a nonlinear dynamic analysis. The proposed mathematical model is able to simulate the dynamic behavior of seismic isolators without requiring the solution of a nonlinear differential equation, as in the case of the widely used differential equation models. The proposed mixed explicit-implicit time integration method and nonlinear exponential model are adopted to analyze a three-dimensional seismically isolated structure with a lead rubber bearing system subjected to earthquake excitation. The numerical results show the good accuracy and significant computational efficiency of the proposed solution approach and analytical model compared to the conventional solution method and mathematical model adopted in this work. Furthermore, the low stiffness of the base isolation system with lead rubber bearings allows a critical time step considerably larger than the imposed ground acceleration time step, thus avoiding stability problems in the proposed mixed method.
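
A highly simplified numerical sketch of the explicit part is given below: central differences integrate a single nonlinear isolator degree of freedom with an exponential-type restoring force under an invented ground motion. The coupled superstructure and the Newmark constant-average-acceleration half of the partitioned scheme are omitted, and all parameters are placeholders.

```python
import numpy as np

dt, t_end = 0.001, 10.0
t = np.arange(0.0, t_end, dt)
ag = 0.1 * 9.81 * np.sin(2 * np.pi * 1.0 * t)        # stand-in ground acceleration [m/s^2]

m, c = 200e3, 4.5e4                                   # mass [kg], viscous damping [N·s/m]
k_el, k_pl, u_y = 10.0e6, 1.0e6, 0.01                 # initial/post-yield stiffness, yield disp.

def restoring_force(u):
    # Smooth exponential transition between initial and post-yield stiffness.
    return k_pl * u + (k_el - k_pl) * u_y * np.sign(u) * (1.0 - np.exp(-abs(u) / u_y))

u = np.zeros_like(t)
for n in range(1, len(t) - 1):
    f_int = restoring_force(u[n]) + c * (u[n] - u[n - 1]) / dt
    # Central differences: m*(u[n+1] - 2*u[n] + u[n-1])/dt^2 = -m*ag[n] - f_int
    u[n + 1] = 2.0 * u[n] - u[n - 1] + dt**2 * (-m * ag[n] - f_int) / m

print("peak isolator displacement [m]:", np.abs(u).max())
```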

Keywords: Base-isolated structures, earthquake engineering, mixed time integration, nonlinear exponential model.

Downloads: 1583
3805 Spatial-Temporal Awareness Approach for Extensive Re-Identification

Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush

Abstract:

Recent developments in AI and edge computing play a critical role in capturing meaningful events, such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs. Immediately following the detection of a meaningful event, the objects related to the event must be tracked and traced. In an extensive environment, the challenge becomes severe when the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues related to the complexity behind a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional thinking and concept of operations, which is labor intensive and time consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.

Keywords: Long-short-term memory, re-identification, security critical application, spatial-temporal awareness.

Downloads: 532
3804 Response Quality Evaluation in Heterogeneous Question Answering System: A Black-box Approach

Authors: Goh Ong Sing, C. Ardil, Wilson Wong, Shahrin Sahib

Abstract:

The evaluation of question answering systems is a major research area that needs much attention. Before the rise of domain-oriented question answering systems based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when question answering systems began to be more domain specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and at the same time achieve higher quality responses. The research in this paper discusses the inappropriateness of the existing measures for response quality evaluation and, in a later part, brings forward a call for new standard measures and the related considerations. As a short-term solution for evaluating the response quality of heterogeneous systems, and to demonstrate the challenges in evaluating systems of different nature, this research presents a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems (i.e. AnswerBus, START and NaLURI).

Keywords: Evaluation, question answering, response quality.

Downloads: 1859
3803 Water Resources Vulnerability Assessment to Climate Change in a Semi-Arid Basin of South India

Authors: K. Shimola, M. Krishnaveni

Abstract:

This paper examines the vulnerability assessment of water resources in a semi-arid basin using a 4-step approach. The vulnerability assessment framework is developed to study water resources vulnerability and includes the creation of GIS-based vulnerability maps. These maps represent the spatial variability of the vulnerability index. This paper introduces the 4-step approach to assess vulnerability, which incorporates a new set of indicators. The approach is demonstrated using a framework composed of precipitation data for the 1975–2010 period, temperature data for the 1965–2010 period, hydrological model outputs and the water resources GIS database. The vulnerability assessment is a function of three components: exposure, sensitivity and adaptive capacity. The current water resources vulnerability is assessed using GIS-based spatio-temporal information. The rainfall coefficient of variation, monsoon onset and end dates, rainy days, seasonality indices and temperature are selected for the criterion 'exposure'. Water yield, groundwater recharge and evapotranspiration (ET) are selected for the criterion 'sensitivity'. The type of irrigation and storage structures are selected for the criterion 'adaptive capacity'. These indicators were mapped and integrated in a GIS environment using overlay analysis. The five sub-basins, namely Arjunanadhi, Kousiganadhi, Sindapalli-Uppodai and Vallampatti Odai, fall under a medium vulnerability profile, which indicates that the basin is under moderate water resources stress. The paper also explores the prioritization of sub-basin-wise adaptation strategies to climate change based on the vulnerability indices.
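
The overlay step can be illustrated with the raster sketch below: synthetic indicator layers are min-max normalized, combined into exposure, sensitivity and adaptive-capacity components with invented weights, and merged into a classed vulnerability index. It is not the paper's actual indicator set or weighting.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (50, 50)                               # grid cells over the basin

def normalize(layer):
    return (layer - layer.min()) / (layer.max() - layer.min())

# Synthetic stand-ins for indicator rasters.
rain_cv     = normalize(rng.random(shape))     # exposure indicators
rainy_days  = normalize(rng.random(shape))
water_yield = normalize(rng.random(shape))     # sensitivity indicators
gw_recharge = normalize(rng.random(shape))
storage     = normalize(rng.random(shape))     # adaptive-capacity indicator

exposure    = 0.5 * rain_cv + 0.5 * (1 - rainy_days)
sensitivity = 0.5 * (1 - water_yield) + 0.5 * (1 - gw_recharge)
adaptive    = storage

# Higher exposure/sensitivity raise vulnerability; adaptive capacity lowers it.
vulnerability = normalize(exposure + sensitivity - adaptive)

classes = np.digitize(vulnerability, [0.2, 0.4, 0.6, 0.8])  # 5 classes: low..high
print("share of cells in the medium class:", np.mean(classes == 2))
```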

Keywords: Adaptive capacity, exposure, overlay analysis, sensitivity, vulnerability.

Downloads: 1124
3802 Surface Thermodynamics Approach to Mycobacterium tuberculosis (M-TB) – Human Sputum Interactions

Authors: J. L. Chukwuneke, C. H. Achebe, S. N. Omenyi

Abstract:

This research work presents a surface thermodynamics approach to M-TB/HIV-human sputum interactions. This involved the use of the Hamaker coefficient concept as a surface energetics tool in determining the interaction processes, with the surface interfacial energies explained using the van der Waals concept of particle interactions. The Lifshitz derivation for van der Waals forces was applied as an alternative to the contact angle approach, which has been widely used in other biological systems. The methodology involved taking sputum samples from twenty infected persons and from twenty uninfected persons for absorbance measurement using a digital ultraviolet-visible spectrophotometer. The variables required for the computations with the Lifshitz formula were derived from the absorbance data. MATLAB software tools were used in the mathematical analysis of the data produced from the experiments (absorbance values). The Hamaker constants and the combined Hamaker coefficients were obtained using the values of the dielectric constant together with the Lifshitz equation. The absolute combined Hamaker coefficients A132abs and A131abs for both infected and uninfected sputum samples gave the values A132abs = 0.21631 × 10⁻²¹ J for M-TB infected sputum and Ã132abs = 0.18825 × 10⁻²¹ J for M-TB/HIV infected sputum. The significance of this result is the positive value of the absolute combined Hamaker coefficient, which suggests the existence of net positive van der Waals forces, demonstrating an attraction between the bacteria and the macrophage. This, however, implies that infection can occur. It was also shown that in the presence of HIV, the interaction energy is reduced by 13%, confirming the adverse effects observed in HIV patients suffering from tuberculosis.
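
For a numerical feel, the snippet below evaluates the combining relation commonly used for the combined Hamaker coefficient, A132 ≈ (√A11 − √A33)(√A22 − √A33), with invented placeholder coefficients rather than the paper's measured values.

```python
import math

A11 = 0.98e-21   # bacterium (J)      -- hypothetical value
A22 = 0.85e-21   # macrophage (J)     -- hypothetical value
A33 = 0.75e-21   # sputum medium (J)  -- hypothetical value

A132 = (math.sqrt(A11) - math.sqrt(A33)) * (math.sqrt(A22) - math.sqrt(A33))
A131 = (math.sqrt(A11) - math.sqrt(A33)) ** 2   # like particles across the same medium

print(f"A132 = {A132:.3e} J  ->  {'attraction' if A132 > 0 else 'repulsion'} expected")
print(f"A131 = {A131:.3e} J")
```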

Keywords: Absorbance, dielectric constant, Hamaker coefficient, Lifshitz formula, macrophage, Mycobacterium tuberculosis, Van der Waals forces.

Downloads: 1775
3801 Thermal Stability and Crystallization Behaviour of Modified ABS/PP Nanocomposites

Authors: Marianna I. Triantou, Petroula A. Tarantili

Abstract:

In this research work, poly(acrylonitrile-butadiene-styrene)/polypropylene (ABS/PP) blends were processed by melt compounding in a twin-screw extruder. Upgrading of the thermal characteristics of the obtained materials was attempted by the incorporation of organically modified montmorillonite (OMMT), as well as by the addition of two types of compatibilizers: polypropylene grafted with maleic anhydride (PP-g-MAH) and ABS grafted with maleic anhydride (ABS-g-MAH). The effect of the above treatments was investigated separately and in combination. Increasing the PP content in the ABS matrix seems to increase the thermal stability of the blend and the glass transition temperature (Tg) of the SAN phase of ABS. On the other hand, the addition of ABS to PP promotes the formation of its β-phase, which is at a maximum at 30 wt% ABS concentration, and increases the crystallization temperature (Tc) of PP as well as its crystallization rate. The β-phase of PP in ABS/PP blends is reduced by the addition of compatibilizers or/and organoclay reinforcement. The incorporation of compatibilizers increases the thermal stability of PP and reduces its melting (ΔΗm) and crystallization (ΔΗc) enthalpies. Furthermore, it slightly decreases the Tgs of the PP and SAN phases of the ABS/PP blends. Regarding the storage modulus of the ABS/PP blends, a change in behavior appears at about 10 °C, with a return to the initial behavior at ~110 °C. The incorporation of OMMT into both non-compatibilized and compatibilized ABS/PP blends enhances their storage modulus.

Keywords: Acrylonitrile-butadiene-styrene terpolymer, compatibilizer, organoclay, polypropylene.

Downloads: 2908
3800 Game-Tree Simplification by Pattern Matching and Its Acceleration Approach using an FPGA

Authors: Suguru Ochiai, Toru Yabuki, Yoshiki Yamaguchi, Yuetsu Kodama

Abstract:

In this paper, we propose a Connect6 solver which adopts a hybrid approach based on a tree-search algorithm and image processing techniques. The solver must deal with complicated computation and provide high performance in order to make real-time decisions. The proposed approach enables the solver to be implemented on a single Spartan-6 XC6SLX45 FPGA produced by XILINX without using any external devices. The compact implementation is achieved through image processing techniques that optimize the tree-search algorithm of the Connect6 game. The tree search is widely used in computer games, and the optimal search brings the best move in every turn of a computer game. Thus, many tree-search algorithms such as the Minimax algorithm and artificial intelligence approaches have been widely proposed in this field. However, there is one fundamental problem in this area: the computation time increases rapidly in response to the growth of the game tree. This means the larger the game tree is, the bigger the circuit size is, because of its highly parallel computation characteristics. Here, this paper aims to reduce the size of a Connect6 game tree using image processing techniques and its positional symmetry property. The proposed solver is composed of four computational modules: a two-dimensional checkmate strategy checker, a template matching module, a skilful-line predictor, and a next-move selector. These modules work well together in selecting next moves from candidates, and the total amount of their circuitry is small. The details of the hardware design for an FPGA implementation are described, and the performance of this design is also shown in this paper.
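
A minimal software analogue of the board-pattern idea (not the FPGA design) is sketched below: a small stone template is slid over a Connect6 board, and its rotated and mirrored variants are checked to exploit the board's symmetry.

```python
import numpy as np

EMPTY, BLACK, WHITE = 0, 1, 2
board = np.zeros((19, 19), dtype=np.int8)
board[9, 7:11] = BLACK                      # an invented four-in-a-row threat

pattern = np.array([[BLACK, BLACK, BLACK, BLACK]], dtype=np.int8)

def match_positions(board, pattern):
    ph, pw = pattern.shape
    hits = []
    for r in range(board.shape[0] - ph + 1):
        for c in range(board.shape[1] - pw + 1):
            if np.array_equal(board[r:r + ph, c:c + pw], pattern):
                hits.append((r, c))
    return hits

def symmetric_variants(pattern):
    # Rotations and mirror images; duplicates removed via their byte signature.
    variants, seen = [], set()
    for k in range(4):
        rot = np.rot90(pattern, k)
        for cand in (rot, np.fliplr(rot)):
            key = (cand.shape, cand.tobytes())
            if key not in seen:
                seen.add(key)
                variants.append(cand)
    return variants

for variant in symmetric_variants(pattern):
    hits = match_positions(board, variant)
    if hits:
        print(f"pattern {variant.shape} matched at {hits}")
```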

Keywords: Connect6, pattern matching, game-tree reduction, hardware direct computation

Downloads: 1974