Search results for: Requirement Analysis Model.
8488 Analysis of Heart Beat Dynamics through Singularity Spectrum
Authors: Harish Kumar, Hussein Yahia, Oriol Pont, Michel Haissaguerre, Nicolas Derval, Meleze Hocini
Abstract:
The detection of arrhythmias and life-threatening conditions is highly important today, and it can be accomplished by advanced non-linear processing methods that accurately analyze the complex signals of heartbeat dynamics. In this perspective, recent developments in the field of multiscale information content have led to the Microcanonical Multiscale Formalism (MMF). We show that this framework provides several signal analysis techniques that are especially adapted to the study of heartbeat dynamics. In this paper, we present first-hand results on whether the considered heartbeat dynamics signals have multiscale properties by computing local predictability exponents (LPEs) and the Unpredictable Points Manifold (UPM), and thereby computing the singularity spectrum.
Keywords: Microcanonical Multiscale Formalism (MMF), Unpredictable Points Manifold (UPM), Heartbeat Dynamics.
8487 Automatic Detection of Defects in Ornamental Limestone Using Wavelets
Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas
Abstract:
A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that will allow the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m, weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is wavelet decomposition executed on two instances of the original image, to detect both hypotheses – dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of parameters that is possible in the framework of the wavelets corresponds to different levels of accuracy in the drawing of the contours and in the selection of the defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the dimensions of the defects allowed.
Keywords: Automatic detection, wavelets, defects, fracture lines.
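A minimal sketch of the wavelet idea described in this abstract, assuming PyWavelets is available: the approximation band is zeroed so that only local detail survives, and thresholding the detail reconstruction separates the two hypotheses (dark and clear defects). The wavelet family, decomposition level, threshold and the synthetic "plate" are illustrative assumptions, not the authors' tuned values.

```python
# Sketch: wavelet detail energy used to flag dark and clear defects in a stone image.
import numpy as np
import pywt

def detail_image(image, wavelet="haar", level=2):
    """Reconstruction of the detail bands only (approximation band zeroed)."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])
    return pywt.waverec2(coeffs, wavelet)[: image.shape[0], : image.shape[1]]

def defect_maps(image, k=3.0):
    d = detail_image(image)
    s = d.std()
    return d < -k * s, d > k * s        # dark-defect map, clear-defect map

rng = np.random.default_rng(0)
plate = rng.normal(0.5, 0.02, (128, 128))   # plain stone texture (stand-in)
plate[40:44, 60:90] -= 0.4                  # dark fracture-like line
plate[90:100, 20:30] += 0.4                 # clear (bright) spot
dark, clear = defect_maps(plate)
print("dark-defect pixels:", int(dark.sum()), "| clear-defect pixels:", int(clear.sum()))
```

The two boolean maps play the role of the "map of defects" that would be stored per plate and overlaid on the display.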
8486 Sewer Culvert Installation Method to Accommodate Underground Construction in an Urban Area with Narrow Streets (The Development of Shield Switching Type Micro-Tunneling Method and the Introduction of Construction Examples)
Authors: Osamu Igawa, Hiroshi Kouchiwa, Yuji Ito
Abstract:
In recent years, a reconstruction project for sewer pipelines has been progressing in Japan with the aim of renewing old sewer culverts. However, it is difficult to secure a sufficient base area for shafts in an urban area because many streets are narrow with a complex layout. As a result, construction in such urban areas is generally very demanding. In urban areas, there is a strong requirement for a safe, reliable and economical construction method that does not disturb the public’s daily life and urban activities. With this in mind, we developed a new construction method called the “shield switching type micro-tunneling method,” which integrates the micro-tunneling method and the shield method. In this method, the pipeline is constructed first for sections that are gently curved or straight, using the economical micro-tunneling method, and then the method is switched to the shield method for sections with a sharp curve or a series of curves, without establishing an intermediate shaft. This paper presents the details, features and construction examples of this newly developed method.
Keywords: Micro-tunneling method, Secondary lining applied RC segment, Sharp curve, Shield method, Switching type.
8485 Performance Evaluation of AOMDV-PAMAC Protocols for Ad Hoc Networks
Authors: B. Malarkodi, S. K. Riyaz Hussain, B. Venkataramani
Abstract:
Power consumption of nodes in ad hoc networks is a critical issue as they predominantly operate on batteries. In order to improve the lifetime of an ad hoc network, all the nodes must be utilized evenly and the power required for connections must be minimized. In this project, a link layer algorithm known as the Power Aware Medium Access Control (PAMAC) protocol is proposed, which enables the network layer to select the route with the minimum total power requirement among the possible routes between a source and a destination, provided all nodes in the route have battery capacity above a threshold. When the battery capacity goes below the predefined threshold, routes going through these nodes are avoided and these nodes act only as source and destination. Further, the first few nodes whose battery power has drained to the set threshold value are pushed to the exterior part of the network and the nodes in the exterior are brought to the interior, so that less total power is required to forward packets for each connection. The network layer protocol AOMDV is basically an extension of the AODV routing protocol. AOMDV is designed to form multiple routes to the destination and also avoids loop formation, thereby reducing unnecessary congestion on the channel. In this project, the performance of AOMDV is evaluated using PAMAC as the MAC layer protocol; the average power consumption, throughput and average end-to-end delay of the network are calculated, and the results are compared with those of the other network layer protocol, AODV.
Keywords: AODV, PAMAC, AOMDV, Power consumption.
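A minimal sketch of the route-selection rule described above: among candidate routes whose intermediate nodes all hold battery capacity above a threshold, choose the route with the minimum total power requirement. The node names, battery levels and per-hop power figures are hypothetical.

```python
# PAMAC-style route selection sketch (hypothetical topology and numbers).
battery = {"A": 0.9, "B": 0.2, "C": 0.7, "D": 0.8, "E": 0.6}   # residual capacity (fraction)
routes = {
    ("S", "A", "C", "T"): [1.2, 0.8, 1.0],   # per-hop transmission power (mW)
    ("S", "B", "T"):      [0.9, 0.7],
    ("S", "D", "E", "T"): [1.0, 0.6, 0.9],
}
THRESHOLD = 0.3   # nodes below this level act only as source or destination

def select_route(routes, battery, threshold):
    feasible = {
        r: sum(p) for r, p in routes.items()
        if all(battery.get(n, 1.0) >= threshold for n in r[1:-1])
    }
    return min(feasible, key=feasible.get) if feasible else None

print(select_route(routes, battery, THRESHOLD))
# -> ('S', 'D', 'E', 'T'): the route via B is excluded (battery below threshold),
#    and this route needs less total power than the one via A and C.
```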
8484 Future of Electric Power Generation Technologies: Environmental and Economic Comparison
Authors: Abdulrahman A. Bahaddad, Mohammed Beshir
Abstract:
The objective of this paper is to describe eight different types of power generation technologies and to understand the history and future trends of each technology. In addition, a comparative analysis of these technologies is presented with respect to their costs and associated performance.
Keywords: Conventional power generation, economic analysis, environmental impact, renewable energy power generation.
8483 Penetration Analysis for Composites Applicable to Military Vehicle Armors, Aircraft Engines and Nuclear Power Plant Structures
Authors: Dong Wook Lee
Abstract:
This paper describes a method for analyzing penetration of composite materials using explicit nonlinear Finite Element Analysis (FEA). This method may be used in the early design stage for the protection of military vehicles, aircraft engines and nuclear power plant structures made of composite materials. The paper deals with simple ballistic penetration tests for composite materials and with the FEA modeling method and results. The FEA was performed to interpret the ballistic field test phenomenon regarding damage propagation in the structure subjected to local foreign object impact.
Keywords: Computer Aided Engineering, CAE, Finite Element Analysis, FEA, impact analysis, penetration analysis, composite material.
8482 Coupled Multifield Analysis of Piezoelectrically Actuated Microfluidic Device for Transdermal Drug Delivery Applications
Authors: Muhammad Waseem Ashraf, Shahzadi Tayyaba, Nitin Afzulpurkar, Asim Nisar, Adisorn Tuantranont, Erik L J Bohez
Abstract:
In this paper, the design, fabrication and coupled multifield analysis of a hollow out-of-plane silicon microneedle array with a piezoelectrically actuated microfluidic device for transdermal drug delivery (TDD) applications are presented. The fabrication of the silicon microneedle array is first done by a series of combined isotropic and anisotropic etching processes using inductively coupled plasma (ICP) etching technology. Then, a coupled multifield analysis of the MEMS-based piezoelectrically actuated device with an integrated 2×2 silicon microneedle array is presented. To predict the stress distribution and model the fluid flow in the coupled field analysis, finite element (FE) and computational fluid dynamics (CFD) analyses were performed using ANSYS rather than analytical methods. Static analysis and transient CFD analysis were performed to predict the fluid flow through the microneedle array. Inlet pressures from 10 kPa to 150 kPa were considered for the static CFD analysis. In the lumen region, a fluid flow rate of 3.2946 μL/min is obtained at 150 V for the 2×2 microneedle array. In the present study, the authors have performed structural, piezoelectric and CFD simulations on a three-dimensional model of the piezoelectrically actuated microfluidic device integrated with the 2×2 microneedle array.
Keywords: Coupled multifield, finite element analysis, hollow silicon microneedle, transdermal drug delivery.
8481 Knowledge Based Concept Analysis Method using Concept Maps and UML: Security Notion Case
Authors: Miquel Colobran, Josep M. Basart
Abstract:
One of humankind's most ancient concerns is knowledge formalization, i.e., what a concept is. Concept analysis, a branch of analytical philosophy, aims to decompose the elements, relations and meanings of a concept. This paper presents a method for carrying out a concept analysis that yields a knowledge representation suitable for processing by a computer system using either object-oriented or ontology technologies. The notion of security is usually understood as a set of different concepts related to “some kind of protection". Our method concludes that a more general framework for the concept, although dynamic, is possible, and that any particular definition (instantiation) depends on the elements used in its construction rather than on the concept itself.
Keywords: Concept analysis, Knowledge representation, Security, UML.
8480 A Microcontroller Implementation of Constrained Model Predictive Control
Authors: Amira Kheriji Abbes, Faouzi Bouani, Mekki Ksouri
Abstract:
Model Predictive Control (MPC) is an established control technique in a wide range of process industries. The reason for this success is its ability to handle multivariable systems and systems having input, output or state constraints. Nevertheless, compared to the PID controller, the implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited due to its implementation complexity and computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. In this work, we take advantage of recent advances in this area to deploy one of the most studied and applied control techniques in industrial engineering. In this paper, we propose efficient firmware for the implementation of constrained MPC on the STM32 microcontroller using the interior point method. A performance study shows good execution speed and low computational burden. These results encourage the development of predictive control algorithms to be programmed in standard industrial processes. A PID anti-windup controller was also implemented on the STM32 in order to make a performance comparison with the MPC. The main features of the proposed constrained MPC framework are illustrated through two examples.
Keywords: Embedded software, microcontroller, constrained Model Predictive Control, interior point method, PID anti-windup, Keil tool, C/Cµ language.
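For orientation, a minimal sketch of a constrained MPC step for a discrete-time linear model x+ = A x + B u. The paper's firmware solves the underlying QP with an interior point method on the STM32; here the same finite-horizon problem is solved with SciPy's SLSQP purely for illustration, and the plant matrices, horizon, weights and input bound are hypothetical.

```python
# Receding-horizon constrained MPC sketch (illustrative plant and tuning).
import numpy as np
from scipy.optimize import minimize

A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
Q, R = np.diag([1.0, 0.1]), 0.01        # state and input weights
N = 10                                   # prediction horizon
U_MAX = 1.0                              # input constraint |u| <= U_MAX

def cost(u_seq, x0):
    x, J = x0.copy(), 0.0
    for u in u_seq:                      # simulate the predicted trajectory
        x = A @ x + B.flatten() * u
        J += x @ Q @ x + R * u * u
    return J

def mpc_step(x0):
    res = minimize(cost, np.zeros(N), args=(x0,),
                   bounds=[(-U_MAX, U_MAX)] * N, method="SLSQP")
    return res.x[0]                      # apply only the first move (receding horizon)

x = np.array([1.0, 0.0])
for k in range(20):
    u = mpc_step(x)
    x = A @ x + B.flatten() * u
print("final state:", np.round(x, 4))
```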
8479 Building Relationship Network for Machine Analysis from Wear Debris Measurements
Authors: Qurban A Memon, Mohammad S. Laghari
Abstract:
The main focus of this paper is the integration of system process information obtained through an image processing system with an evolving knowledge database, in order to improve the accuracy and predictability of wear debris analysis. The objective is to intelligently automate the wear particle analysis process using classification via self-organizing maps. This is achieved using relationship measurements among corresponding attributes of various wear debris measurements. Finally, a visualization technique is proposed that helps the viewer understand and utilize these relationships, enabling accurate diagnostics.
Keywords: Relationship Network, Relationship Measurement, Self-organizing Clusters, Wear Debris Analysis, Kohonen Network
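A small from-scratch self-organizing map (Kohonen network) sketch for clustering wear-particle attribute vectors, to illustrate the classification step mentioned above. The attribute set, grid size and training schedule are hypothetical, not the paper's configuration.

```python
# Toy Kohonen self-organizing map for wear-particle attribute vectors.
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.random((grid[0], grid[1], data.shape[1]))        # codebook vectors
    gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 1e-3
        for x in rng.permutation(data):
            d = np.linalg.norm(w - x, axis=2)
            by, bx = np.unravel_index(d.argmin(), d.shape)   # best-matching unit
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
            w += lr * h[..., None] * (x - w)                 # pull neighbourhood toward x
    return w

def winner(w, x):
    d = np.linalg.norm(w - x, axis=2)
    return np.unravel_index(d.argmin(), d.shape)

rng = np.random.default_rng(1)
particles = np.vstack([rng.normal(0.2, 0.05, (30, 3)),       # e.g. fine rubbing wear
                       rng.normal(0.8, 0.05, (30, 3))])      # e.g. severe sliding wear
som = train_som(particles)
print("two particle types map to units:", winner(som, particles[0]), winner(som, particles[-1]))
```

The distance between winning units on the map grid is one possible "relationship measurement" among particle attributes.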
8478 Monetary Evaluation of Dispatching Decisions in Consideration of Mode Choice Models
Authors: Marcel Schneider, Nils Nießen
Abstract:
Microscopic simulation tool kits allow for the consideration of both railway operations and the preceding timetable production process. Block occupation conflicts on both process levels are often solved by using defined train priorities. These conflict resolutions (dispatching decisions) generate reactionary delays for the involved trains. The sum of reactionary delays is commonly used to evaluate the quality of railway operations, which describes the timetable robustness. It is either compared to an acceptable train performance, or the delays are appraised economically by linear monetary functions. It is impossible to adequately evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach for the evaluation of dispatching decisions. The approach uses mode choice models and considers the behaviour of the end customers. These models evaluate the reactionary delays in more detail and consider other competing modes of transport. The new approach pursues the coupling of a microscopic model of railway operations with a macroscopic mode choice model. At first, it will be implemented for the railway operations process, but it can also be used for timetable production. The evaluation considers the possibility for the customer to switch to other transport modes. The new approach initially considers rail and road, but it can also be extended to air travel. The result of the mode choice models is the modal split. The reactions of the end customers have an impact on the revenue of the train operating companies. Different travel purposes have different payment reserves and tolerances towards late running. Aside from changes to revenues, longer journey times can also generate additional costs. The costs are either time- or track-specific and arise from required changes to rolling stock or train crew cycles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of delays. The contribution margin is calculated for different possible solutions to the same conflict. The conflict resolution is optimised until the monetary loss becomes minimal. The iterative process therefore determines an optimum conflict resolution by monitoring the change in the contribution margin. Furthermore, a monetary value of each dispatching decision can also be derived.
Keywords: Choice of mode, monetary evaluation, railway operations, reactionary delays.
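A minimal sketch of the evaluation idea above: a binary logit mode choice model (rail vs. road) converts a reactionary delay into a modal-split change, and the resulting revenue plus delay-dependent operating cost gives a contribution margin per conflict resolution. All coefficients, fares, times and costs are hypothetical placeholders, not the paper's calibrated model.

```python
# Contribution-margin evaluation of a dispatching decision via a binary logit model.
import math

def rail_share(delay_min, beta_time=-0.05, rail_time=60.0, road_time=75.0):
    """Binary logit: utility falls linearly with in-vehicle time (minutes)."""
    u_rail = beta_time * (rail_time + delay_min)
    u_road = beta_time * road_time
    return math.exp(u_rail) / (math.exp(u_rail) + math.exp(u_road))

def contribution_margin(delay_min, demand=1000, fare=20.0, delay_cost_per_min=15.0):
    revenue = demand * rail_share(delay_min) * fare
    return revenue - delay_cost_per_min * delay_min   # variable revenue minus variable cost

# Compare two dispatching decisions that delay this train by 4 or 12 minutes:
for d in (4.0, 12.0):
    print(f"delay {d:>4.1f} min -> contribution margin {contribution_margin(d):8.1f}")
# The resolution with the higher contribution margin is preferred, even if its
# pure delay-minutes were not the smallest.
```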
8477 Comparative Study of Dynamic Effect on Analysis Approaches for Circular Tanks Using Codal Provisions
Authors: P. Deepak Kumar, Aishwarya Alok, P. R. Maiti
Abstract:
Liquid storage tanks have become widespread during recent decades due to their extensive usage. The analysis of liquid-containing tanks is known to be complex because of the hydrodynamic forces exerted on the tank. The objective of this research is to carry out an analysis of the liquid domain along with structural interaction for various geometries of circular tanks, considering seismic effects. An attempt has been made to determine the hydrodynamic pressure distribution on the tank wall considering the impulsive and convective components of the liquid mass. To get a better picture, a comparative study of Draft IS 1893 Part 2, ACI 350.3 and Eurocode 8 for circular tanks has been performed. Further, the differences in the magnitude of shear and moment at the base, as obtained from static (IS 3370 Part IV) and dynamic (Draft IS 1893 Part 2) analysis of a ground-supported circular tank, highlight the need to mature from the old code to a newer code, which is more accurate and reliable.
Keywords: Liquid filled containers, Circular Tanks, IS 1893 (Part 2), Seismic analysis, Sloshing.
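The impulsive and convective liquid masses that drive the hydrodynamic pressure are commonly estimated with Housner-type expressions of the kind adopted, in similar form, by ACI 350.3 and the draft IS 1893 Part 2. The sketch below uses those textbook expressions with an illustrative tank; it should be checked against the specific clauses of the governing code before any design use.

```python
# Housner-type impulsive/convective mass split for a ground-supported circular tank
# (verify the exact expressions against the governing code; dimensions are illustrative).
import math

def impulsive_convective_masses(D, h, rho=1000.0):
    """D: inside diameter (m), h: liquid depth (m), rho: liquid density (kg/m3)."""
    m = rho * math.pi * D**2 / 4 * h                      # total liquid mass
    x = 0.866 * D / h
    mi = m * math.tanh(x) / x                             # impulsive (rigidly moving) mass
    mc = m * 0.230 * (D / h) * math.tanh(3.68 * h / D)    # convective (sloshing) mass
    return m, mi, mc

m, mi, mc = impulsive_convective_masses(D=10.0, h=4.0)
print(f"total {m/1e3:.0f} t, impulsive {mi/1e3:.0f} t ({mi/m:.0%}), "
      f"convective {mc/1e3:.0f} t ({mc/m:.0%})")
```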
8476 Application of Multi-Dimensional Principal Component Analysis to Medical Data
Authors: Naoki Yamamoto, Jun Murakami, Chiharu Okuma, Yutaro Shigeto, Satoko Saito, Takashi Izumi, Nozomi Hayashida
Abstract:
Multi-dimensional principal component analysis (PCA) is the extension of PCA, which is widely used as a dimensionality reduction technique in multivariate data analysis, to handle multi-dimensional data. To calculate the PCA, the singular value decomposition (SVD) is commonly employed because of its numerical stability. The multi-dimensional PCA can be calculated using the higher-order SVD (HOSVD), proposed by Lathauwer et al., in a manner similar to the ordinary PCA. In this paper, we apply the multi-dimensional PCA to multi-dimensional medical data including the functional independence measure (FIM) score, and describe the results of the experimental analysis.
Keywords: multi-dimensional principal component analysis, higher-order SVD (HOSVD), functional independence measure (FIM), medical data, tensor decomposition
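A minimal NumPy sketch of the HOSVD mentioned above: the factor matrix of each mode is obtained from the SVD of that mode's unfolding, and the core tensor by projecting the data onto the factors. The 3-way array shape (patients × FIM items × time points) and its contents are illustrative only.

```python
# Higher-order SVD (HOSVD) of a 3-way data array.
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    U = [np.linalg.svd(unfold(T, k), full_matrices=False)[0] for k in range(T.ndim)]
    core = T
    for k, Uk in enumerate(U):            # mode-k product with Uk^T for every mode
        core = np.moveaxis(np.tensordot(Uk.T, np.moveaxis(core, k, 0), axes=1), 0, k)
    return core, U

rng = np.random.default_rng(0)
data = rng.random((20, 18, 5))            # patients x FIM items x time points (stand-in)
core, factors = hosvd(data)

recon = core                              # reconstruct to verify the decomposition
for k, Uk in enumerate(factors):
    recon = np.moveaxis(np.tensordot(Uk, np.moveaxis(recon, k, 0), axes=1), 0, k)
print("max reconstruction error:", np.abs(recon - data).max())   # ~1e-15 (full-rank HOSVD)
```

Dimensionality reduction corresponds to keeping only the leading columns of each factor matrix, analogous to truncating singular vectors in ordinary PCA.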
8475 Comparative Analysis of the Public Funding for Greek Universities: An Ordinal DEA/MCDM Approach
Authors: Yiannis Smirlis, Dimitris K. Despotis
Abstract:
This study performs a comparative analysis of the 21 Greek universities in terms of their public funding, awarded for covering their operating expenditure. First, it introduces a DEA/MCDM model that allocates the fund into four expenditure factors in the most favorable way for each university. Then, it presents a common, consensual assessment model to reallocate the amounts, remaining at the same level of total public budget. From the analysis it is derived that a number of universities cannot justify their public funding in terms of their size and operational workload. For them, a sufficient reduction of their public funding amount is estimated as a future target. Due to the lack of precise data for a number of expenditure criteria, the analysis is based on a mixed crisp-ordinal data set.
Keywords: Data envelopment analysis, Greek universities, operating expenditures, ordinal data.
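For context, a generic sketch of the standard input-oriented CCR envelopment program that underlies DEA efficiency scoring, solved as a linear program with made-up data (one input, e.g. public funding, and two outputs). This is not the ordinal DEA/MCDM allocation model developed in the paper; it only illustrates the DEA building block.

```python
# Input-oriented CCR DEA efficiency scores via linear programming (toy data).
import numpy as np
from scipy.optimize import linprog

X = np.array([[120.0], [80.0], [150.0], [60.0]])            # input:  funding (M EUR)
Y = np.array([[30, 500], [25, 420], [28, 460], [20, 380]])  # outputs: graduates (hundreds), publications
n = X.shape[0]

def ccr_efficiency(o):
    # decision vector z = [theta, lambda_1 ... lambda_n]; minimize theta
    c = np.zeros(n + 1); c[0] = 1.0
    A_in  = np.hstack([-X[[o]].T, X.T])                      # sum(lam*x) <= theta * x_o
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])     # sum(lam*y) >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(X.shape[1]), -Y[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"university {o + 1}: CCR efficiency = {ccr_efficiency(o):.3f}")
```

A score below 1 indicates that the unit's funding could, in principle, be reduced proportionally while a convex combination of the other units still matches its outputs.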
8474 CFD Analysis of Two Phase Flow in a Horizontal Pipe – Prediction of Pressure Drop
Authors: P. Bhramara, V. D. Rao, K. V. Sharma, T. K. K. Reddy
Abstract:
In the design of condensers, the prediction of pressure drop is as important as the prediction of the heat transfer coefficient. Modeling of two-phase flow, particularly liquid–vapor flow under diabatic conditions inside a horizontal tube, using CFD analysis is difficult with the two-phase models available in FLUENT due to continuously changing flow patterns. In the present analysis, a CFD analysis of the two-phase flow of refrigerants inside a horizontal tube of 0.0085 m inner diameter and 1.2 m length is carried out using the homogeneous model under adiabatic conditions. The refrigerants considered are R22, R134a and R407C. The analysis is performed at different saturation temperatures and at different flow rates to evaluate the local frictional pressure drop. Using the homogeneous model, average properties are obtained for each of the refrigerants, which is then treated as a single-phase pseudo fluid. The pressure drop data so obtained are compared with the separated flow models available in the literature.
Keywords: Adiabatic conditions, CFD analysis, Homogeneous model, Liquid–vapor flow.
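A short sketch of the homogeneous-model frictional pressure gradient idea: quality-weighted mixture density (and a McAdams-type mixture viscosity) turn the two-phase flow into a single pseudo fluid, to which a Blasius-type friction factor is applied. The property values below are rough R134a-like placeholders near 40 degC saturation, not the study's data, and the correlation choices are illustrative.

```python
# Homogeneous-model frictional pressure gradient in a horizontal tube (sketch).
def dpdz_friction(G, x, D, rho_l, rho_g, mu_l, mu_g):
    """G: mass flux (kg/m2 s), x: vapour quality, D: tube diameter (m). Returns Pa/m."""
    rho_h = 1.0 / (x / rho_g + (1 - x) / rho_l)      # homogeneous (mixture) density
    mu_h = 1.0 / (x / mu_g + (1 - x) / mu_l)         # McAdams mixture viscosity
    Re = G * D / mu_h
    f = 0.316 * Re ** -0.25                          # Blasius (Darcy) friction factor
    return f * G**2 / (2.0 * rho_h * D)

# Placeholder properties (approximate order of magnitude for R134a near 40 degC):
props = dict(rho_l=1150.0, rho_g=50.0, mu_l=1.6e-4, mu_g=1.2e-5)
for x in (0.1, 0.5, 0.9):
    print(f"x = {x:.1f}: dp/dz = {dpdz_friction(G=300.0, x=x, D=0.0085, **props):7.0f} Pa/m")
```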
8473 Sustainable Development of Medium Strength Concrete Using Polypropylene as Aggregate Replacement
Authors: Reza Keihani, Ali Bahadori-Jahromi, Timothy James Clacy
Abstract:
Plastic as an environmental burden is a well-rehearsed topic in the research area. This is due to its global demand and destructive impacts on the environment, which have been a significant concern to governments. Typically, the use of plastic in the construction industry is seen across low-density, non-structural applications due to its diverse range of benefits, including high strength-to-weight ratio, manipulability and durability. With the level of plastic consumption experienced in the construction industry, this sector has an ongoing responsibility to continually innovate alternatives for the application of recycled plastic waste, such as using replacements made from polyethylene, polystyrene, polyvinyl and polypropylene in the concrete mix design. In this study, the impact of partially replacing fine aggregate with polypropylene in the concrete mix design was investigated to evaluate the concrete’s compressive strength, through an experimental program comprising six concrete mix batches with polypropylene replacements ranging from 0.5 to 3.0%. The results demonstrated a typical decline in compressive strength with the addition of plastic aggregate, although this reduction was generally mitigated as the level of plastic in the concrete mix increased. Furthermore, two of the six plastic-containing concrete mixes tested in the current study, containing 1.50% and 2.50% plastic aggregates, exceeded the ST5 standardised prescribed concrete mix compressive strength requirement at 28 days, which demonstrates the potential for use of recycled polypropylene in structural applications as a partial (by mass) fine aggregate replacement in the concrete mix.
Keywords: Compressive strength, concrete, polypropylene, sustainability.
8472 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service
Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
In telemedicine, the image repository service is important for increasing the accuracy of the diagnostic support given to medical personnel. This study compares two routing algorithms with regard to quality of service (QoS), in order to analyze the optimal performance at the time of uploading and/or downloading medical images. The study focuses on comparing the performance of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. For this, the Tabu Search and Simulated Annealing heuristic algorithms were chosen for their high usability in this type of application; the QoS is measured taking into account the following metrics: delay, throughput, jitter and latency. In addition, routing tests were carried out on ten images of 40 MB in Digital Imaging and Communications in Medicine (DICOM) format. These tests were carried out for ten minutes under different traffic conditions, reaching a total of 25 tests, from a server of Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia to a remote user at Universidad de Santiago de Chile (USACH), Chile. The results show that Tabu Search presents better QoS performance compared to Simulated Annealing, managing to optimize the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
Keywords: Medical image, QoS, simulated annealing, Tabu search, telemedicine.
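As a toy illustration of the metaheuristic side of the comparison, the sketch below applies simulated annealing to choose a route minimizing a weighted QoS cost (delay plus a jitter penalty) over a small hypothetical network. The topology, link metrics, weights and cooling schedule are invented; the paper's actual experiments route 40 MB DICOM transfers between UMNG and USACH and also evaluate Tabu Search.

```python
# Simulated annealing over candidate routes with a weighted QoS cost (toy example).
import math, random

links = {("S", "A"): (20, 2), ("S", "B"): (35, 1), ("A", "B"): (10, 3),
         ("A", "T"): (50, 4), ("B", "T"): (30, 2)}            # link -> (delay ms, jitter ms)

def neighbours(node):
    return [b for (a, b) in links if a == node]

def all_paths(node, target, seen=()):
    if node == target:
        return [(*seen, node)]
    out = []
    for nxt in neighbours(node):
        if nxt not in seen:
            out += all_paths(nxt, target, (*seen, node))
    return out

def qos_cost(path, w_delay=1.0, w_jitter=5.0):
    d = sum(links[(a, b)][0] for a, b in zip(path, path[1:]))
    j = sum(links[(a, b)][1] for a, b in zip(path, path[1:]))
    return w_delay * d + w_jitter * j

random.seed(0)
paths = all_paths("S", "T")
current = random.choice(paths)
T = 50.0
while T > 0.1:
    candidate = random.choice(paths)                          # random neighbouring route
    delta = qos_cost(candidate) - qos_cost(current)
    if delta < 0 or random.random() < math.exp(-delta / T):
        current = candidate                                   # Metropolis acceptance rule
    T *= 0.95                                                 # geometric cooling
print("selected route:", current, "cost:", qos_cost(current))
```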
8471 Meta-Analysis of the Impact of Positive Psychological Capital on Employees Outcomes: The Moderating Role of Tenure
Authors: Hyeondal Jeong, Yoonjung Baek
Abstract:
This research examines the effects of positive psychological capital (PsyCap) on employee outcomes (satisfaction, commitment, organizational citizenship behavior, innovation behavior and individual creativity). The study conducted a meta-analysis of articles published in the Republic of Korea. The results show that positive psychological capital has a positive effect on the behavior of employees. Heterogeneity was identified among the studies included in the analysis, and contextual factors such as team tenure were therefore examined as moderators. The moderating effect of team tenure was not statistically significant. The implications are discussed based on the analysis results.
Keywords: Positive psychological capital, satisfaction, commitment, OCB, creativity, meta-analysis.
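A brief sketch of the fixed- and random-effects pooling that underlies a meta-analysis of this kind, using the DerSimonian-Laird estimate of between-study heterogeneity. The effect sizes and variances below are invented for illustration; they are not the Korean PsyCap studies analysed in the paper.

```python
# Fixed- and random-effects meta-analysis pooling (DerSimonian-Laird tau^2), toy data.
import numpy as np

effects   = np.array([0.32, 0.45, 0.28, 0.51, 0.38])   # study effect sizes
variances = np.array([0.010, 0.020, 0.008, 0.030, 0.015])

w = 1.0 / variances                                     # fixed-effect weights
pooled_fe = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - pooled_fe) ** 2)              # Cochran's heterogeneity statistic
df = len(effects) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                           # between-study variance estimate

w_re = 1.0 / (variances + tau2)                         # random-effects weights
pooled_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

print(f"Q = {Q:.2f} on {df} df, tau^2 = {tau2:.4f}")
print(f"pooled effect (random effects) = {pooled_re:.3f} "
      f"[{pooled_re - 1.96 * se_re:.3f}, {pooled_re + 1.96 * se_re:.3f}]")
```

A moderator such as team tenure would then be tested by grouping studies (or by meta-regression) and comparing the pooled effects across levels of the moderator.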
8470 Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant
Authors: J. R. Wang, S. W. Chen, Y. Chiang, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih
Abstract:
In this research, the HABIT analysis methodology was established for the Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under a CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criterion. This indicates that Maanshan NPP habitability can be maintained. Additionally, a sensitivity study of the parameters (wind speed, atmospheric stability classification, air temperature, and control room intake flow rate) was also performed in this research.
Keywords: PWR, HABIT, habitability, Maanshan.
8469 Dynamic Analysis of Transmission Line Towers
Authors: Srikanth L., Neelima Satyam D.
Abstract:
Transmission line towers are among the important lifeline structures in the distribution of power from the source to various places for several purposes. The predominant external loads which act on these towers are wind and earthquake loads. In the present study, the tower is analyzed using the Indian Standards IS: 875:1987 (wind load), IS: 802:1995 (structural steel) and IS: 1893:2002 (earthquake), and a dynamic analysis of the tower has been performed considering the ground motion of the 2001 Bhuj earthquake (India). The dynamic analysis was performed considering a tower system consisting of two towers spaced 800 m apart, each 35 m in height. This analysis was performed using a numerical time-stepping finite difference scheme, namely the central difference method, implemented in a MATLAB program, to obtain the normalized ground motion parameters, including acceleration, frequency and velocity, which are important in designing the tower. The tower is also analyzed using response spectrum analysis.
Keywords: Response Spectra, Dynamic Analysis, Central Difference Method, Transmission Tower.
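A minimal sketch of the central difference time-stepping scheme for a single-degree-of-freedom idealisation of a tower under ground acceleration. The mass, stiffness, damping and the synthetic ground motion are illustrative; they are not the Bhuj record or the 35 m tower analysed in the paper.

```python
# Central difference integration of m*u'' + c*u' + k*u = -m*ug''(t) (SDOF sketch).
import numpy as np

def central_difference(m, c, k, ug_ddot, dt):
    n = len(ug_ddot)
    u = np.zeros(n)
    p = -m * ug_ddot                                   # effective earthquake force
    acc0 = (p[0] - c * 0.0 - k * u[0]) / m             # at-rest initial conditions
    u_prev = u[0] - 0.0 * dt + 0.5 * dt**2 * acc0      # fictitious displacement at t = -dt
    k_hat = m / dt**2 + c / (2 * dt)
    a = m / dt**2 - c / (2 * dt)
    b = k - 2 * m / dt**2
    for i in range(n - 1):
        u_next = (p[i] - a * u_prev - b * u[i]) / k_hat
        u_prev, u[i + 1] = u[i], u_next
    return u

dt = 0.01
t = np.arange(0, 10, dt)
ug_ddot = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)   # synthetic record
m, k = 5.0e4, 2.0e6                                    # kg, N/m (natural period ~ 1 s)
c = 2 * 0.02 * np.sqrt(k * m)                          # 2 % viscous damping
u = central_difference(m, c, k, ug_ddot, dt)
print(f"peak relative displacement: {np.abs(u).max() * 1000:.1f} mm")
```

The scheme is conditionally stable, so the time step must stay well below the natural period (here dt = 0.01 s against a period of about 1 s).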
8468 Efficiency of the Slovak Commercial Banks Applying the DEA Window Analysis
Authors: Iveta Řepková
Abstract:
The aim of this paper is to estimate the efficiency of Slovak commercial banks employing the Data Envelopment Analysis (DEA) window analysis approach during the period 2003-2012. The research is based on unbalanced panel data of the Slovak commercial banks. An undesirable output was included in the analysis of banking efficiency. It was found that the most efficient banks were Postovabanka, UniCredit Bank and Istrobanka in the CCR model, and Slovenskasporitelna, Istrobanka and UniCredit Bank in the BCC model. On the contrary, the least efficient banks were Privatbanka and CitiBank. We found that the largest banks in the Slovak banking market were less efficient than medium-sized and small banks. The results show that during the period 2003-2008 the average efficiency was increasing, and then during the period 2010-2011 the average efficiency decreased as a result of the financial crisis.
Keywords: Data Envelopment Analysis, efficiency, Slovak banking sector, window analysis.
8467 Non-negative Principal Component Analysis for Face Recognition
Abstract:
Principal component analysis is often combined with state-of-the-art classification algorithms to recognize human faces. However, principal component analysis can only capture the features contributing to the global characteristics of the data because it is a global feature selection algorithm. It misses the features contributing to the local characteristics of the data because each principal component only contains some level of the global characteristics of the data. In this study, we present a novel face recognition approach using non-negative principal component analysis, to which a non-negativity constraint is added in order to improve data locality and contribute to elucidating latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with PCA and NREMF approaches.
Keywords: classification, face recognition, non-negative principal component analysis (NPCA)
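A minimal sketch of one way to extract a leading non-negative principal component: a projected power iteration that alternates a covariance multiplication with clipping of negative entries. This is a generic illustration of the non-negativity idea, not necessarily the exact NPCA algorithm of the paper, and the data are random stand-ins for face images.

```python
# Projected power iteration for a non-negative leading component (illustrative).
import numpy as np

def nonneg_pc(X, iters=200, seed=0):
    Xc = X - X.mean(axis=0)               # centre the data
    C = Xc.T @ Xc / (len(X) - 1)          # covariance matrix
    rng = np.random.default_rng(seed)
    v = np.abs(rng.standard_normal(X.shape[1]))
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = np.clip(C @ v, 0.0, None)     # power step + non-negativity projection
        v /= np.linalg.norm(v) + 1e-12
    return v

rng = np.random.default_rng(1)
faces = rng.random((50, 64))              # 50 samples x 64 "pixels" (stand-in data)
component = nonneg_pc(faces)
cov = np.cov(faces.T)
print("negative entries:", int((component < 0).sum()),
      "| relative variance captured:",
      round(float(component @ cov @ component / np.trace(cov)), 3))
```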
8466 The Applications of Quantum Mechanics Simulation for Solvent Selection in Chemicals Separation
Authors: Attapong T., Hong-Ming Ku, Nakarin M., Narin L., Alisa L., Jirut W.
Abstract:
Quantum mechanics simulation was applied to calculate the interaction force between two molecules at the atomic level. A simple extractive distillation system is a ternary system consisting of two close-boiling components (A, the lower boiling point component, and B, the higher boiling point component) and a solvent (S). The quantum mechanics simulation was used to calculate the intermolecular forces (interaction forces) between the close-boiling components and the solvents, namely the intermolecular forces between A-S and B-S. The requirement for a promising extractive distillation solvent is that the solvent (S) forms a stronger intermolecular force with only one of the components (A or B) than with the other. In this study, aromatic-aromatic, aromatic-cycloparaffin, and paraffin-diolefin systems were selected as demonstrations for solvent selection. This study defines a new term used for screening solvents, called the relative interaction force, which is calculated from the quantum mechanics simulation. The results showed that the relative interaction force gave good agreement with the literature data (relative volatilities from experiments). The reasons are discussed. Finally, this study suggests that quantum mechanics results can improve relative volatility estimation for solvent screening, thereby reducing the time and money consumed.
Keywords: Extractive distillation, Interaction force, Quantum mechanics, Relative volatility, Solvent extraction.
8465 Input-Output Analysis in Laptop Computer Manufacturing
Authors: H. Z. Ulukan, E. Demircioğlu, M. Erol Genevois
Abstract:
The scope of this paper and the aim of the proposed model are to apply monetary input-output (I-O) analysis to point out the importance of reusing know-how and other requirements in order to reduce production costs in the manufacturing process for a laptop computer. The I-O approach using the monetary input-output model is employed to demonstrate the impacts of different factors in the manufacturing process. A sensitivity analysis showing the correlation between these different factors is also presented. It is expected that the recommended model would have an advantageous effect on the cost minimization process.
Keywords: Input-Output Analysis, Monetary Input-Output Model, Manufacturing Process, Laptop Computer.
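For orientation, the core monetary input-output computation: given a technical coefficient matrix A (intermediate purchases per unit of output), the total output that supports a final demand d is x = (I - A)^-1 d, and changing an entry of A shows the kind of factor impact described above. The three-sector breakdown and all figures below are hypothetical.

```python
# Leontief input-output sketch with a hypothetical three-sector laptop supply chain.
import numpy as np

A = np.array([[0.20, 0.30, 0.05],    # components sector
              [0.10, 0.15, 0.10],    # assembly sector
              [0.05, 0.10, 0.08]])   # services sector
d = np.array([200.0, 500.0, 100.0])  # final demand (money units)

L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse
x = L @ d                            # total sector outputs required
print("total output by sector:", np.round(x, 1))

# Impact of reusing know-how, modelled here (for illustration only) as a 10 % cut
# in the assembly sector's purchases from the components sector:
A2 = A.copy(); A2[0, 1] *= 0.9
x2 = np.linalg.inv(np.eye(3) - A2) @ d
print("output change after the cut:", np.round(x2 - x, 1))
```

Repeating the perturbation for each coefficient gives the sensitivity picture the abstract refers to.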
8464 A Methodology to Analyze Technology Convergence: Patent-Citation Based Technology Input-Output Analysis
Authors: Jeeeun Kim, Sungjoo Lee
Abstract:
This research proposes a methodology for patent-citation-based technology input-output analysis by applying patent information to the input-output analysis originally developed for the dependencies among different industries. For this analysis, a technology relationship matrix and its components, as well as input and technology inducement coefficients, are constructed using patent information. A technology inducement coefficient is calculated by normalizing the degree of citation from certain IPCs (International Patent Classification) to different IPCs or to the same IPCs. Finally, we construct a Dependency Structure Matrix (DSM) based on the technology inducement coefficients to suggest a useful application for this methodology.
Keywords: Technology spillover effect, technology relationship, IO table, technology inducement coefficients, patent analysis, patent citation.
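A minimal sketch of the coefficients described above: a citation-count matrix between IPC classes is normalised into technology inducement coefficients (the share of a citing class's citations that flow to each cited class, including itself), and thresholding those coefficients yields a dependency structure matrix. The IPC classes, citation counts and threshold are invented for illustration and are not the paper's data.

```python
# Patent-citation inducement coefficients and a thresholded DSM (toy data).
import numpy as np

ipcs = ["G06F", "H04L", "H01L"]
# citations[i, j] = citations made by patents in class j to patents in class i
citations = np.array([[40, 25,  5],
                      [20, 50, 10],
                      [ 5, 10, 30]], dtype=float)

inducement = citations / citations.sum(axis=0, keepdims=True)  # column-normalised coefficients
dsm = (inducement >= 0.25).astype(int)                          # 1 = strong technological dependency

print("technology inducement coefficients:\n", np.round(inducement, 2))
print("dependency structure matrix:\n", dsm)
```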
8463 Computational Design of Inhibitory Agents of BMP-Noggin Interaction to Promote Osteogenesis
Authors: Shaila Ahmed, Raghu Prasad Rao Metpally, Sreedhara Sangadala, Boojala Vijay B Reddy
Abstract:
Bone growth factors such as Bone Morphogenic Protein-2 (BMP-2) have been approved by the FDA to replace grafting in some surgical interventions, but the high dose requirement limits their use in patients. Noggin, an extracellular protein, blocks the effect of BMP-2 by binding to BMP. Preventing the BMP-2/noggin interaction will help increase the free concentration of BMP-2 and therefore should enhance its efficacy in inducing bone formation. The work presented here involves the computational design of novel small-molecule inhibitory agents of the BMP-2/noggin interaction, based on our current understanding of BMP-2 and its known putative ligands (receptors and antagonists). A successful acquisition of such an inhibitory agent would allow clinicians to reduce the dose of BMP-2 protein required in clinical applications to promote osteogenesis. The available crystal structures of the BMPs, their receptors, and the binding partner noggin were analyzed to identify the critical residues involved in their interaction. In this study, the LUDI de novo design method was utilized to perform virtual screening of a large number of compounds from a commercially available library against the binding sites of noggin, to identify lead chemical compounds that could potentially block the BMP-noggin interaction with high specificity.
Keywords: Transforming growth factor-beta, Bone morphogenic proteins, Noggin, LUDI de novo design method, CAP small molecules.
8462 Parametric and Nonparametric Analysis of Breast Cancer Treatments
Authors: Chunling Cong, Chris.P.Tsokos
Abstract:
The objective of the present research manuscript is to perform parametric, nonparametric, and decision tree analysis to evaluate two treatments that are being used for breast cancer patients. Our study utilizes real data which was initially used in “Tamoxifen with or without breast irradiation in women of 50 years of age or older with early breast cancer" [1]; the data were supplied to us by N. A. Ibrahim, “Decision tree for competing risks survival probability in breast cancer study" [2]. We agree with the published results on certain aspects of our findings. However, in this manuscript, we focus on the relapse time of breast cancer patients instead of survival time, and parametric analysis instead of semi-parametric decision tree analysis is applied to provide more precise recommendations on the effectiveness of the two treatments with respect to the recurrence of breast cancer.
Keywords: decision tree, breast cancer treatments, parametric analysis, non-parametric analysis
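A short sketch of what a parametric (Weibull) comparison of relapse times can look like for two treatment arms. The relapse times are simulated stand-ins, the fit ignores censoring (which the real trial data would require handling), and the model is not the authors' exact analysis; it only illustrates the parametric approach contrasted above with the semi-parametric decision-tree analysis.

```python
# Weibull fit of relapse times for two treatment arms (simulated, uncensored data).
import numpy as np
from scipy import stats

relapse_a = stats.weibull_min.rvs(1.3, scale=60.0, size=80, random_state=0)  # months
relapse_b = stats.weibull_min.rvs(1.3, scale=45.0, size=80, random_state=1)

for name, times in (("tamoxifen + irradiation", relapse_a), ("tamoxifen alone", relapse_b)):
    shape, _, scale = stats.weibull_min.fit(times, floc=0)     # fix location at zero
    median = scale * np.log(2.0) ** (1.0 / shape)              # Weibull median relapse time
    surv24 = np.exp(-(24.0 / scale) ** shape)                  # P(relapse-free beyond 24 months)
    print(f"{name:25s} shape={shape:.2f} scale={scale:.1f} "
          f"median={median:.1f} months  S(24)={surv24:.2f}")
```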
8461 Neural Network Implementation Using FPGA: Issues and Application
Authors: A. Muthuramalingam, S. Himavathi, E. Srinivasan
Abstract:
Hardware realization of a Neural Network (NN) depends, to a large extent, on the efficient implementation of a single neuron. FPGA-based reconfigurable computing architectures are suitable for the hardware implementation of neural networks. FPGA realization of ANNs with a large number of neurons is still a challenging task. This paper discusses the issues involved in the implementation of a multi-input neuron with linear/nonlinear excitation functions using an FPGA. An implementation method with a resource/speed tradeoff is proposed to handle signed decimal numbers. The VHDL code developed is tested using a Xilinx XCV50hq240 chip. To improve the speed of operation, a lookup table method is used. The problems involved in using a lookup table (LUT) for a nonlinear function are discussed. The percentage saving in resources and the improvement in speed with an LUT for a neuron are reported. An attempt is also made to derive a generalized formula for a multi-input neuron that facilitates an approximate estimate of the total resource requirement and the speed achievable for a given multilayer neural network. This helps the designer to choose the FPGA capacity for a given application. Using the proposed method of implementation, a neural network based application, namely, a space vector modulator for a vector-controlled drive, is presented.
Keywords: FPGA implementation, multi-input neuron, neural network, nn based space vector modulator.
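A minimal sketch of the lookup-table idea discussed above: the sigmoid activation is pre-computed into a fixed-point LUT that a neuron on the FPGA would index instead of evaluating the exponential. The word lengths chosen here (an 8-bit address over the input range [-8, 8) and Q1.15 outputs) are illustrative choices, not the paper's synthesised configuration.

```python
# Fixed-point sigmoid LUT of the kind a hardware neuron would index (illustrative sizing).
import numpy as np

ADDR_BITS, FRAC_BITS = 8, 15
X_MIN, X_MAX = -8.0, 8.0

def build_sigmoid_lut():
    x = np.linspace(X_MIN, X_MAX, 2 ** ADDR_BITS, endpoint=False)
    y = 1.0 / (1.0 + np.exp(-x))
    return np.round(y * (1 << FRAC_BITS)).astype(np.int32)      # Q1.15 entries

LUT = build_sigmoid_lut()

def sigmoid_lut(x):
    """Quantised activation as the hardware would compute it."""
    idx = int((x - X_MIN) / (X_MAX - X_MIN) * (2 ** ADDR_BITS))
    idx = min(max(idx, 0), 2 ** ADDR_BITS - 1)                  # saturate out-of-range inputs
    return LUT[idx] / (1 << FRAC_BITS)

# Worst-case error of the LUT against the exact sigmoid over a fine grid:
grid = np.linspace(-10, 10, 20001)
err = max(abs(sigmoid_lut(v) - 1.0 / (1.0 + np.exp(-v))) for v in grid)
print(f"LUT size: {LUT.size} x {FRAC_BITS + 1} bits, worst-case error ~ {err:.4f}")
```

The LUT size versus worst-case error trade-off is exactly the resource/speed/accuracy choice the abstract refers to.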
8460 Variance Based Component Analysis for Texture Segmentation
Authors: Zeinab Ghasemi, S. Amirhassan Monadjemi, Abbas Vafaei
Abstract:
This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation towards defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of eigenpictures, and classifies the pixels as defective or normal. While the classic PCA uses a clusterer such as K-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that the proposed VBCA algorithm is 12.46% more accurate and 78.85% faster than the classic PCA.
8459 Transient Analysis of a Single-Server Queue with Batch Arrivals Using Modeling and Functions Akin to the Modified Bessel Functions
Authors: Vitalice K. Oduol
Abstract:
The paper considers a single-server queue with fixed-size batch Poisson arrivals and exponential service times, a model that is useful for a buffer that accepts messages arriving as fixed-size batches of packets and releases them one packet at a time. Transient performance measures for queues have long been recognized as being complementary to the steady-state analysis. The focus of the paper is on the use of the functions that arise in the analysis of the transient behaviour of the queuing system. The paper exploits practical modelling to obtain a solution to the integral equation encountered in the analysis. The results obtained indicate that under heavy load conditions, there is a significant disparity between the transient and steady-state values of the statistics.
Keywords: batch arrivals, modelling, single-server queue, time-varying probabilities, transient analysis.
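As a cross-check on the transient behaviour of such a queue, the sketch below numerically integrates the Kolmogorov forward equations for a single-server queue with fixed-size batch Poisson arrivals on a truncated state space, rather than using the paper's integral-equation treatment. The arrival and service rates, batch size and truncation level are illustrative.

```python
# Transient state probabilities of an M[X]/M/1 queue via the Kolmogorov forward equations.
import numpy as np
from scipy.integrate import solve_ivp

lam, mu, batch, N = 0.2, 1.0, 4, 80      # batch arrival rate, service rate, batch size, state cap

Q = np.zeros((N + 1, N + 1))             # generator matrix (truncated at N packets)
for n in range(N + 1):
    if n + batch <= N:
        Q[n, n + batch] += lam           # a whole batch of 'batch' packets arrives
        Q[n, n] -= lam
    if n > 0:
        Q[n, n - 1] += mu                # one packet leaves per service completion
        Q[n, n] -= mu

def kolmogorov(t, p):
    return p @ Q                          # dp/dt = p Q  (row-vector convention)

p0 = np.zeros(N + 1); p0[0] = 1.0         # start with an empty buffer
sol = solve_ivp(kolmogorov, (0.0, 30.0), p0, t_eval=[1, 5, 10, 30], rtol=1e-8)

states = np.arange(N + 1)
for t, p in zip(sol.t, sol.y.T):
    print(f"t = {t:5.1f}: P(empty) = {p[0]:.3f}, mean queue length = {states @ p:.2f}")
# Under heavier load the gap between these transient means and the eventual
# steady-state value is exactly the disparity noted in the abstract.
```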