Search results for: requirement analysis.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9006

8556 Application of Multi-Dimensional Principal Component Analysis to Medical Data

Authors: Naoki Yamamoto, Jun Murakami, Chiharu Okuma, Yutaro Shigeto, Satoko Saito, Takashi Izumi, Nozomi Hayashida

Abstract:

Multi-dimensional principal component analysis (PCA) extends PCA, which is widely used for dimensionality reduction in multivariate data analysis, to multi-dimensional data. To compute the PCA, the singular value decomposition (SVD) is commonly employed because of its numerical stability. The multi-dimensional PCA can be calculated using the higher-order SVD (HOSVD) proposed by Lathauwer et al., analogously to the ordinary PCA. In this paper, we apply the multi-dimensional PCA to multi-dimensional medical data including the functional independence measure (FIM) score, and describe the results of the experimental analysis.
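
The abstract does not include the computation itself; as a rough illustration, the following minimal sketch computes an HOSVD (mode-wise SVDs of the tensor unfoldings plus a core tensor) with NumPy on a synthetic patients x items x time tensor. The unfold/hosvd helpers and the toy data are illustrative, not the authors' code.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move the chosen mode to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor):
    """Higher-order SVD: one factor matrix per mode (from the SVD of each
    unfolding) plus the core tensor, the multi-dimensional analogue of PCA scores."""
    factors = []
    for mode in range(tensor.ndim):
        # Left singular vectors of the mode-n unfolding = principal axes of that mode
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U)
    core = tensor.copy()
    for mode, U in enumerate(factors):
        # Mode-n product with U^T projects the tensor onto the factor matrix
        core = np.moveaxis(np.tensordot(U.T, core, axes=([1], [mode])), 0, mode)
    return core, factors

# Toy stand-in for medical data: patients x FIM items x time points
data = np.random.rand(30, 18, 4)
data -= data.mean(axis=0)            # centre over patients, as in ordinary PCA
core, factors = hosvd(data)
print([U.shape for U in factors])
```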

Keywords: multi-dimensional principal component analysis, higher-order SVD (HOSVD), functional independence measure (FIM), medical data, tensor decomposition

8555 Comparative Analysis of the Public Funding for Greek Universities: An Ordinal DEA/MCDM Approach

Authors: Yiannis Smirlis, Dimitris K. Despotis

Abstract:

This study performs a comparative analysis of the 21 Greek universities in terms of the public funding awarded to cover their operating expenditure. First, it introduces a DEA/MCDM model that allocates the fund to four expenditure factors in the most favorable way for each university. Then, it presents a common, consensual assessment model to reallocate the amounts while keeping the total public budget at the same level. The analysis shows that a number of universities cannot justify their public funding in terms of their size and operational workload. For these, a sufficient reduction of the public funding amount is estimated as a future target. Due to the lack of precise data for a number of expenditure criteria, the analysis is based on a mixed crisp-ordinal data set.

Keywords: Data envelopment analysis, Greek universities, operating expenditures, ordinal data.

8554 CFD Analysis of Two Phase Flow in a Horizontal Pipe – Prediction of Pressure Drop

Authors: P. Bhramara, V. D. Rao, K. V. Sharma, T. K. K. Reddy

Abstract:

In the design of condensers, the prediction of pressure drop is as important as the prediction of the heat transfer coefficient. Modeling two-phase flow, particularly liquid-vapor flow under diabatic conditions inside a horizontal tube, is difficult with the two-phase models available in FLUENT because of the continuously changing flow patterns. In the present analysis, CFD analysis of two-phase flow of refrigerants inside a horizontal tube of 0.0085 m inner diameter and 1.2 m length is carried out using the homogeneous model under adiabatic conditions. The refrigerants considered are R22, R134a and R407C. The analysis is performed at different saturation temperatures and different flow rates to evaluate the local frictional pressure drop. Using the homogeneous model, average properties are obtained for each refrigerant, which is treated as a single-phase pseudo fluid. The pressure drop data so obtained are compared with the separated flow models available in the literature.
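
The abstract names the homogeneous model but not the specific correlations used; the sketch below shows one common formulation (mixture density, McAdams mixture viscosity, Blasius friction factor) for the frictional pressure gradient of the pseudo single-phase fluid. The property values are placeholders, not the study's data.

```python
def homogeneous_dpdz(G, x, D, rho_l, rho_g, mu_l, mu_g):
    """Frictional pressure gradient [Pa/m] from the homogeneous two-phase model.
    G: mass flux [kg/(m^2 s)], x: vapour quality [-], D: tube inner diameter [m]."""
    rho_h = 1.0 / (x / rho_g + (1.0 - x) / rho_l)   # homogeneous mixture density
    mu_h = 1.0 / (x / mu_g + (1.0 - x) / mu_l)      # McAdams mixture viscosity
    Re = G * D / mu_h                                # pseudo-fluid Reynolds number
    f = 0.079 * Re ** -0.25                          # Blasius friction factor (turbulent)
    return 2.0 * f * G ** 2 / (rho_h * D)

# Illustrative R134a-like properties near 40 C, not the study's data
print(homogeneous_dpdz(G=300.0, x=0.3, D=0.0085,
                       rho_l=1146.0, rho_g=50.0, mu_l=1.6e-4, mu_g=1.2e-5))
```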

Keywords: Adiabatic conditions, CFD analysis, homogeneous model, liquid-vapor flow.

8553 Sustainable Development of Medium Strength Concrete Using Polypropylene as Aggregate Replacement

Authors: Reza Keihani, Ali Bahadori-Jahromi, Timothy James Clacy

Abstract:

Plastic as an environmental burden is a well-rehearsed topic in research, owing to its global demand and its destructive impact on the environment, which has been a significant concern to governments. Typically, plastic is used in the construction industry in low-density, non-structural applications because of its diverse benefits, including a high strength-to-weight ratio, manipulability and durability. Given the level of plastic consumption in the construction industry, the sector has an ongoing responsibility to innovate alternatives for applying recycled plastic waste, such as replacements made from polyethylene, polystyrene, polyvinyl and polypropylene in the concrete mix design. In this study, the effect of partially replacing fine aggregate with polypropylene on the concrete's compressive strength was investigated experimentally using six concrete mix batches with polypropylene replacements ranging from 0.5% to 3.0%. The results demonstrated a typical decline in compressive strength with the addition of plastic aggregate, although this reduction was generally mitigated as the level of plastic in the mix increased. Furthermore, two of the six plastic-containing mixes tested, containing 1.50% and 2.50% plastic aggregate, exceeded the ST5 standardised prescribed concrete mix compressive strength requirement at 28 days, which demonstrates the potential of recycled polypropylene for structural applications as a partial, by-mass fine aggregate replacement in the concrete mix.

Keywords: Compressive strength, concrete, polypropylene, sustainability.

8552 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service

Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

In telemedicine, the image repository service is important for increasing the accuracy of diagnostic support to medical personnel. This study compares two routing algorithms with regard to quality of service (QoS), in order to analyze the optimal performance when loading and/or downloading medical images. The study focuses on comparing the performance of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. The Tabu Search and Simulated Annealing heuristics were chosen for their high usability in this type of application; QoS is measured with the following metrics: delay, throughput, jitter and latency. Routing tests were carried out on ten images of 40 MB in Digital Imaging and Communications in Medicine (DICOM) format. The tests ran for ten minutes under different traffic conditions, for a total of 25 tests, from a server at Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia, to a remote user at Universidad de Santiago de Chile (USACH), Chile. The results show that Tabu Search offers better QoS performance than Simulated Annealing, optimizing the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
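
As a hedged illustration of the metaheuristic side of the study, the sketch below applies a generic simulated annealing loop to a toy image-to-route assignment problem with a delay/jitter cost. The path metrics, cost weighting and congestion penalty are invented for the example and do not come from the study.

```python
import math, random

random.seed(1)

# Hypothetical per-route QoS metrics (ms); none of these values come from the study.
paths = [
    {"delay": 120.0, "jitter": 15.0},
    {"delay": 150.0, "jitter": 8.0},
    {"delay": 95.0,  "jitter": 25.0},
    {"delay": 110.0, "jitter": 12.0},
]
n_images = 10   # DICOM images to route

def cost(assignment):
    """Weighted QoS cost: delay plus jitter, with a simple congestion penalty
    when several images share the same route."""
    load = [assignment.count(p) for p in range(len(paths))]
    total = 0.0
    for p in assignment:
        congestion = 1.0 + 0.2 * (load[p] - 1)
        total += (paths[p]["delay"] + 0.5 * paths[p]["jitter"]) * congestion
    return total

def simulated_annealing(n_iter=5000, t0=100.0, alpha=0.999):
    state = [random.randrange(len(paths)) for _ in range(n_images)]
    best, best_cost, t = list(state), cost(state), t0
    for _ in range(n_iter):
        cand = list(state)
        cand[random.randrange(n_images)] = random.randrange(len(paths))   # move one image
        delta = cost(cand) - cost(state)
        if delta < 0 or random.random() < math.exp(-delta / t):           # Metropolis rule
            state = cand
            if cost(state) < best_cost:
                best, best_cost = list(state), cost(state)
        t *= alpha
    return best, best_cost

print(simulated_annealing())
```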

Keywords: Medical image, QoS, simulated annealing, Tabu search, telemedicine.

8551 Meta-Analysis of the Impact of Positive Psychological Capital on Employee Outcomes: The Moderating Role of Tenure

Authors: Hyeondal Jeong, Yoonjung Baek

Abstract:

This research examines the effects of positive psychological capital (PsyCap) on employee outcomes (satisfaction, commitment, organizational citizenship behavior, innovation behavior and individual creativity) through a meta-analysis of articles published in the Republic of Korea. The results show that positive psychological capital has a positive effect on employee behavior. Heterogeneity was identified among the studies included in the analysis, so contextual factors such as team tenure were analyzed as moderators. The moderating effect of team tenure was not statistically significant. The implications are discussed based on the analysis results.

Keywords: Positive psychological capital, satisfaction, commitment, OCB, creativity, meta-analysis.

8550 Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

Authors: J. R. Wang, S. W. Chen, Y. Chiang, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih

Abstract:

In this research, the HABIT analysis methodology was established for the Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. The HABIT methodology was used to evaluate control room habitability under a CO2 storage burst. The HABIT result was below the R.G. 1.78 failure criterion, indicating that Maanshan NPP habitability can be maintained. Additionally, a sensitivity study of the parameters (wind speed, atmospheric stability classification, air temperature, and control room intake flow rate) was also performed.

Keywords: PWR, HABIT, habitability, Maanshan.

8549 Dynamic Analysis of Transmission Line Towers

Authors: Srikanth L., Neelima Satyam D.

Abstract:

Transmission line towers are important lifeline structures in the distribution of power from the source to various places for several purposes. The predominant external loads acting on these towers are wind and earthquake loads. In the present study, the tower is analyzed using the Indian Standards IS 875:1987 (wind load), IS 802:1995 (structural steel) and IS 1893:2002 (earthquake), and a dynamic analysis is performed considering the ground motion of the 2001 Bhuj earthquake (India). The dynamic analysis considers a system of two towers spaced 800 m apart, each 35 m high. The analysis uses a numerical time-stepping finite difference scheme, namely the central difference method, implemented in a MATLAB program, to obtain the normalized ground motion parameters, including acceleration, frequency and velocity, which are important in designing the tower. The tower is also analyzed using response spectrum analysis.
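
The abstract states that the central difference method was implemented in MATLAB; the following sketch (in Python, for illustration only) shows the standard explicit central difference time-stepping of a damped single-degree-of-freedom oscillator under a toy ground acceleration record, not the Bhuj motion or the actual tower model.

```python
import numpy as np

def central_difference(m, c, k, ag, dt):
    """Displacement response of a damped SDOF oscillator to ground acceleration
    ag(t) using the explicit central difference scheme."""
    n = len(ag)
    u = np.zeros(n)
    p = -m * ag                                     # effective earthquake force
    v0 = 0.0
    a0 = (p[0] - c * v0 - k * u[0]) / m
    u_prev = u[0] - dt * v0 + dt**2 / 2.0 * a0      # fictitious displacement at t = -dt
    a_hat = m / dt**2 + c / (2 * dt)
    b_hat = k - 2 * m / dt**2
    c_hat = m / dt**2 - c / (2 * dt)
    for i in range(n - 1):
        u_next = (p[i] - b_hat * u[i] - c_hat * u_prev) / a_hat
        u_prev, u[i + 1] = u[i], u_next
    return u

# Toy run: 5%-damped oscillator with a 1 s natural period under a short sine pulse
dt = 0.01
t = np.arange(0.0, 10.0, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * (t < 2.0)
m = 1.0
k = m * (2 * np.pi / 1.0) ** 2
c = 2 * 0.05 * np.sqrt(k * m)
print(central_difference(m, c, k, ag, dt).max())
```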

Keywords: Response Spectra, Dynamic Analysis, Central Difference Method, Transmission Tower.

8548 Efficiency of the Slovak Commercial Banks Applying the DEA Window Analysis

Authors: Iveta Řepková

Abstract:

The aim of this paper is to estimate the efficiency of the Slovak commercial banks employing the Data Envelopment Analysis (DEA) window analysis approach during the period 2003-2012. The research is based on unbalanced panel data for the Slovak commercial banks, and an undesirable output was included in the analysis of banking efficiency. It was found that the most efficient banks were Postovabanka, UniCredit Bank and Istrobanka in the CCR model, and Slovenskasporitelna, Istrobanka and UniCredit Bank in the BCC model. On the contrary, the least efficient banks were Privatbanka and CitiBank. We found that the largest banks in the Slovak banking market were less efficient than medium-sized and small banks. The results also show that the average efficiency was increasing during the period 2003-2008 and then decreased during the period 2010-2011 as a result of the financial crisis.
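
The abstract applies the standard CCR and BCC DEA models; as a minimal sketch, the code below solves the input-oriented CCR envelopment problem with SciPy's linear programming routine on toy bank data (the BCC model would simply add the convexity constraint that the lambdas sum to one). The data and variable names are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (envelopment form).
    X: inputs (m x n DMUs), Y: outputs (s x n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.concatenate(([1.0], np.zeros(n)))        # minimise theta
    A_in = np.hstack((-X[:, [o]], X))               # X @ lam <= theta * x_o
    A_out = np.hstack((np.zeros((s, 1)), -Y))       # Y @ lam >= y_o
    res = linprog(c,
                  A_ub=np.vstack((A_in, A_out)),
                  b_ub=np.concatenate((np.zeros(m), -Y[:, o])),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Toy data: 5 banks, 2 inputs (e.g. staff cost, fixed assets), 1 output (e.g. loans)
X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],
              [3.0, 3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(X.shape[1])])
```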

Keywords: Data Envelopment Analysis, efficiency, Slovak banking sector, window analysis.

8547 Non-negative Principal Component Analysis for Face Recognition

Authors: Zhang Yan, Yu Bin

Abstract:

Principal component analysis is often combined with state-of-the-art classification algorithms to recognize human faces. However, principal component analysis can only capture features contributing to the global characteristics of the data, because it is a global feature selection algorithm; it misses features contributing to the local characteristics of the data, because each principal component only contains some level of the global characteristics. In this study, we present a novel face recognition approach using non-negative principal component analysis, which adds a non-negativity constraint to improve data locality and helps elucidate latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with the PCA and NREMF approaches.
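
The abstract does not specify how the non-negativity constraint is imposed; one simple, hedged sketch is a projected power iteration that clips the principal direction to the non-negative orthant at every step. This is only an illustration of the idea, not necessarily the authors' algorithm.

```python
import numpy as np

def nonneg_pca_component(X, n_iter=200, seed=0):
    """Leading non-negative principal direction via projected power iteration:
    a power step on the covariance matrix followed by projection onto the
    non-negative orthant."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc
    w = np.abs(np.random.default_rng(seed).standard_normal(C.shape[0]))
    for _ in range(n_iter):
        w = np.clip(C @ w, 0.0, None)        # power step, then enforce non-negativity
        w /= np.linalg.norm(w) + 1e-12
    return w

X = np.random.rand(100, 20)                   # stand-in for vectorised face images
print(nonneg_pca_component(X)[:5])
```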

Keywords: classification, face recognition, non-negative principal component analysis (NPCA)

8546 The Applications of Quantum Mechanics Simulation for Solvent Selection in Chemicals Separation

Authors: Attapong T., Hong-Ming Ku, Nakarin M., Narin L., Alisa L., Jirut W.

Abstract:

Quantum mechanics simulation was applied to calculate the interaction force between two molecules at the atomic level. A simple extractive distillation system is a ternary mixture consisting of two close-boiling components (A, the lower-boiling component, and B, the higher-boiling component) and a solvent (S). The quantum mechanics simulation was used to calculate the intermolecular (interaction) forces between the close-boiling components and the solvents, namely the A-S and B-S interactions. The requirement for a promising extractive-distillation solvent is that the solvent (S) forms a stronger intermolecular force with only one of the components (A or B) than with the other. In this study, aromatic-aromatic, aromatic-cycloparaffin and paraffin-diolefin systems were selected to demonstrate solvent selection. The study defines a new screening term, called the relative interaction force, which is calculated from the quantum mechanics simulation. The results show that the relative interaction force agrees well with literature data (relative volatilities from experiments), and the reasons are discussed. Finally, this study suggests that quantum mechanics results can improve relative volatility estimation for solvent screening, thereby reducing time and cost.

Keywords: Extractive distillation, interaction force, quantum mechanics, relative volatility, solvent extraction.

8545 Input-Output Analysis in Laptop Computer Manufacturing

Authors: H. Z. Ulukan, E. Demircioğlu, M. Erol Genevois

Abstract:

The scope of this paper and the aim of the proposed model are to apply monetary input-output (I-O) analysis to point out the importance of reusing know-how and other requirements in order to reduce production costs in the manufacturing process of a laptop computer. The monetary input-output model is employed to demonstrate the impacts of different factors in the manufacturing process. A sensitivity analysis showing the correlation between these factors is also presented. The recommended model is expected to have an advantageous effect on the cost minimization process.
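
As a minimal sketch of the monetary input-output mechanics behind such a model, the code below forms the Leontief inverse x = (I - A)^(-1) f for a hypothetical three-factor coefficient matrix and perturbs one coefficient to mimic a sensitivity analysis; all numbers are illustrative, not the paper's data.

```python
import numpy as np

# Hypothetical technical-coefficient matrix A for three cost factors in laptop
# assembly (e.g. components, labour, reused know-how); values are illustrative.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.02]])
final_demand = np.array([500.0, 120.0, 80.0])          # monetary units

# Leontief inverse gives the total (direct + indirect) output per factor
leontief_inverse = np.linalg.inv(np.eye(3) - A)
total_output = leontief_inverse @ final_demand
print(total_output)

# Simple sensitivity check: perturb one coefficient and observe the change
A_pert = A.copy()
A_pert[0, 1] += 0.05
print(np.linalg.inv(np.eye(3) - A_pert) @ final_demand - total_output)
```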

Keywords: Input-Output Analysis, Monetary Input-Output Model, Manufacturing Process, Laptop Computer.

8544 A Methodology to Analyze Technology Convergence: Patent-Citation Based Technology Input-Output Analysis

Authors: Jeeeun Kim, Sungjoo Lee

Abstract:

This research proposes a methodology for patent-citation-based technology input-output analysis, applying patent information to the input-output analysis originally developed for dependencies among industries. For this analysis, a technology relationship matrix and its components, as well as input and technology inducement coefficients, are constructed using patent information. A technology inducement coefficient is calculated by normalizing the degree of citation from certain IPCs (International Patent Classification) to different IPCs or to the same IPCs. Finally, we construct a Dependency Structure Matrix (DSM) based on the technology inducement coefficients to suggest a useful application of this methodology.
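
A minimal sketch of the coefficient construction described above: normalize an illustrative IPC-to-IPC citation-count matrix row-wise to obtain inducement coefficients, then binarize with a cut-off to form a simple DSM. The citation counts and the 0.3 cut-off are assumptions for the example, not values from the paper.

```python
import numpy as np

# Hypothetical citation counts between four IPC classes (rows cite columns).
citations = np.array([[ 0., 12.,  3.,  5.],
                      [ 4.,  0.,  9.,  1.],
                      [ 7.,  2.,  0.,  6.],
                      [ 1.,  8.,  4.,  0.]])

# Technology inducement coefficients: normalise so each citing class sums to 1
inducement = citations / citations.sum(axis=1, keepdims=True)

# Dependency Structure Matrix: keep links whose inducement exceeds a chosen cut-off
dsm = (inducement >= 0.3).astype(int)
print(inducement.round(2))
print(dsm)
```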

Keywords: Technology spillover effect, technology relationship, IO table, technology inducement coefficients, patent analysis, patent citation.

8543 Computational Design of Inhibitory Agents of BMP-Noggin Interaction to Promote Osteogenesis

Authors: Shaila Ahmed, Raghu Prasad Rao Metpally, Sreedhara Sangadala, Boojala Vijay B Reddy

Abstract:

Bone growth factors such as Bone Morphogenic Protein-2 (BMP-2) have been approved by the FDA to replace grafting in some surgical interventions, but the high dose required limits their use in patients. Noggin, an extracellular protein, blocks the effect of BMP-2 by binding to it. Preventing the BMP-2/noggin interaction will help increase the free concentration of BMP-2 and should therefore enhance its efficacy in inducing bone formation. The work presented here involves the computational design of novel small-molecule inhibitory agents of the BMP-2/noggin interaction, based on our current understanding of BMP-2 and its known putative ligands (receptors and antagonists). A successful acquisition of such an inhibitory agent would allow clinicians to reduce the dose of BMP-2 protein required in clinical applications to promote osteogenesis. The available crystal structures of the BMPs, their receptors, and the binding partner noggin were analyzed to identify the critical residues involved in their interaction. The LUDI de novo design method was used to perform virtual screening of a large number of compounds from a commercially available library against the binding sites of noggin, to identify lead chemical compounds that could block the BMP-noggin interaction with high specificity.

Keywords: Transforming growth factor-beta, Bone morphogenic proteins, Noggin, LUDI de novo design method, CAP small molecules.

8542 Parametric and Nonparametric Analysis of Breast Cancer Treatments

Authors: Chunling Cong, Chris P. Tsokos

Abstract:

The objective of this manuscript is to perform parametric, nonparametric and decision tree analyses to evaluate two treatments used for breast cancer patients. Our study utilizes real data originally used in "Tamoxifen with or without breast irradiation in women of 50 years of age or older with early breast cancer" [1] and supplied to us by N. A. Ibrahim, "Decision tree for competing risks survival probability in breast cancer study" [2]. We agree with certain aspects of the published findings. However, in this manuscript we focus on the relapse time of breast cancer patients instead of survival time, and parametric analysis rather than semi-parametric decision tree analysis is applied, to provide more precise recommendations on the effectiveness of the two treatments with respect to the recurrence of breast cancer.

Keywords: decision tree, breast cancer treatments, parametric analysis, non-parametric analysis

8541 Numerical and Experimental Investigations of Cantilever Rectangular Plate Structure on Subsonic Flutter

Authors: Mevlüt Burak Dalmış, Kemal Yaman

Abstract:

In this study, the flutter characteristics of a cantilever rectangular plate structure under an incompressible flow regime are investigated by comparing the results of the commercial flutter analysis program ZAERO© with wind tunnel tests conducted in the Ankara Wind Tunnel (ART). A rectangular polycarbonate (PC) plate, 5x125x1000 mm in dimensions, is used for both the numerical and experimental investigations. Analysis and test results agree well with each other. A comparison between two different solution methods (g- and k-method) of ZAERO© is also made. The k-method gives results closer to the tests; however, the g-method results are on the conservative side, and it is preferable to use these conservative, g-method results. Even though the modal analysis results are used for the flutter analysis of this simple structure, a modal test should be conducted to validate the modal analysis results in order to obtain accurate flutter results for more complicated structures.

Keywords: Flutter, plate, subsonic flow, wind tunnel.

8540 Neural Network Implementation Using FPGA: Issues and Application

Authors: A. Muthuramalingam, S. Himavathi, E. Srinivasan

Abstract:

Hardware realization of a neural network (NN) depends to a large extent on the efficient implementation of a single neuron. FPGA-based reconfigurable computing architectures are suitable for the hardware implementation of neural networks, but FPGA realization of ANNs with a large number of neurons is still a challenging task. This paper discusses the issues involved in implementing a multi-input neuron with linear/nonlinear excitation functions using an FPGA. An implementation method with a resource/speed tradeoff is proposed to handle signed decimal numbers. The VHDL code developed is tested using a Xilinx XCV50hq240 chip. To improve the speed of operation, a lookup table (LUT) method is used, and the problems involved in using an LUT for a nonlinear function are discussed. The percentage saving in resources and the improvement in speed with an LUT for a neuron are reported. An attempt is also made to derive a generalized formula for a multi-input neuron that allows the total resource requirement and achievable speed to be estimated approximately for a given multilayer neural network, helping the designer choose the FPGA capacity for a given application. Using the proposed implementation method, a neural-network-based application, namely a space vector modulator for a vector-controlled drive, is presented.
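
The paper's implementation is in VHDL; purely as an illustration of the LUT idea it discusses, the Python sketch below replaces the sigmoid computation in a multi-input neuron with a bounded, quantized lookup table. The table size and input range are arbitrary choices for the example, not the paper's design values.

```python
import numpy as np

# Lookup table for the sigmoid activation over a bounded input range, mimicking
# the LUT an FPGA neuron would use instead of evaluating exp() in hardware.
N_ENTRIES = 256
X_MIN, X_MAX = -8.0, 8.0
LUT = 1.0 / (1.0 + np.exp(-np.linspace(X_MIN, X_MAX, N_ENTRIES)))

def sigmoid_lut(x):
    """Quantise the input to the nearest table index; saturate outside the range,
    as a hardware LUT would."""
    idx = int(round((x - X_MIN) / (X_MAX - X_MIN) * (N_ENTRIES - 1)))
    return LUT[min(max(idx, 0), N_ENTRIES - 1)]

def neuron(inputs, weights, bias):
    """Multi-input neuron: weighted sum followed by the LUT activation."""
    return sigmoid_lut(float(np.dot(inputs, weights) + bias))

x = np.array([0.5, -1.2, 0.8])
w = np.array([0.9, 0.4, -0.7])
exact = 1.0 / (1.0 + np.exp(-(np.dot(x, w) + 0.1)))
print(neuron(x, w, bias=0.1), exact)          # LUT output vs exact sigmoid
```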

Keywords: FPGA implementation, multi-input neuron, neural network, NN-based space vector modulator.

8539 Variance Based Component Analysis for Texture Segmentation

Authors: Zeinab Ghasemi, S. Amirhassan Monadjemi, Abbas Vafaei

Abstract:

This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation aimed at defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of the eigenpictures, and classifies pixels as defective or normal. While classic PCA uses a clusterer such as K-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that the proposed VBCA algorithm is 12.46% more accurate and 78.85% faster than classic PCA.
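
The abstract outlines the VBCA pipeline without full detail; the sketch below is a rough, hedged stand-in: PCA over pixel neighbourhoods, variance-based selection of eigenvectors, and a global threshold on reconstruction error to flag defective pixels. The patch size, variance fraction and threshold rule are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def vbca_defect_map(image, patch=5, var_keep=0.95, thresh_k=2.5):
    """Rough VBCA-style pipeline: PCA over pixel neighbourhoods, keep eigenvectors
    up to a cumulative variance fraction, then threshold the reconstruction error
    to flag defective pixels."""
    h, w = image.shape
    r = patch // 2
    padded = np.pad(image, r, mode="reflect")
    # Feature vector for every pixel: its patch of neighbours, flattened
    feats = np.array([padded[i:i + patch, j:j + patch].ravel()
                      for i in range(h) for j in range(w)])
    centred = feats - feats.mean(axis=0)
    _, s, Vt = np.linalg.svd(centred, full_matrices=False)     # PCA via SVD
    var = s ** 2 / np.sum(s ** 2)
    k = int(np.searchsorted(np.cumsum(var), var_keep)) + 1     # components to keep
    proj = centred @ Vt[:k].T @ Vt[:k]
    err = np.linalg.norm(centred - proj, axis=1)               # reconstruction error
    defect = err > err.mean() + thresh_k * err.std()           # global threshold
    return defect.reshape(h, w)

print(vbca_defect_map(np.random.rand(40, 40)).sum())           # number of flagged pixels
```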

Keywords: Principal Component Analysis; Variance Based Component Analysis; Defect Detection; Texture Segmentation.

8538 Transient Analysis of a Single-Server Queue with Batch Arrivals Using Modeling and Functions Akin to the Modified Bessel Functions

Authors: Vitalice K. Oduol

Abstract:

The paper considers a single-server queue with fixed-size batch Poisson arrivals and exponential service times, a model that is useful for a buffer that accepts messages arriving as fixed-size batches of packets and releases them one packet at a time. Transient performance measures for queues have long been recognized as complementary to steady-state analysis. The focus of the paper is on the functions that arise in the analysis of the transient behaviour of the queueing system. The paper exploits practical modelling to obtain a solution to the integral equation encountered in the analysis. The results indicate that under heavy load conditions there is a significant disparity between the transient and steady-state statistics.
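
The paper derives an analytical transient solution; as a complementary, hedged sketch, the code below numerically integrates the truncated Kolmogorov forward equations of the fixed-batch-size M[X]/M/1 queue to obtain transient mean queue lengths. The rates, batch size and truncation level are illustrative and do not come from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

lam, mu, batch = 0.18, 1.0, 5      # batch arrival rate, service rate, fixed batch size
N = 200                            # truncation level of the state space

def kolmogorov(t, p):
    """Forward equations of the fixed-batch-size M[X]/M/1 queue, truncated at N."""
    dp = np.zeros_like(p)
    for n in range(N + 1):
        rate_out = lam * (n + batch <= N) + mu * (n > 0)
        dp[n] -= rate_out * p[n]
        if n >= batch:
            dp[n] += lam * p[n - batch]      # batch arrival brings the queue up to n
        if n + 1 <= N:
            dp[n] += mu * p[n + 1]           # service completion brings it down to n
    return dp

p0 = np.zeros(N + 1)
p0[0] = 1.0                                   # start with an empty buffer
sol = solve_ivp(kolmogorov, (0.0, 20.0), p0, t_eval=[1.0, 5.0, 20.0])
mean_queue = [(np.arange(N + 1) * sol.y[:, i]).sum() for i in range(sol.y.shape[1])]
print(mean_queue)                             # transient mean queue length at t = 1, 5, 20
```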

Keywords: batch arrivals, modelling, single-server queue, time-varying probabilities, transient analysis.

8537 Resistance Analysis for a Trimaran

Authors: C. M. De Marco Muscat-Fenech, A. M. Grech La Rosa

Abstract:

Although a lot of importance has been given to resistance analysis for various vessel types, explicit guidelines applied to multihull vessels have not been clearly defined. The purpose of this investigation is to highlight the importance of the vessel's layout in terms of three axes, the transverse (separation), the longitudinal (stagger) and the vertical (draught), with respect to resistance analysis. When a vessel has the potential to experience less resistance over a particular range of speeds, a vast selection of opportunities is made available for both the commercial and leisure markets.

Keywords: Multihull, resistance, trimaran.

8536 Physico-chemical Treatment of Tar-Containing Wastewater Generated from Biomass Gasification Plants

Authors: Vrajesh Mehta, Anal Chavan

Abstract:

Treatment of tar-containing wastewater is necessary for the successful operation of biomass gasification plants (BGPs). In the present study, tar-containing wastewater was treated using lime and alum to remove inorganics, followed by adsorption on powdered activated carbon (PAC) to remove organics. Lime-alum experiments were performed in a jar apparatus, and activated carbon studies were performed in an orbital shaker. At optimum concentrations, lime and alum individually proved capable of removing color, total suspended solids (TSS) and total dissolved solids (TDS), but in both cases pH adjustment had to be carried out after treatment. The combination of lime and alum at a dose ratio of 0.8:0.8 g/L was found to be optimum for the removal of inorganics. The removal efficiencies achieved at the optimum concentrations were 78.6, 62.0, 62.5 and 52.8% for color, alkalinity, TSS and TDS, respectively. The major advantages of the lime-alum combination were that no pH adjustment was required before or after treatment and that the sludge settled well. Coagulation-precipitation followed by adsorption on PAC resulted in 92.3% chemical oxygen demand (COD) removal and 100% phenol removal at equilibrium. Ammonia removal efficiency was 11.7% during coagulation-flocculation and 36.2% during adsorption on PAC. Adsorption of organics on PAC, in terms of COD and phenol, followed the Freundlich isotherm with Kf = 0.55 and 18.47 mg/g and n = 1.01 and 1.45, respectively. This technology may prove to be one of the fastest and most techno-economically feasible methods for the treatment of tar-containing wastewater generated from BGPs.
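
Using the Freundlich constants reported above, the short sketch below evaluates the isotherm q_e = K_f * c_e^(1/n) for COD and phenol at a few equilibrium concentrations; the concentrations themselves are illustrative and not from the study.

```python
import numpy as np

def freundlich(ce, kf, n):
    """Freundlich isotherm: adsorbed amount q_e [mg/g] at equilibrium concentration c_e [mg/L]."""
    return kf * ce ** (1.0 / n)

ce = np.array([10.0, 50.0, 100.0])            # illustrative equilibrium concentrations, mg/L
print("COD    q_e:", freundlich(ce, kf=0.55, n=1.01))   # constants reported for COD
print("Phenol q_e:", freundlich(ce, kf=18.47, n=1.45))  # constants reported for phenol
```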

Keywords: Activated carbon, Alum, Biomass gasification, Coagulation-flocculation, Lime, Tar-containing wastewater.

8535 Effect of Crude Oil Particle Elasticity on the Separation Efficiency of a Hydrocyclone

Authors: M. H. Narasingha, K. Pana-Suppamassadu, P. Narataruksa

Abstract:

The separation efficiency of a hydrocyclone has mostly been considered under the rigid-particle assumption, and a collection of experimental studies has demonstrated discrepancies from the corresponding modeling and simulation results. These discrepancies, caused by the actual particle elasticity, have generally led to a larger amount of energy being consumed in the separation process. In this paper, the influence of particle elasticity on the separation efficiency of a hydrocyclone system was investigated through Finite Element (FE) simulations using crude oil droplets as the elastic particles. A Reitema's design hydrocyclone with a diameter of 8 mm was employed to investigate the separation of the crude oil droplets from water. The cut-size diameter of the crude oil was 10 μm in order to fit the operating range of the adopted hydrocyclone model. Typical parameters influencing the performance of the hydrocyclone were varied, with the feed pressure in the range of 0.3-0.6 MPa and the feed concentration between 0.05-0.1 w%. In the simulation, the Finite Element scheme was applied to investigate the particle-flow interaction occurring in the crude oil system during the process, observing the interaction of a single 10 μm oil droplet with the flow field. The feed concentration fell in the dilute flow regime, so particle-particle interaction was ignored in the study. The results exhibited a higher power requirement for the separation of the elastic particulate system compared with the rigid particulate system.

Keywords: Hydrocyclone, separation efficiency, strain energy density, strain rate.

8534 White Blood Cells Identification and Counting from Microscopic Blood Image

Authors: Lorenzo Putzu, Cecilia Di Ruberto

Abstract:

The counting and analysis of blood cells allow the evaluation and diagnosis of a vast number of diseases. In particular, the analysis of white blood cells (WBCs) is a topic of great interest to hematologists. Nowadays, the morphological analysis of blood cells is performed manually by skilled operators, which involves numerous drawbacks, such as slowness of the analysis and a non-standard accuracy that depends on the operator's skill. In the literature there are only a few examples of automated systems for analyzing white blood cells, most of which are only partial. This paper presents a complete and fully automatic method for white blood cell identification from microscopic images. The proposed method first locates the white blood cells, from which the nucleus and cytoplasm are subsequently extracted. The whole work has been developed in the MATLAB environment, in particular with the Image Processing Toolbox.

Keywords: Automatic detection, Biomedical image processing, Segmentation, White blood cell analysis.

8533 Combustion Analysis of Suspended Sodium Droplet

Authors: T. Watanabe

Abstract:

Combustion analysis of a suspended sodium droplet is performed by numerically solving the Navier-Stokes equations and the energy conservation equations. The combustion model consists of pre-ignition and post-ignition models. The reaction rate for the pre-ignition model is based on chemical kinetics, while that for the post-ignition model is based on the mass transfer rate of oxygen. The calculated droplet temperature is shown to be in good agreement with existing experimental data. The temperature field in and around the droplet is obtained, as well as the droplet shape variation, and the present numerical model is confirmed to be effective for combustion analysis.

Keywords: Combustion, analysis, sodium, droplet.

8532 Multi-Objective Optimization in End Milling of Al-6061 Using Taguchi Based G-PCA

Authors: M. K. Pradhan, Mayank Meena, Shubham Sen, Arvind Singh

Abstract:

In this study, a multi-objective optimization of the end milling of Al 6061 alloy is presented to provide better surface quality and a higher material removal rate (MRR). The input parameters considered for the analysis are spindle speed, depth of cut and feed. The experiments were planned according to Taguchi's design of experiments with an L27 orthogonal array. Grey Relational Analysis (GRA) is used to transform the multiple quality responses into a single response, and the weights of the performance characteristics are determined by employing Principal Component Analysis (PCA), so that their relative importance can be properly and objectively described. The results reveal that Taguchi-based G-PCA can effectively determine the optimal combination of cutting parameters.
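
A minimal sketch of the Taguchi-based G-PCA steps described above: compute grey relational coefficients for a smaller-is-better and a larger-is-better response, derive weights from the squared loadings of the first principal component, and rank the runs by grey relational grade. The response values are invented for the example, not the authors' measurements.

```python
import numpy as np

# Hypothetical responses for 9 runs: surface roughness Ra (smaller-is-better)
# and MRR (larger-is-better); values are invented for the example.
ra = np.array([1.8, 1.2, 0.9, 1.5, 1.1, 0.8, 1.6, 1.0, 0.7])
mrr = np.array([120., 150., 180., 140., 170., 200., 130., 160., 210.])

def grey_coeff(x, larger_is_better, zeta=0.5):
    """Normalise a response and return its grey relational coefficients."""
    norm = (x - x.min()) / (x.max() - x.min())
    if not larger_is_better:
        norm = 1.0 - norm
    delta = 1.0 - norm                         # deviation from the ideal sequence
    return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

gc = np.column_stack([grey_coeff(ra, False), grey_coeff(mrr, True)])

# PCA on the coefficient matrix; squared loadings of the first component = weights
eigval, eigvec = np.linalg.eigh(np.cov(gc, rowvar=False))
w = eigvec[:, -1] ** 2                         # contributions sum to 1
grade = gc @ w                                 # grey relational grade per run
print("weights:", w, "best run:", int(grade.argmax()) + 1)
```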

Keywords: Material Removal Rate, Surface Roughness, Taguchi Method, Grey Relational Analysis, Principal Component Analysis.

8531 Effect of Infills in Influencing the Dynamic Responses of Multistoried Structures

Authors: E. Rahmathulla Noufal

Abstract:

Investigating the dynamic response of high-rise structures under seismic ground motion is extremely important for the proper analysis and design of multistoried structures. Since the presence of infill walls strongly influences the behaviour of frame systems in multistoried buildings, there is an increased need to develop guidelines for the analysis and design of infilled frames under dynamic loads for safe and proper building design. In this manuscript, we evaluate the natural frequencies and natural periods of single-bay, single-storey frames considering the effect of infill walls, using eigenvalue analysis and validating the results with SAP 2000 (free vibration analysis). Various parameters obtained from the diagonal strut model used for the free vibration analysis are then compared with a finite element model in which the infill is modelled with four-noded shell elements. We also evaluate the effect of various parameters on the natural periods of vibration obtained from the free vibration analysis in SAP 2000, comparing them with those obtained from the empirical expressions presented in IS 1893 (Part I): 2002.
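
As a simple illustration of the eigenvalue analysis described above, the sketch below solves the generalised eigenvalue problem K*phi = omega^2*M*phi for a two-storey shear-frame idealisation with and without an added diagonal-strut stiffness; the masses and stiffnesses are illustrative and do not represent the single-bay frame of the study.

```python
import numpy as np
from scipy.linalg import eigh

def natural_periods(K, M):
    """Natural periods [s] from the generalised eigenvalue problem K*phi = w^2*M*phi."""
    w2, _ = eigh(K, M)
    return 2.0 * np.pi / np.sqrt(w2)

def shear_K(k):
    """Stiffness matrix of a two-storey shear frame with equal storey stiffness k."""
    return np.array([[2.0 * k, -k], [-k, k]])

m = 20000.0                 # kg per storey (illustrative)
M = np.diag([m, m])
k_bare = 8.0e6              # N/m, bare frame storey stiffness (illustrative)
k_strut = 5.0e6             # N/m, extra stiffness contributed by the diagonal strut

print("bare frame periods    :", natural_periods(shear_K(k_bare), M))
print("infilled frame periods:", natural_periods(shear_K(k_bare + k_strut), M))
```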

Keywords: Infilled frame, eigenvalue analysis, free vibration analysis, diagonal strut model, finite element model, SAP 2000, natural period.

8530 Modal Analysis of Power System with a Microgrid

Authors: Burak Yildirim, Muhsin Tunay Gençoğlu

Abstract:

A microgrid (MG) is a small power grid composed of localized medium- or low-level power generation, storage systems, and loads. In this paper, the effects of an MG on power system voltage stability are shown. The MG model, designed to demonstrate these effects, was applied to the IEEE 14-bus power system, which is widely used in power system stability studies. Eigenvalue and modal analysis methods were used in the simulation studies. The results show that MGs affect system voltage stability positively by increasing the voltage instability limit value for the buses of the power system in which the MG is placed.

Keywords: Eigenvalue analysis, microgrid, modal analysis, voltage stability.

8529 Interbank Networks and the Benefits of Using Multilayer Structures

Authors: Danielle Sandler dos Passos, Helder Coelho, Flávia Mori Sarti

Abstract:

Complexity science seeks to understand systems by adopting diverse theories from various areas. Network analysis has been gaining space and credibility, namely in biological, social and economic systems. A significant part of the literature focuses only on monolayer representations of connections among agents, considering one level of their relationships and excluding other levels of interaction, which leads to simplistic results in network analysis. Therefore, this work aims to demonstrate the advantages of using multilayer networks for the representation and analysis of networks. For this, we analyzed an interbank network composed of 42 banks, comparing the centrality measures of the agents (degree and PageRank) resulting from each method (monolayer vs. multilayer). The multilayer analysis proved to be more reliable and efficient for the study of such networks and highlighted JP Morgan and Deutsche Bank as the most important banks of the analyzed network.
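
As a hedged illustration of the monolayer-versus-multilayer comparison, the sketch below computes PageRank once on an aggregated graph and, separately, as a per-layer average over two toy exposure layers; a full multilayer formulation with inter-layer coupling would be more involved, and the banks and weights here are invented, not the 42-bank network of the study.

```python
import networkx as nx

# Two hypothetical exposure layers among four banks (directed, weighted); toy data only.
lending     = [("A", "B", 5.0), ("B", "C", 3.0), ("C", "A", 2.0), ("D", "A", 4.0)]
derivatives = [("A", "D", 6.0), ("D", "B", 1.0), ("B", "A", 2.0)]

def layer(edges):
    g = nx.DiGraph()
    g.add_weighted_edges_from(edges)
    return g

layers = [layer(lending), layer(derivatives)]

# Monolayer view: collapse the layers into one aggregated graph and rank once
agg = nx.DiGraph()
for g in layers:
    for u, v, d in g.edges(data=True):
        w = agg[u][v]["weight"] + d["weight"] if agg.has_edge(u, v) else d["weight"]
        agg.add_edge(u, v, weight=w)
mono = nx.pagerank(agg, weight="weight")

# Simplified multilayer view: rank each layer separately and average the scores,
# so a bank must matter in several layers to stay on top
nodes = set().union(*[set(g.nodes) for g in layers])
multi = {n: sum(nx.pagerank(g, weight="weight").get(n, 0.0) for g in layers) / len(layers)
         for n in nodes}
print(sorted(mono, key=mono.get, reverse=True))
print(sorted(multi, key=multi.get, reverse=True))
```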

Keywords: Complexity, interbank networks, multilayer networks, network analysis.

8528 Failure Analysis of a 304 Stainless Steel Flange Crack at Pipeline Transportation of Ethylene

Authors: Parisa Hasanpour, Bahram Borooghani, Vahid Asadi

Abstract:

In the current research, a catastrophic failure of a 304 stainless steel flange in an ethylene transportation pipeline at a petrochemical refinery was studied. Cracking was found in the flange after about 78840 h of service. Through chemical analysis and tensile tests, in addition to microstructural analysis such as optical microscopy and scanning electron microscopy (SEM) on the failed part, it was found that fatigue was responsible for the fracture of the flange, which originated from bumps and depressions on the outer surface and propagated by vibration caused by the working conditions.

Keywords: Failure analysis, 304 stainless steel, fatigue, flange, petrochemical refinery.

8527 Local Curvelet Based Classification Using Linear Discriminant Analysis for Face Recognition

Authors: Mohammed Rziza, Mohamed El Aroussi, Mohammed El Hassouni, Sanaa Ghouzali, Driss Aboutajdine

Abstract:

In this paper, an efficient local appearance feature extraction method based on the multi-resolution Curvelet transform is proposed in order to further enhance the performance of the well-known Linear Discriminant Analysis (LDA) method when applied to face recognition. Each face is described by a subset of band-filtered images containing block-based Curvelet coefficients. These coefficients characterize the face texture, and a set of simple statistical measures allows us to form compact and meaningful feature vectors. The proposed method is compared with related feature extraction methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Independent Component Analysis (ICA). Two other multi-resolution transforms, Wavelet (DWT) and Contourlet, were also compared against the block-based Curvelet-LDA algorithm. Experimental results on the ORL, YALE and FERET face databases show that the proposed method provides a better representation of the class information and obtains much higher recognition accuracies.

Keywords: Curvelet, Linear Discriminant Analysis (LDA), Contourlet, Discrete Wavelet Transform (DWT), block-based analysis, face recognition (FR).
