Search results for: residual analysis.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8797

8557 Finite Element Simulation of Multi-Stage Deep Drawing Processes and Comparison with Experimental Results

Authors: A. Pourkamali Anaraki, M. Shahabizadeh, B. Babaee

Abstract:

Plastic forming of sheet metal plays an important role in metal forming. The traditional tool design techniques used in industry for sheet forming operations are experimental and expensive. Predicting the forming results, the punch force, the blank holder forces and the thickness distribution of the sheet metal reduces production cost and time. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. The entire production sequence, including additional operations such as intermediate annealing and springback, was simulated in ABAQUS under axisymmetric conditions. Simulation results such as sheet thickness distribution, punch force and residual stresses were extracted at each stage, and the sheet thickness distribution was compared with experimental results. The comparison shows that the FE model is in close agreement with the experiments.

Keywords: Deep drawing, Finite element method, Simulation.

8556 Containment/Penetration Analysis for the Protection of Aircraft Engine External Configuration and Nuclear Power Plant Structures

Authors: Dong Wook Lee, Adrian Mistreanu

Abstract:

The authors have studied a method for analyzing containment and penetration using explicit nonlinear Finite Element Analysis. The method may be used at the concept design stage for the protection of external configurations or components of aircraft engines and nuclear power plant structures. This paper presents the modeling method, the results obtained from it, and a comparison of those results with values calculated by a simple analytical method. The comparison shows that the containment capability obtained by the proposed method matches the analytically calculated capability well.

Keywords: Computer Aided Engineering, CAE, containment analysis, Finite Element Analysis, FEA, impact analysis, penetration analysis.

8555 Recovery of Cu, Zn, Ni and Cr from Plating Sludge by Combined Sulfidation and Oxidation Treatment

Authors: D. Kuchar, T. Fukuta, M. Kubota, H. Matsuda

Abstract:

The selective recovery of the heavy metals Cu, Zn, Ni and Cr from a mixed plating sludge by sulfidation and oxidation treatment was targeted in this study. First, the mixed plating sludge was simultaneously subjected to extraction and Cu sulfidation at pH=1.5 to dissolve the heavy metals and to precipitate Cu2+ as CuS. In the next step, sulfidation of Zn was carried out at pH=4.5, and the residual solution was subjected to oxidation of chromium with H2O2 at pH=10.0. After the experiments, the selectivity of metal precipitation and the chromium oxidation ratio were evaluated. It was found that the filter cake obtained after selective sulfidation of Cu was composed of 96.6% Cu (where 100% corresponds to the sum of the Cu, Zn, Ni and Cr contents). These findings confirm that almost complete extraction of the heavy metals was achieved at pH=1.5 and that Cu could be selectively recovered as CuS. Further, the filter cake obtained at pH=4.5 was composed of 91.5% Zn and 6.83% Cr. Regarding the chromium oxidation step, the oxidation ratio increased with temperature and with the addition of the H2O2 oxidizing agent, but an oxidation ratio of only 59% was achieved at a temperature of 60°C and an H2O2 to Cr3+ equivalent ratio of 180.

Keywords: Chromium recovery, oxidation, plating sludge, sulfidation.

8554 An Energy-Efficient Distributed Unequal Clustering Protocol for Wireless Sensor Networks

Authors: Sungju Lee, Jangsoo Lee, Hongjoong Sin, Seunghwan Yoo, Sanghyuck Lee, Jaesik Lee, Yongjun Lee, Sungchun Kim

Abstract:

Wireless sensor networks have been extensively deployed and researched. One of the major issues in wireless sensor networks is developing an energy-efficient clustering protocol, since clustering provides an effective way to prolong the lifetime of a wireless sensor network. In this paper, we compare several clustering protocols that significantly affect the balancing of energy consumption, and we propose an Energy-Efficient Distributed Unequal Clustering (EEDUC) algorithm which provides a new way of creating distributed clusters. In EEDUC, each sensor node sets a waiting time, computed as a function of its residual energy and the number of neighboring nodes, and this waiting time is used to distribute the cluster heads. We also propose an unequal clustering mechanism to solve the hot-spot problem. Simulation results show that EEDUC distributes the cluster heads well, balances the energy consumption among them, and increases the network lifetime.
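
As a rough illustration of the waiting-time idea (not the authors' exact EEDUC formula), the sketch below assumes a waiting time that shrinks with residual energy and neighbour count, so that better-placed nodes announce themselves as cluster heads first; the weighting, the jitter and all names are assumptions.

```python
import random

def waiting_time(residual_energy, max_energy, num_neighbors, t_max=1.0, alpha=0.7):
    """Illustrative waiting time: nodes with more residual energy and more
    neighbours wait less, so they win the cluster-head announcement race.
    The weighting (alpha) and the small random jitter are assumptions."""
    energy_term = 1.0 - residual_energy / max_energy      # low for energy-rich nodes
    density_term = 1.0 / (1.0 + num_neighbors)            # low for well-connected nodes
    jitter = random.uniform(0.0, 0.05)                    # break ties between similar nodes
    return t_max * (alpha * energy_term + (1 - alpha) * density_term) + jitter

# Toy election: the node whose timer expires first announces itself as a head.
nodes = [
    {"id": 0, "energy": 0.9, "neighbors": 6},
    {"id": 1, "energy": 0.4, "neighbors": 6},
    {"id": 2, "energy": 0.9, "neighbors": 2},
]
for n in nodes:
    n["t_wait"] = waiting_time(n["energy"], 1.0, n["neighbors"])
print(sorted(nodes, key=lambda n: n["t_wait"]))
```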

Keywords: Wireless Sensor Network, Distributed Unequal Clustering, Multi-hop, Lifetime.

8553 Using SNAP and RADTRAD to Establish the Analysis Model for Maanshan PWR Plant

Authors: J. R. Wang, H. C. Chen, C. Shih, S. W. Chen, J. H. Yang, Y. Chiang

Abstract:

In this study, we focus on establishing an analysis model for the Maanshan PWR nuclear power plant (NPP) using the RADTRAD and SNAP codes together with the FSAR, manuals, and other plant data. In order to evaluate the cumulative dose at the Exclusion Area Boundary (EAB) and the outer boundary of the Low Population Zone (LPZ), the Maanshan NPP RADTRAD/SNAP model was used to analyze the design basis accident (DBA) LOCA case. The RADTRAD analysis results were similar to the FSAR data and were lower than the failure criteria of 10 CFR 100.11 (a total radiation dose to the whole body of 250 mSv; a total radiation dose to the thyroid from iodine exposure of 3000 mSv).

Keywords: RADionuclide Transport, Removal And Dose estimation (RADTRAD), Symbolic Nuclear Analysis Package (SNAP), dose, PWR.

8552 Kinematic and Dynamic Analysis of a Lower Limb Exoskeleton

Authors: Tawakal Hasnain Baluch, Adnan Masood, Javaid Iqbal, Umer Izhar, Umar Shahbaz Khan

Abstract:

This paper provides the kinematic and dynamic analysis of a lower limb exoskeleton. The forward and inverse kinematics of the proposed exoskeleton are derived using the Denavit-Hartenberg method, and the torques required at the actuators are calculated using the Lagrangian formulation. This analysis can be used to design the controller for the proposed exoskeleton.
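
For orientation, a minimal numpy sketch of the standard Denavit-Hartenberg link transform and a forward-kinematics chain is given below; the two-link parameters are invented for illustration and are not the exoskeleton's actual geometry.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform: rotation theta about z,
    translation d along z, translation a along x, rotation alpha about x."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-link transforms; returns the base-to-end-effector pose."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical two-link planar leg (hip and knee), link lengths 0.4 m each.
hip, knee = np.deg2rad(30.0), np.deg2rad(-45.0)
pose = forward_kinematics([(hip, 0.0, 0.4, 0.0), (knee, 0.0, 0.4, 0.0)])
print(pose[:3, 3])   # end-effector (ankle) position
```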

Keywords: Dynamic Analysis, Exoskeleton, Kinematic Analysis, Lower Limb, Rehabilitation Robotics

8551 Adaptive Kalman Filter for Fault Diagnosis of Linear Parameter-Varying Systems

Authors: Rajamani Doraiswami, Lahouari Cheded

Abstract:

Fault diagnosis of Linear Parameter-Varying (LPV) systems using an adaptive Kalman filter is proposed. The LPV model comprises scheduling parameters and emulator parameters. The scheduling parameters are chosen such that they are capable of tracking variations in the system model caused by changes in the operating regime. The emulator parameters, on the other hand, simulate variations in the subsystems during the identification phase and have a negligible effect during the operational phase. The nominal model and the influence vectors, which are the gradients of the feature vector with respect to the emulator parameters, are identified off-line from a number of emulator-parameter-perturbed experiments. A Kalman filter is designed using the identified nominal model. As the system varies, the Kalman filter model is adapted using the scheduling variables, and the residual is employed for fault diagnosis. The proposed scheme is successfully evaluated on a simulated system as well as on a physical process control system.
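
For orientation, the sketch below shows residual-based detection in its simplest form: a Kalman filter runs on a nominal scalar model and a fault is flagged when the normalized innovation (residual) becomes improbably large. The model, noise levels, injected fault and threshold are all invented, and the paper's adaptive/LPV scheduling is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nominal scalar model: x_{k+1} = a*x_k + u + w,  y_k = x_k + v
a, u, q, r = 0.95, 1.0, 0.01, 0.04
x_true, x_hat, p = 0.0, 0.0, 1.0
alarms = []

for k in range(200):
    a_actual = a if k < 100 else 0.5          # inject a plant fault at step 100
    x_true = a_actual * x_true + u + rng.normal(0.0, np.sqrt(q))
    y = x_true + rng.normal(0.0, np.sqrt(r))

    # Kalman predict/update with the *nominal* model; the innovation is the residual.
    x_pred = a * x_hat + u
    p_pred = a * p * a + q
    s = p_pred + r                            # innovation variance
    resid = y - x_pred
    gain = p_pred / s
    x_hat = x_pred + gain * resid
    p = (1.0 - gain) * p_pred

    if abs(resid) / np.sqrt(s) > 4.0:         # flag an improbably large residual
        alarms.append(k)

print("fault injected at step 100; first alarm at step", alarms[0] if alarms else "none")
```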

Keywords: Identification, linear parameter-varying systems, least-squares estimation, fault diagnosis, Kalman filter, emulators.

8550 Numerical Solution of Steady Magnetohydrodynamic Boundary Layer Flow Due to Gyrotactic Microorganism for Williamson Nanofluid over Stretched Surface in the Presence of Exponential Internal Heat Generation

Authors: M. A. Talha, M. Osman Gani, M. Ferdows

Abstract:

This paper studies two-dimensional magnetohydrodynamic (MHD) steady incompressible viscous flow of a Williamson nanofluid with exponential internal heat generation and gyrotactic microorganisms over a stretching sheet. The governing equations and auxiliary conditions are reduced to a set of non-linear coupled differential equations with the appropriate boundary conditions using a similarity transformation. The transformed equations are solved numerically by the spectral relaxation method. The influences of various parameters, such as the Williamson parameter γ, power constant λ, Prandtl number Pr, magnetic field parameter M, Peclet number Pe, Lewis number Le, bioconvection Lewis number Lb, Brownian motion parameter Nb, thermophoresis parameter Nt, and bioconvection constant σ, are studied to obtain the momentum, heat, mass and microorganism distributions, which are presented through graphs and tables. We also compute the heat transfer rate, the mass flux rate and the density number of motile microorganisms near the surface. Our numerical results are in good agreement with existing calculations. The residual error of the obtained solutions is determined in order to assess the convergence rate against iteration; faster convergence is achieved when internal heat generation is absent. Increasing the magnetic parameter M decreases the momentum boundary layer thickness but increases the thermal boundary layer thickness. The bioconvection Lewis number and the bioconvection parameter have a pronounced effect on the microorganism boundary layer. Increasing the Brownian motion parameter and the Lewis number decreases the thermal boundary layer, and the magnetic field parameter and thermophoresis parameter have a pronounced effect on the concentration profiles.

Keywords: Convection flow, internal heat generation, similarity, spectral method, numerical analysis, Williamson nanofluid.

8549 Influence of Mass Flow Rate on Forced Convective Heat Transfer through a Nanofluid Filled Direct Absorption Solar Collector

Authors: Salma Parvin, M. A. Alim

Abstract:

The convective and radiative heat transfer performance and the entropy generation of forced convection through a direct absorption solar collector (DASC) are investigated numerically. Four different working fluids are used: Cu-water nanofluid, Al2O3-water nanofluid, TiO2-water nanofluid, and pure water. Entropy production is taken into account in addition to the collector efficiency and heat transfer enhancement. The penalty finite element method with Galerkin's weighted residual technique is used to solve the governing non-linear partial differential equations. Numerical simulations are performed for varying mass flow rate. The outcomes are presented in the form of isotherms, average output temperature, average Nusselt number, collector efficiency, average entropy generation, and Bejan number. The results show that the heat transfer rate and collector efficiency improve significantly as the mass flow rate m is raised, up to a certain range.

Keywords: DASC, forced convection, mass flow rate, nanofluid.

8548 Constructivism Learning Management in Mathematical Analysis Courses

Authors: K. Paisal

Abstract:

The purposes of this research were (1) to create a constructivist learning activity, (2) to study learning achievement in Mathematical Analysis courses, and (3) to study students' attitudes toward the constructivist learning activity. The sample was divided into two parts: 3 Mathematical Analysis instructors of Suan Sunandha Rajabhat University, who provided basic information and attended the seminar, and 17 Mathematical Analysis students who were studying in the academic year and engaged in the constructivist learning activity. The research instruments were constructivist lesson plans, a subjective Mathematical Analysis achievement test with a reliability index of 0.8119, and a test concerning the students' attitude toward the constructivist learning activity. The results show that the efficiency of the constructivist learning activity for Mathematical Analysis courses is 73.05/72.16, which exceeds the expected criterion of 70/70. The research additionally finds that the average learning achievement score of students who engaged in the constructivist learning activities is equal to 70%, and that the students' attitude toward the constructivist learning activity is at the medium level.

Keywords: Constructivism, learning management, Mathematical Analysis courses.

8547 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

Authors: J. Grira, Y. Bédard, S. Roche

Abstract:

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and may not reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and, ultimately, to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts discover implicit requirements and risks.

Keywords: Collaborative risk analysis, intention of use, Geospatial database design, Geospatial data misuse.

8546 A Trends Analysis of Dinghy Yacht Simulator

Authors: Jae-Neung Lee, Sung-Bum Pan, Keun-Chang Kwak

Abstract:

This paper analyses international trends in dinghy yacht simulators and gives background on the yachts themselves. The results are summarized as follows. Attached to the cockpit are sensors that feed back information on rudder angle, boat heel angle and mainsheet tension to the computer. The sailor's energy expenditure is measured indirectly using expired gas analysis of VO2 and VCO2. Sea-course configurations and wind conditions can be preset to suit any level of sailor, from complete beginner to advanced.

Keywords: Trends Analysis, Yacht Simulator, Sailing.

8545 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models, and its principle is based on finding relationships between explanatory variables and the predicted variables: past occurrences are exploited to predict the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data, but they have been held back by the limits of classical predictive analysis methods when faced with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, big data cannot be analyzed efficiently with classical methods of predictive analysis. The authors attribute this weakness to the fact that classical predictive analysis algorithms do not allow calculations to be parallelized and distributed. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is defined in order to make it applicable to very large quantities of data.
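
For background only, the following sketch shows the core CART step that such an extension would parallelize: an exhaustive search for the best binary split of a single numeric feature by Gini impurity. The data and helper names are hypothetical, and the distributed extension itself is not reproduced.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    if labels.size == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / labels.size
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Exhaustive search for the threshold on one feature that minimizes the
    weighted Gini impurity of the two children. This per-feature scan is the
    part a big-data variant would distribute across workers."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best = (None, np.inf)
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        thr = 0.5 * (x[i] + x[i - 1])
        left, right = y[:i], y[i:]
        score = (left.size * gini(left) + right.size * gini(right)) / y.size
        if score < best[1]:
            best = (thr, score)
    return best

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_split(x, y))   # threshold around 6.5 with impurity 0.0
```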

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

8544 Efficient Variants of Square Contour Algorithm for Blind Equalization of QAM Signals

Authors: Ahmad Tariq Sheikh, Shahzad Amin Sheikh

Abstract:

A new distance-adjusted approach is proposed in which static square contours are defined around an estimated symbol in a QAM constellation, which create regions that correspond to fixed step sizes and weighting factors. As a result, the equalizer tap adjustment consists of a linearly weighted sum of adaptation criteria that is scaled by a variable step size. This approach is the basis of two new algorithms: the Variable step size Square Contour Algorithm (VSCA) and the Variable step size Square Contour Decision-Directed Algorithm (VSDA). The proposed schemes are compared with existing blind equalization algorithms in the SCA family in terms of convergence speed, constellation eye opening and residual ISI suppression. Simulation results for 64-QAM signaling over empirically derived microwave radio channels confirm the efficacy of the proposed algorithms. An RTL implementation of the blind adaptive equalizer based on the proposed schemes is presented and the system is configured to operate in VSCA error signal mode, for square QAM signals up to 64-QAM.

Keywords: Adaptive filtering, Blind Equalization, Square Contour Algorithm.

8543 Dynamic Instability in High-Rise SMRFs Subjected to Long-Period Ground Motions

Authors: Y. Araki, M. Kim, S. Okayama, K. Ikago, K. Uetani

Abstract:

We study dynamic instability in high-rise steel moment resisting frames (SMRFs) subjected to synthetic long-period ground motions caused by hypothetical huge subduction earthquakes. Since long duration as well as long dominant periods characterize long-period ground motions, interstory drifts may enter the negative post-yield stiffness range many times when high-rise buildings are subjected to such motions. Through case studies of 9 high-rise SMRFs designed in accordance with Japanese design practice of the 1980s, we demonstrate that drifting, i.e. accumulation of interstory drifts in one direction, occurs at the lower stories of the SMRFs if their natural periods are close to the dominant periods of the long-period ground motions. The drifting led to residual interstory drift ratios over 0.01, or to collapse when the design base shear was small.

Keywords: long-period ground motion, P-Delta effect, high-rise steel moment resisting frame (SMRF), subduction earthquake

8542 Hazardous Waste Generated in the Peruvian Textile Industry: Haute Couture, Alpaca Fiber and Tannery

Authors: Huiman C. Alberto

Abstract:

This research surveys the various hazardous wastes generated in the Peruvian textile industry. The method is descriptive and comparative; the process consisted of searching for and evaluating information at both the national and international level. The results indicate: (1) Waste is generated by the alpaca fiber industry at the various stages of camelid rearing; the most hazardous are excreta, residual fiber and yarn scraps. (2) The main hazardous wastes generated by the tannery industry are grease, hides, hair, plastic containers with traces of toxic substances, and chips and pieces of leather containing chrome. (3) The Solid Waste Management Plans of three randomly selected companies were analyzed; none of them details waste treatment processes, and they point to a lack of supervision by the authorities. It is concluded that the hazardous waste generated can affect human and environmental health. Certain hazardous wastes, such as manure and alpaca fiber, can be put to use after treatment, while non-hazardous tannery waste such as yarn, panel weaving, cloth, scraps and thread can be used to produce new products, generating a production chain in favor of the entrepreneur.

Keywords: Alpaca fiber, excreta, haute couture, tannery hazardous waste, hazardous waste treatment, textile waste.

8541 Dynamic Correlations and Portfolio Optimization between Islamic and Conventional Equity Indexes: A Vine Copula-Based Approach

Authors: Imen Dhaou

Abstract:

This study examines conditional Value at Risk by applying a GJR-EVT-copula model and finds the optimal portfolio for eight Dow Jones Islamic-conventional index pairs. Our methodology consists of modeling the data with a bivariate GJR-GARCH model, extracting the filtered residuals, and then applying the Peaks-over-Threshold (POT) model to fit the residual tails in order to model the marginal distributions. We then use pair-copulas to model the portfolio's risk dependence structure, and finally, with Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results give the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we find that the optimal investment concentrates on the Islamic-conventional US market index pair, which receives a high investment proportion, whereas all other index pairs receive low investment proportions. These results have practical implications for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management and the diversification benefits of these markets.
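
As a small illustration of the final step, the sketch below estimates VaR and CVaR from simulated portfolio returns; plain Gaussian draws stand in for the paper's GJR-GARCH-EVT pair-copula simulations, and all figures are invented.

```python
import numpy as np

def var_cvar(returns, level=0.95):
    """Monte Carlo VaR and CVaR at the given confidence level,
    both reported as positive loss figures."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, level)
    cvar = losses[losses >= var].mean()
    return var, cvar

rng = np.random.default_rng(42)
# Stand-in for copula-simulated returns of a two-asset (Islamic/conventional) portfolio.
sims = rng.multivariate_normal(mean=[0.0003, 0.0002],
                               cov=[[1.2e-4, 0.8e-4], [0.8e-4, 1.0e-4]],
                               size=100_000)
weights = np.array([0.6, 0.4])
portfolio = sims @ weights
print(var_cvar(portfolio, 0.95))
```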

Keywords: CVaR, Dow Jones Islamic index, GJR-GARCH-EVT-pair copula, portfolio optimization.

8540 Multi-Dimensional Concerns Mining for Web Applications via Concept-Analysis

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering), so the scientific community has focused its attention on Web application design, development, analysis, and testing, studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. The approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project. A semi-automatic tool to support it is currently under development.

Keywords: Concepts Analysis, Concerns Mining, Multi-Dimensional Separation of Concerns, Impact Analysis.

8539 The Performance Analysis of Error Saturation Nonlinearity LMS in Impulsive Noise based on Weighted-Energy Conservation

Authors: T. Panigrahi, G. Panda, Mulgrew

Abstract:

This paper introduces a new approach to the performance analysis of adaptive filters with an error saturation nonlinearity in the presence of impulsive noise. The performance analysis of adaptive filters includes both transient analysis, which shows how fast a filter learns, and steady-state analysis, which shows how well it learns. Recursive expressions for the mean-square deviation (MSD) and the excess mean-square error (EMSE) are derived from weighted-energy conservation arguments, which describe the transient behavior of the adaptive algorithm. The steady-state behavior is analyzed for correlated input regressor data, so this approach leads to new performance results without restricting the input regression data to be white.
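
A minimal sketch of the kind of filter being analyzed, assuming a hard-clipping saturation: an LMS update whose error is passed through a saturation nonlinearity so that impulsive-noise outliers cannot dominate the weight update. The filter length, step size and saturation level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def saturate(e, limit):
    """Error saturation nonlinearity (hard clipping)."""
    return np.clip(e, -limit, limit)

n_taps, mu, sat_level = 8, 0.01, 1.0
w_true = rng.normal(size=n_taps)          # unknown system to identify
w = np.zeros(n_taps)
x_buf = np.zeros(n_taps)

for n in range(5000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.normal()
    noise = rng.normal(scale=0.05)
    if rng.random() < 0.01:               # occasional impulsive outlier
        noise += rng.normal(scale=20.0)
    d = w_true @ x_buf + noise
    e = d - w @ x_buf
    w += mu * saturate(e, sat_level) * x_buf   # saturated-error LMS update

print("mean-square deviation:", np.mean((w - w_true) ** 2))
```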

Keywords: Error saturation nonlinearity, transient analysis, impulsive noise.

8538 2D Graphical Analysis of Wastewater Influent Capacity Time Series

Authors: Monika Chuchro, Maciej Dwornik

Abstract:

The extraction of meaningful information from an image can be an alternative method for time series analysis. In this paper, we propose a graphical analysis in which the time series is grouped into a table with a colour scale adjusted to the numerical values. The advantages of this method are also discussed. The proposed method is easy to understand and flexible enough to accommodate standard pattern recognition and verification methods, especially for noisy environmental data.
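
A minimal sketch of this kind of presentation, assuming daily influent values folded into a weeks-by-weekdays table and rendered with a colour scale; the data are synthetic and numpy/matplotlib are assumed to be available.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)

# Hypothetical daily wastewater influent volumes for 20 weeks, with a weekly cycle.
days = 20 * 7
t = np.arange(days)
influent = 1000 + 150 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 40, days)

# Fold the series into a (weeks x weekdays) table and map values to colours.
table = influent.reshape(20, 7)
plt.imshow(table, aspect="auto", cmap="viridis")
plt.colorbar(label="influent volume")
plt.xlabel("day of week")
plt.ylabel("week")
plt.title("Time series grouped into a colour-scaled table")
plt.show()
```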

Keywords: graphical analysis, time series, seasonality, noisy environmental data

8537 Comparative Study of the Static and Dynamic Analysis of Multi-Storey Irregular Building

Authors: Bahador Bagheri, Ehsan Salimi Firoozabad, Mohammadreza Yahyaei

Abstract:

As the world moves toward Performance-Based Engineering philosophies in the seismic design of civil engineering structures, new seismic design provisions require structural engineers to perform both static and dynamic analysis for the design of structures. While linear Equivalent Static Analysis is performed for regular buildings up to 90 m height in zones I and II, Dynamic Analysis should be performed for regular and irregular buildings in zones IV and V. Dynamic Analysis can take the form of a dynamic Time History Analysis or a linear Response Spectrum Analysis. In the present study, multi-storey irregular buildings with 20 stories have been modeled using the software packages ETABS and SAP 2000 v.15 for seismic zone V in India. This paper also examines the effect of varying building height on the structural response of the shear wall building. Dynamic responses of the building under the actual earthquake records El Centro 1949 and Chi-Chi (Taiwan) 1999 have been investigated. The paper highlights the accuracy of Time History Analysis in comparison with the more commonly adopted Response Spectrum Analysis and Equivalent Static Analysis.

Keywords: Equivalent Static Analysis, Time history method, Response spectrum method, Reinforced concrete building, displacement.

8536 Economic Factorial Analysis of CO2 Emissions: The Divisia Index with Interconnected Factors Approach

Authors: Alexander Y. Vaninsky

Abstract:

This paper presents a method of economic factorial analysis of CO2 emissions based on an extension of the Divisia index to interconnected factors. This approach, contrary to the Kaya identity, treats the three main factors of CO2 emissions, namely gross domestic product, energy consumption, and population, as equally important, and allows all of them to be accounted for simultaneously. The three factors are included in the analysis together with their carbon intensities, which gives a comprehensive picture of the change in CO2 emissions. A computer program written in the R language, available for free download, automates the calculations. A case study of U.S. carbon dioxide emissions is used as an example.
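
For orientation only, the sketch below shows a conventional additive log-mean Divisia (LMDI) decomposition of an emissions change into per-factor contributions; the paper's interconnected-factors extension is a different formulation and is not reproduced here, and the figures are invented.

```python
import math

def log_mean(a, b):
    """Logarithmic mean, used as the LMDI weight."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_effects(factors_0, factors_t):
    """Additive LMDI-I decomposition: the change in the product of the factors is
    split into one contribution per factor, and the contributions sum exactly
    to the total change."""
    v0, vt = math.prod(factors_0), math.prod(factors_t)
    w = log_mean(vt, v0)
    return [w * math.log(ft / f0) for f0, ft in zip(factors_0, factors_t)]

# Hypothetical Kaya-style factors: carbon intensity of energy, energy intensity
# of GDP, and GDP; their product is total emissions.
base = (0.060, 5.0, 1.0e4)     # product = 3000 emission units
now  = (0.055, 4.6, 1.2e4)     # product = 3036 emission units
effects = lmdi_effects(base, now)
print(effects, sum(effects), math.prod(now) - math.prod(base))
```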

Keywords: CO2 emissions, Economic analysis, Factorial analysis, Divisia index, Interconnected factors.

8535 Validation of Reverse Engineered Web Application Models

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source code analysis may be very difficult because of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation-based source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so a pruning mechanism is needed.

Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications

8534 Correlational Analysis between Brain Dominances and Multiple Intelligences

Authors: Lakshmi Dhandabani, Rajeev Sukumaran

Abstract:

The aim of this research study is to investigate and establish the characteristics of brain dominances (BD) and multiple intelligences (MI). The experiment was conducted on a sample of 552 undergraduate computer engineering students. In addition, a mathematical formulation is established to express the relation between thinking and intelligence, and its correlation is analyzed. Correlation is measured statistically using Pearson's coefficient. Analysis of the results shows that there is a strong relationship between thinking and intelligence. This research is carried out to improve didactic methods in engineering learning and to improve e-learning strategies.
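
As a small illustration of the statistic used, the sketch below computes Pearson's correlation coefficient for two invented score vectors (not the study's data).

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Hypothetical per-student scores: a brain-dominance index vs. an MI score.
dominance = [0.2, 0.5, 0.7, 0.9, 0.4, 0.6]
intelligence = [55, 62, 74, 80, 60, 70]
print(round(pearson_r(dominance, intelligence), 3))
```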

Keywords: Thinking style assessment, correlational analysis, mathematical model, data analysis, dynamic equilibrium.

8533 Determination of Constant Coefficients to Relate Total Dissolved Solids to Electrical Conductivity

Authors: M. Siosemarde, F. Kave, E. Pazira, H. Sedghi, S. J. Ghaderi

Abstract:

Salinity is a measure of the amount of salts in water. Total Dissolved Solids (TDS), as a salinity parameter, is often determined using laborious and time-consuming laboratory tests, so it may be more appropriate and economical to develop a method that uses a simpler salinity index. Because dissolved ions increase salinity as well as conductivity, the two measures are related. The aim of this research was to determine constant coefficients for predicting Total Dissolved Solids (TDS) from Electrical Conductivity (EC), evaluated with the correlation coefficient, root mean square error, maximum error, mean bias error, mean absolute error, relative error and coefficient of residual mass. For this purpose, two experimental areas (S1, S2) of Khuzestan province, Iran were selected, and four treatments with three replications were applied using series of double rings. The treatments were 25 cm, 50 cm, 75 cm and 100 cm of water application. The results show that 16.3 and 12.4 were the best constant coefficients for predicting TDS from EC in pilots S1 and S2, with correlation coefficients of 0.977 and 0.997 and root mean square errors (RMSE) of 191.1 and 106.1, respectively.
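
A minimal sketch of how such a constant coefficient and the reported error statistics can be obtained, assuming a proportional model TDS ≈ k·EC fitted by least squares through the origin; the paired measurements and units below are invented.

```python
import numpy as np

def fit_tds_ec(ec, tds):
    """Least-squares proportional fit TDS = k * EC (through the origin),
    plus the correlation coefficient and RMSE used to judge the fit."""
    ec, tds = np.asarray(ec, float), np.asarray(tds, float)
    k = (ec @ tds) / (ec @ ec)
    pred = k * ec
    rmse = np.sqrt(np.mean((tds - pred) ** 2))
    r = np.corrcoef(tds, pred)[0, 1]
    return k, r, rmse

# Hypothetical paired measurements: EC in dS/m, TDS in mg/L.
ec  = [1.2, 2.5, 4.0, 6.3, 8.1, 10.4]
tds = [770, 1600, 2500, 4100, 5200, 6700]
k, r, rmse = fit_tds_ec(ec, tds)
print(f"k = {k:.1f}, r = {r:.3f}, RMSE = {rmse:.1f} mg/L")
```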

Keywords: Constant coefficients, electrical conductivity, Khuzestan plain, total dissolved solids.

8532 Optimization of Pretreatment and Enzymatic Saccharification of Cogon Grass Prior Ethanol Production

Authors: Jhalique Jane R. Fojas, Ernesto J. Del Rosario

Abstract:

The dilute acid pretreatment and enzymatic saccharification of a lignocellulosic substrate, cogon grass (Imperata cylindrica, L.), were optimized prior to ethanol fermentation using the simultaneous saccharification and fermentation (SSF) method. The optimum pretreatment conditions (temperature, sulfuric acid concentration, and reaction time) were evaluated by determining the maximum sugar yield at constant enzyme loading. Cogon grass, at 10% w/v substrate loading, has optimum pretreatment conditions of 126°C, 0.6% v/v H2SO4, and a 20 min reaction time. These pretreatment conditions were used to optimize enzymatic saccharification with different enzyme combinations. The maximum saccharification yield of 36.68 mg/mL (71.29% reducing sugar) was obtained using a 25 FPU/g-cellulose cellulase complex combined with 1.1% w/w of cellobiase (ß-glucosidase) and 0.225% w/w of hemicellulase complex, after 96 hours of saccharification. Using the optimum pretreatment and saccharification conditions, SSF of the treated substrates was carried out at 37°C for 120 hours using the industrial yeast strain HBY3, Saccharomyces cerevisiae. The ethanol yield for cogon grass at 4% w/w loading was 9.11 g/L with 5.74 mg/mL total residual sugar.

Keywords: Acid pretreatment, bioethanol, biomass, cogon grass, fermentation, lignocellulose, SSF.

8531 Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall

Authors: C. J. W. Habets, D. J. Peters, J. G. de Gijt, A. V. Metrikine, S. N. Jonkman

Abstract:

Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to investigate the suitability of this method for anchored sheet pile quay walls that were not purposely designed for seismic loads. A research methodology is developed in which pseudo-static, permanent-displacement and finite element analyses are employed, calibrated against an experimental reference case that considers a typical anchored sheet pile wall. A reduction factor that accounts for deformation behaviour is determined for pseudo-static analysis, a model to apply traditional permanent-displacement analysis to anchored sheet pile walls is proposed, and dynamic analysis is successfully carried out. From the research, it is concluded that PBD evaluation can effectively be used for the seismic analysis and design of this type of structure.

Keywords: Anchored sheet pile quay wall, simplified dynamic analysis, performance-based design, pseudo-static analysis.

8530 The Influence of National Culture on Consumer Buying Behaviour: An Exploratory Study of Nigerian and British Consumers

Authors: Mohamed Haffar, Lombe Ngome Enongene, Mohammed Hamdan, Gbolahan Gbadamosi

Abstract:

Despite the considerable body of literature investigating the influence of National Culture (NC) dimensions on consumer behaviour, there is a lack of studies comparing the influence of NC in Africa with that in Western European countries. This study is intended to fill this gap by exploring how NC affects consumer buying behaviour in Nigeria and the United Kingdom. The primary data were collected through in-depth, semi-structured interviews conducted with three groups of individuals: British students, Nigerian students in the United Kingdom, and Nigeria-based students. This approach to analysing culture and consumer behaviour could help in understanding the residual cultural threads of people, ingrained in their being, irrespective of their exposure to other cultures. The findings of this study show that Nigerian and British consumers differ remarkably in cultural orientations such as symbols, values and psychological standpoints, which ultimately affects the choices made at every stage of the decision-making process and proves useful for international retail marketing.

Keywords: National culture, consumer behaviour, international business, Nigeria, UK.

8529 Reversible Watermarking on Stereo Image Sequences

Authors: John N. Ellinas

Abstract:

In this paper, a new reversible watermarking method is presented that reduces the size of a stereoscopic image sequence while keeping its content visible. The proposed technique embeds the residuals of the right frames into the corresponding frames of the left sequence, halving the total capacity. The residual frames may result from a disparity-compensated procedure between the two video streams or from joint motion and disparity compensation, and they are usually lossy-compressed before embedding because of the limited embedding capacity of the left frames. The watermarked frames remain visible at high quality, and at any instant the stereoscopic video may be recovered by an inverse process. In fact, the left frames may be recovered exactly, whereas the right ones are slightly distorted because the residuals are not embedded intact. The employed embedding method reorders the left frame into an array of consecutive pixel pairs and embeds a number of bits according to their intensity difference. In this way, it hides few bits in intensity-smooth areas and most of the data in textured areas, where the resulting distortions are less visible. The experimental evaluation demonstrates that the proposed scheme is quite effective.
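
As a reference point for the pixel-pair idea, the sketch below shows classical difference-expansion embedding on a single pixel pair (one bit per pair, exactly invertible); the paper's scheme embeds a variable number of bits depending on the intensity difference, which this sketch does not reproduce.

```python
def embed_pair(x, y, bit):
    """Difference-expansion embedding: hide one bit in the expanded difference
    of a pixel pair; the pair mean is preserved so the pair can be restored exactly."""
    mean = (x + y) // 2
    diff = x - y
    diff_marked = 2 * diff + bit
    return mean + (diff_marked + 1) // 2, mean - diff_marked // 2

def extract_pair(x_m, y_m):
    """Recover the hidden bit and the original pixel pair."""
    mean = (x_m + y_m) // 2
    diff_marked = x_m - y_m
    bit = diff_marked & 1
    diff = diff_marked >> 1   # floor division by 2 (also correct for negatives)
    return bit, mean + (diff + 1) // 2, mean - diff // 2

x, y = 118, 112
for b in (0, 1):
    xm, ym = embed_pair(x, y, b)
    print((xm, ym), extract_pair(xm, ym))   # round-trips back to (b, 118, 112)
```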

Keywords: Stereoscopic video, Reversible watermarking, Disparity compensation, Joint compensation

8528 Analysis of Medical Data using Data Mining and Formal Concept Analysis

Authors: Anamika Gupta, Naveen Kumar, Vasudha Bhatnagar

Abstract:

This paper focuses on analyzing medical diagnostic data using classification rules from data mining and context reduction from formal concept analysis. This helps in finding redundancies among the various medical examination tests used in the diagnosis of a disease. Classification rules have been derived from positive and negative association rules using the concept lattice structure of Formal Concept Analysis. The context reduction technique of Formal Concept Analysis, together with the classification rules, has been used to find redundancies among the various medical examination tests and to determine whether expensive medical tests can be replaced by cheaper ones.

Keywords: Data Mining, Formal Concept Analysis, Medical Data, Negative Classification Rules.
