Search results for: Charnes Cooper & Rhodes (CCR) Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7436


476 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications

Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison

Abstract:

In this paper, a detailed simulation model of a solar-powered triple-effect LiBr–H2O absorption chiller is developed to supply both the cooling and heating demand of a large-scale building, aiming to reduce fossil fuel consumption and greenhouse gas emissions in the building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system's annual primary energy consumption and total cost, which are treated as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to an annual primary energy saving of 0.69 GWh and an annual CO2 emissions reduction of ~166 tonnes, compared to a conventional HVAC system. The economics of this design, however, are not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required in order for these technologies to achieve satisfactory payback periods within the lifetime of the plant.
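
As an illustration of the optimization step described above, the following is a minimal Pareto-based genetic algorithm sketch in Python. The design variables (collector area and storage volume), their bounds, and the surrogate energy/cost functions are hypothetical stand-ins for the TRNSYS simulation outputs, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two illustrative design variables: collector area [m2] and storage volume [m3].
BOUNDS = np.array([[200.0, 1200.0],
                   [5.0, 60.0]])

def objectives(x):
    """Toy surrogate for the TRNSYS outputs: (primary energy [GWh/yr], total cost [M$]).
    In the paper these values come from the annual simulation; here they are made-up trends."""
    area, vol = x
    energy = 1.5 - 0.0009 * area - 0.004 * vol   # more solar -> less primary energy
    cost = 0.5 + 0.0016 * area + 0.01 * vol      # more solar -> higher cost
    return np.array([energy, cost])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def pareto_front(pop):
    f = np.array([objectives(x) for x in pop])
    keep = [i for i in range(len(pop))
            if not any(dominates(f[j], f[i]) for j in range(len(pop)) if j != i)]
    return pop[keep]

def evolve(pop_size=40, generations=60):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 2))
    for _ in range(generations):
        parents = pareto_front(pop)                 # keep only non-dominated designs
        children = []
        while len(children) < pop_size:             # refill by crossover + Gaussian mutation
            a, b = parents[rng.integers(len(parents), size=2)]
            child = 0.5 * (a + b) + rng.normal(0, 0.02, 2) * (BOUNDS[:, 1] - BOUNDS[:, 0])
            children.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
        pop = np.array(children)
    return pareto_front(pop)

for x in evolve()[:5]:
    e, c = objectives(x)
    print(f"area={x[0]:7.1f} m2  storage={x[1]:5.1f} m3  energy={e:.2f} GWh  cost={c:.2f} M$")
```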

Keywords: Economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller.

475 Two Lessons Learnt in Defining Intersections and Interfaces in Numerical Modeling with Plaxis

Authors: Mahdi Sadeghian, Somaye Sadeghian, Reza Dinarvand

Abstract:

This paper discusses two issues encountered in using PLAXIS. Both issues were observed while applying PLAXIS to estimate excavation-induced displacement. Column Soil Mixing (CSM) was applied to stabilise the excavation. It was found that the estimated excavation-induced deformation at the top of the CSM blocks depends strongly on the material type used to define the pavement adjacent to the CSM blocks. Defining the pavement as a cohesive material results in an unrealistic connection between the pavement and the CSM, even when an interface element is defined. To find the most realistic approach, the interface was defined in three different ways: (1) no interface elements were applied, (2) a non-cohesive soil layer was defined between the pavement and the CSM block to represent the friction between these materials, and (3) the built-in interface elements in PLAXIS were used to define the boundary between the pavement and the CSM block. The results showed that option 2 gives the most realistic results. The second issue concerned the modelling of the contact line between the CSM block and an inclined layer underneath. The analysis results showed that the excavation-induced deformation depends strongly on how the PLAXIS user defines the contact area. If the contact area is defined as a point at which the CSM block intersects the underlying layer, the estimated lateral displacement of the CSM block is unrealistically lower than in a model in which the contact area is defined as a line.

Keywords: PLAXIS, FEM, CSM, excavation-induced deformation.

474 Financing Decision and Productivity Growth for the Venture Capital Industry Using High-Order Fuzzy Time Series

Authors: Shang-En Yu

Abstract:

Human society faces many uncertainties, such as forecasting economic growth rates during a financial crisis. Since Song and Chissom introduced the concept of fuzzy time series in 1993, many scholars have applied different fuzzy time series models to such problems. Previous studies, however, usually do not consider the selection of relevant variables and base the fuzzy discretization solely on subjective opinion, so they cannot objectively reflect the characteristics of the data set; in addition, when forecasting, they often treat all fuzzy rules as equally important and fail to consider the importance of each rule. For these reasons, this study performs variable (factor) selection through a self-organizing map (SOM) and proposes a high-order weighted multivariate fuzzy time series model based on a fuzzy back-propagation neural network (Fuzzy-BPN), using the ordered weighted averaging (OWA) operator for weighted prediction. To verify the proposed method, the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) of the Taiwan Stock Exchange Corporation was used as the forecast target, and the appropriate variables were filtered in the experiment. Finally, comparison with models from other recent studies showed that the predictive ability of the proposed approach is further improved.
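
The ordered weighted averaging (OWA) operator mentioned above can be summarized in a few lines. The sketch below is a generic OWA aggregation in Python; the forecast values and weights are illustrative only and are not taken from the study.

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: sort the inputs in descending order,
    then take a weighted sum with position-based (not source-based) weights."""
    values = np.sort(np.asarray(values, dtype=float))[::-1]
    weights = np.asarray(weights, dtype=float)
    assert len(weights) == len(values) and np.isclose(weights.sum(), 1.0)
    return float(values @ weights)

# Example: three fuzzy-rule forecasts of the next TAIEX level, with weights
# that emphasise the larger outputs. Numbers are illustrative only.
forecasts = [8410.0, 8395.0, 8422.0]
weights = [0.5, 0.3, 0.2]
print(owa(forecasts, weights))   # position-weighted aggregate forecast
```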

Keywords: Heterogeneity, residential mortgage loans, foreclosure.

473 Robust Integrated Design for a Mechatronic Feed Drive System of Machine Tools

Authors: Chin-Yin Chen, Chi-Cheng Cheng

Abstract:

This paper aims to develop a robust optimization methodology for the mechatronic modules of machine tools by considering all important characteristics from the structural and control domains in a single process. The relationship between these two domains is strongly coupled. In order to reduce the disturbance caused by parameters in either one, the mechanical and controller design domains need to be integrated. Therefore, the concurrent integrated design method, design for control (DFC), is employed in this paper. It is applied not only to achieve minimal power consumption but also to enhance structural performance and system response at the same time. To investigate the method for integrated optimization, a mechatronic feed drive system of a machine tool is used as the design platform. Pro/ENGINEER and ANSYS are first used to build the 3D model and to analyze and design structural parameters such as elastic deformation, natural frequency and component size, based on their effects on and sensitivities to the structure. In addition, a robust controller based on Quantitative Feedback Theory (QFT) is applied to determine proper control parameters. The overall physical properties of the machine tool are thus obtained in the initial stage. Finally, design for control is carried out to modify the structural and control parameters to achieve overall system performance. Hence, the corresponding productivity is expected to be greatly improved.

Keywords: Machine tools, integrated structure and control design, design for control, multilevel decomposition, quantitative feedback theory.

472 Hydrogen-Fueled Micro-Thermophotovoltaic Power Generator: Flame Regimes and Flame Stability

Authors: Hosein Faramarzpour

Abstract:

This work presents the optimum operational conditions for a hydrogen-based micro-scale power source, using a verified mathematical model including fluid dynamics and reaction kinetics. The stable operational flame regime is then pursued as a key factor in optimizing the design of micro-combustors. The results show that, with increasing velocity, four H2 flame regimes develop in the micro-combustor: 1) a periodic ignition-extinction regime, 2) a steady symmetric regime, 3) a pulsating asymmetric regime, and 4) a steady asymmetric regime. The first regime, which appears at an inlet velocity of 0.8 m/s, is a periodic ignition-extinction regime characterized by counter flows and tulip-shaped flames. For flow velocities above 0.2 m/s, the flame shifts downstream and the combustion regime switches to a steady symmetric flame, where the temperature increases considerably due to the increased rate of incoming energy. Further elevation of the flow velocity up to 1 m/s leads to the formation of a pulsating asymmetric flame, which is associated with pulses in various flame properties such as temperature and species concentration. Ultimately, when the inlet velocity reached 1.2 m/s, the last regime was observed and a steady asymmetric flame appeared.

Keywords: Thermophotovoltaic generator, micro combustor, micro power generator, combustion regimes, flame dynamic.

471 Estimation of the Park-Ang Damage Index for Floating Column Building with Infill Wall

Authors: Susanta Banerjee, Sanjaya Kumar Patro

Abstract:

Buildings with floating columns are highly undesirable in seismically active areas. Many urban multi-storey buildings today adopt floating columns to accommodate parking at the ground floor or reception lobbies in the first storey. The earthquake forces developed at different floor levels in a building need to be brought down along the height to the ground by the shortest path; any deviation or discontinuity in this load transfer path results in poor performance of the building. Floating column buildings are therefore severely damaged during earthquakes. Damage to such structures can be reduced by taking the effect of infill walls into account. This paper presents the effect of infill wall stiffness on the damage that occurs in a floating column building during ground shaking. Modelling and analysis are carried out with the nonlinear analysis program IDARC-2D. Damage in beams, columns and storeys is studied by formulating a modified Park-Ang model to evaluate damage indices. Overall structural damage indices of the buildings due to ground shaking are also obtained. Dynamic response parameters, i.e. lateral floor displacement, storey drift, time period and base shear, are obtained, and the results are compared with those of ordinary moment-resisting frame buildings. The formation of cracks, yielding and plastic hinges is also observed during the analysis.
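
For reference, the Park-Ang damage index combines the maximum deformation demand with the dissipated hysteretic energy. A minimal Python sketch of the standard formulation is given below; the member values and the beta parameter are illustrative, and the paper's modified formulation may differ in detail.

```python
def park_ang_damage_index(delta_max, delta_ult, hysteretic_energy, yield_strength, beta=0.1):
    """Park-Ang damage index for a member:
        DI = delta_max / delta_ult + beta * E_h / (Q_y * delta_ult)
    delta_max        : maximum deformation under the earthquake
    delta_ult        : ultimate deformation under monotonic loading
    hysteretic_energy: cumulative dissipated hysteretic energy E_h
    yield_strength   : yield strength Q_y
    beta             : strength-degradation parameter (often around 0.05-0.15 for RC)"""
    return delta_max / delta_ult + beta * hysteretic_energy / (yield_strength * delta_ult)

# Illustrative member (consistent units: kN, m, kN*m):
di = park_ang_damage_index(delta_max=0.030, delta_ult=0.060,
                           hysteretic_energy=12.0, yield_strength=250.0)
print(f"DI = {di:.2f}")   # DI approaching 1.0 is usually read as collapse-level damage
```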

Keywords: Floating column, Infill Wall, Park-Ang Damage Index, Damage State.

470 Effects of Process Parameters on the Yield of Oil from Coconut Fruit

Authors: Ndidi F. Amulu, Godian O. Mbah, Maxwel I. Onyiah, Callistus N. Ude

Abstract:

The properties of coconut (Cocos nucifera) and its oil were evaluated in this work using standard analytical techniques. The analyses carried out include the proximate composition of the fruit, extraction of oil from the fruit using different process parameters, and physicochemical analysis of the extracted oil. The results showed the percentage (%) moisture, crude lipid, crude protein, ash and carbohydrate contents of the coconut as 7.59, 55.15, 5.65, 7.35 and 19.51, respectively. The oil from the coconut fruit was an odourless, yellowish liquid at room temperature (30 °C). The treatment combinations used (leaching time, leaching temperature and solute:solvent ratio) showed significant differences (P < 0.05) in the yield of oil from coconut flour. The oil yield ranged between 36.25% and 49.83%. Lipid indices of the coconut oil indicated an acid value (AV) of 10.05 NaOH/g of oil, free fatty acid (FFA) of 5.03%, saponification value (SV) of 183.26 mg KOH/g of oil, iodine value (IV) of 81.00 I2/g of oil, peroxide value (PV) of 5.00 ml/g of oil and viscosity (V) of 0.002. The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and also to generate plots such as single-effect, interaction-effect and contour plots. The response (yield of oil from the coconut flour) was used to develop a mathematical model that correlates the yield with the process variables studied. The conditions that gave the highest yield of coconut oil were a leaching time of 2 h, a leaching temperature of 50 °C and a solute/solvent ratio of 0.05 g/ml.

Keywords: Coconut, oil-extraction, optimization, physicochemical, proximate.

469 Environmental Management in Arid Regions:The Question of Water

Authors: Yousef Bakhbakhi, Mourad Boumaza

Abstract:

Only recently have water ethics received focused interest in the international water community. Because water is metabolically basic to life, an ethical dimension persists in every decision related to water. Water ethics at once express human society's approach to water and act as guidelines for behaviour. Ideas around water are often implicit and embedded as assumptions. They can be entrenched in behaviour and difficult to contest because they are difficult to "see". By explicitly revealing the ethical ideas underlying water-related decisions, human society's relationship with water, and with the natural systems of which water is part, can be contested and shifted, or accepted with conscious intention. In recent decades, improved understanding of water's importance for ecosystem functioning and of its ecological services for human survival is moving us beyond the growth-driven, supply-focused management paradigm. Environmental ethics challenge this paradigm by extending the ethical sphere to the environment and thus to water resources management per se. An ethical approach is a legitimate, important, and often ignored way to effect change in environmental decision making. This qualitative research explores principles of water ethics and examines the underlying ethical precepts of selected water policy examples. The constructed water ethic principles act as a set of criteria against which a policy comparison can be established. This study shows that water resources management becomes a progressive issue when it embraces full public participation, a new planning model, and knowledge-generation initiatives.

Keywords: water resources, environmental management, public participation.

468 Landfill Failure Mobility Analysis: A Probabilistic Approach

Authors: Ali Jahanfar, Brajesh Dubey, Bahram Gharabaghi, Saber Bayat Movahed

Abstract:

Ever-increasing population growth in major urban centers and environmental challenges in siting new landfills have resulted in a growing trend toward the design of mega-landfills, some with extraordinary heights and dangerously steep slopes. Landfill failure mobility risk analysis is one of the most uncertain types of dynamic rheology modelling, owing to the very large inherent variability in the shear strength properties of the heterogeneous solid waste material. The waste flows of three historic dumpsite failures and two landfill failures were back-analyzed using run-out modeling with the DAN-W model. The travel distances of the waste flow during the failures were calculated with an approach that takes the variability in material shear strength properties into account. The probability distribution functions for the shear strength properties of the waste were grouped into four major classes based on waste compaction (landfills versus dumpsites) and composition (high versus low quantity of high-shear-strength materials such as wood, metal, plastic, paper and cardboard in the waste). This paper presents a probabilistic method for estimating the spatial extent of waste avalanches after a potential landfill failure, in order to create maps of vulnerability scores that inform property owners and residents of the level of risk.

Keywords: Landfill failure, waste flow, Voellmy rheology, friction coefficient, waste compaction and type.

467 Investigating the Pedestrian Willingness to Pay to Choose Appropriate Policies for Improving the Safety of Pedestrian Facilities

Authors: Babak Mirbaha, Mahmoud Saffarzadeh, Fatemeh Mohajeri

Abstract:

Road traffic accidents lead to a high rate of death and injury, especially among vulnerable road users such as pedestrians. Improving the safety of pedestrian facilities is a major concern for policymakers because of the high number of pedestrian fatalities and the direct and indirect costs imposed on society. This study focuses on determining the willingness of pedestrians to pay for increased safety while crossing the street. Three different scenarios are presented: crossing the street at a zebra crossing, crossing the street at a zebra crossing with a pedestrian traffic light installed, and crossing via a pedestrian bridge with an escalator. The research was conducted using the stated preferences method. The required data were collected from a questionnaire consisting of three parts: the pedestrian's demographic characteristics, travel characteristics, and the scenarios. Four different payment amounts were presented for each scenario, and a logit model was built for each proposed payment. The results show that sex, age, education, average household income and individual salary have a significant effect on choosing a scenario. Among the policies presented in the questionnaire, the scenario of crossing at a zebra crossing with a pedestrian traffic light, at a willingness to pay of 10,000 Rials, was chosen most frequently, while the scenario of crossing at a zebra crossing alone, at a willingness to pay of 100,000 Rials, was chosen least frequently. For all scenarios, as the payment increases, the willingness to pay decreases.
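
A minimal sketch of the binary logit estimation underlying such willingness-to-pay analysis is shown below, assuming scikit-learn is available. The respondent records, bid levels and covariates are hypothetical; the study's actual model specification may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical stated-preference records: one row per respondent/scenario/bid.
# Columns: bid (thousand Rials), age, monthly income (million Rials), female (0/1).
X = np.array([[10, 34, 30, 0],
              [25, 41, 55, 1],
              [50, 29, 20, 0],
              [100, 52, 80, 1],
              [10, 23, 15, 1],
              [50, 45, 60, 0],
              [100, 38, 35, 0],
              [25, 60, 70, 1]], dtype=float)
y = np.array([1, 1, 0, 0, 1, 1, 0, 0])   # 1 = willing to pay the stated bid

model = LogisticRegression().fit(X, y)
print(dict(zip(["bid", "age", "income", "female"], model.coef_[0])))

# On real survey data, a negative bid coefficient would reproduce the paper's
# finding that willingness to pay falls as the requested payment rises.
```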

Keywords: Pedestrians, willingness to pay, safety, immunization.

466 Stability Optimization of Functionally Graded Pipes Conveying Fluid

Authors: Karam Y. Maalawi, Hanan E.M EL-Sayed

Abstract:

This paper presents an exact analytical model for optimizing the stability of thin-walled, composite, functionally graded pipes conveying fluid. The critical flow velocity at which divergence occurs is maximized for a specified total structural mass in order to ensure the economic feasibility of the attained optimum designs. The composition of the material of construction is optimized by defining the spatial distribution of the volume fractions of the material constituents using piecewise variations along the pipe length. The major aim is to tailor the material distribution in the axial direction so as to avoid the occurrence of divergence instability without the penalty of increased structural mass. Three types of boundary conditions have been examined, namely hinged-hinged, clamped-hinged and clamped-clamped pipelines. The resulting optimization problem has been formulated as a nonlinear mathematical programming problem solved by invoking the MATLAB Optimization Toolbox routines, which implement the constrained function minimization routine "fmincon" interacting with the associated eigenvalue problem routines. The proposed mathematical models have succeeded in maximizing the critical flow velocity without a mass penalty, producing efficient and economic designs with enhanced stability characteristics compared with the baseline designs.
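
The constrained formulation can be sketched in Python with SciPy's minimize as a stand-in for MATLAB's fmincon. The rule-of-mixtures surrogate for the critical velocity, the material properties and the mass target below are illustrative assumptions; the paper solves the exact eigenvalue problem for piecewise-graded pipes instead.

```python
import numpy as np
from scipy.optimize import minimize

L, Mf = 10.0, 8.0                     # pipe length [m], fluid mass per unit length [kg/m]
E = np.array([70e9, 380e9])           # moduli of the two constituents [Pa]
rho = np.array([2700.0, 3960.0])      # densities [kg/m3]
A, I = 2.0e-3, 1.0e-6                 # cross-section area [m2], second moment of area [m4]
M_TARGET = 65.0                       # specified structural mass [kg]

def critical_velocity(v):
    # Surrogate: rule-of-mixtures stiffness of a uniform hinged-hinged pipe,
    # V_cr = (pi/L) * sqrt(E_eff * I / Mf).
    E_eff = v * E[1] + (1 - v) * E[0]
    return np.pi / L * np.sqrt(E_eff * I / Mf)

def mass(v):
    rho_eff = v * rho[1] + (1 - v) * rho[0]
    return rho_eff * A * L

res = minimize(lambda v: -critical_velocity(v[0]),      # maximise V_cr
               x0=[0.3],
               bounds=[(0.0, 1.0)],
               constraints=[{"type": "eq", "fun": lambda v: mass(v[0]) - M_TARGET}],
               method="SLSQP")
print(f"stiff-phase fraction = {res.x[0]:.2f}, V_cr = {critical_velocity(res.x[0]):.1f} m/s")
```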

Keywords: Functionally graded materials, pipe flow, optimum design, fluid-structure interaction.

465 Pushover Analysis of Reinforced Concrete Buildings Using Full Jacket Technics: A Case Study on an Existing Old Building in Madinah

Authors: Tarek M. Alguhane, Ayman H. Khalil, M. N. Fayed, Ayman M. Ismail

Abstract:

The retrofitting of existing buildings to resist seismic loads is very important to avoid loss of life or financial disaster. The aim of retrofitting is to increase the total strength of the structure by increasing its stiffness or ductility ratio. In addition, the response modification factor (R) has to satisfy the code requirements for the suggested retrofitting types. In this study, two types of jackets are used, i.e. full reinforced concrete jackets and surrounding steel plate jackets. The study is carried out on an existing building in Madinah by performing static pushover analysis before and after retrofitting the columns. The selected model represents a typical existing structure built about 30 years ago in Madinah City, KSA. The comparison of the results indicates a good enhancement of the structure with respect to the applied seismic forces. The response modification factor of the RC building is also evaluated for the studied cases before and after retrofitting, and the design of all vertical elements (columns) is given. The results show that the design of the retrofitted columns satisfies the code's design stress requirements. However, for some retrofitting types, the ductility requirements represented by the response modification factor do not satisfy the KSA design code (SBC-301).

Keywords: Concrete jackets, steel jackets, RC buildings, pushover analysis, non-linear analysis.

464 Further Development in Predicting Post-Earthquake Fire Ignition Hazard

Authors: Pegah Farshadmanesh, Jamshid Mohammadi, Mehdi Modares

Abstract:

In nearly all earthquakes of the past century that resulted in moderate to significant damage, the occurrence of post-earthquake fire ignition (PEFI) has imposed a serious hazard and caused severe damage, especially in urban areas. In order to reduce the loss of life and property caused by post-earthquake fires, there is a crucial need for predictive models to estimate PEFI risk. The parameters affecting PEFI risk can be categorized as: 1) factors influencing fire ignition under normal (non-earthquake) conditions, including floor area, building category, ignitability, type of appliance, and prevention devices, and 2) earthquake-related factors contributing to PEFI risk, including building vulnerability and earthquake characteristics such as intensity, peak ground acceleration, and peak ground velocity. State-of-the-art statistical PEFI risk models are based solely on the limited available earthquake data, and therefore they cannot predict PEFI risk for areas with insufficient earthquake records, since such records are needed to estimate the PEFI model parameters. In this paper, the correlation between normal-condition ignition risk, peak ground acceleration, and PEFI risk is examined in an effort to offer a means of predicting post-earthquake ignition events. An illustrative example is presented to demonstrate how such a correlation can be employed in a seismic area to predict PEFI hazard.

Keywords: Fire risk, post-earthquake fire ignition (PEFI), risk management, seismicity.

463 Thermo-Mechanical Approach to Evaluate Softening Behavior of Polystyrene: Validation and Modeling

Authors: Salah Al-Enezi, Rashed Al-Zufairi, Naseer Ahmad

Abstract:

A thermo-mechanical technique was developed to determine the softening point temperature/glass transition temperature (Tg) of polystyrene exposed to high pressures. The design utilizes the ability of carbon dioxide to lower the glass transition temperature of polymers by acting as a plasticizer. In this apparatus, the sorption of carbon dioxide to induce softening of the polymer is performed as a function of temperature and pressure, and the extent of softening is measured in a three-point flexural bending mode. The polymer strip was placed in the cell in contact with a linear variable differential transformer (LVDT). CO2 was pumped into the cell from a supply cylinder to reach high pressure. The results clearly showed the full softening of the samples, accompanied by a large deformation of the polymer strip. The deflection curves are initially relatively flat and then undergo a dramatic increase as the temperature is elevated. It was found that increasing the CO2 pressure shifts the deflection curves to lower temperatures by about 45 K over the pressure range of 0-120 bar. The obtained experimental Tg values were validated against values reported in the literature. Finally, it is concluded that the deflection model fits the generated experimental results consistently and describes in more detail how the central deflection of a thin polymer strip is affected by CO2 diffusion into the polymeric samples.
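
The central deflection measured by the LVDT can be related to the strip's modulus through the classical three-point bending formula. The sketch below assumes simple beam theory; the strip dimensions, load and modulus values are illustrative only.

```python
def central_deflection(load_N, span_m, E_Pa, width_m, thickness_m):
    """Mid-span deflection of a simply supported strip under a central point load
    (classical three-point bending): delta = F*L^3 / (48*E*I), with I = b*h^3/12."""
    I = width_m * thickness_m**3 / 12.0
    return load_N * span_m**3 / (48.0 * E_Pa * I)

# Illustrative polystyrene strip: 40 mm span, 10 mm wide, 1 mm thick, 0.5 N load.
for E in (3.0e9, 1.0e9, 0.1e9):   # modulus falling as CO2 sorption softens the strip
    d = central_deflection(0.5, 0.040, E, 0.010, 0.001)
    print(f"E = {E/1e9:.1f} GPa -> deflection = {d*1000:.3f} mm")
```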

Keywords: Softening, high-pressure, polystyrene, CO2 diffusions.

462 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability of every region. The features of such systems have a great influence on all components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modeling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modeling the sustainability and operational effectiveness of a particular IWMS is not within the scope of the present research. The complexity of these systems and the large number of variables require a complex approach to model the outcomes and future risks. This approach should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modeling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, with all their interconnections. The other is a time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop such time series by content analysis.
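
A minimal sketch of one Fuzzy Cognitive Map iteration is given below; the three factors, the connection weights and the sigmoid steepness are illustrative assumptions, not values from the study.

```python
import numpy as np

def fcm_step(activations, weights, lam=1.0):
    """One Fuzzy Cognitive Map update: A(t+1) = f( A(t) + A(t) @ W ),
    with a sigmoid squashing function f. W[i, j] is the signed influence
    of factor i on factor j, as read from the connection matrix."""
    raw = activations + activations @ weights
    return 1.0 / (1.0 + np.exp(-lam * raw))

# Three illustrative IWMS factors: [public awareness, collection coverage, recycling rate]
W = np.array([[0.0, 0.4, 0.6],
              [0.0, 0.0, 0.5],
              [0.2, 0.0, 0.0]])
A = np.array([0.6, 0.5, 0.3])
for _ in range(20):
    A = fcm_step(A, W)
print(np.round(A, 3))   # quasi-steady-state activation pattern of the map
```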

Keywords: Content analysis, factors, integrated waste management system, time series.

461 Allometric Models for Biomass Estimation in Savanna Woodland Area, Niger State, Nigeria

Authors: Abdullahi Jibrin, Aishetu Abdulkadir

Abstract:

The development of allometric models is crucial for accurate forest biomass/carbon stock assessment. The aim of this study was to develop a set of biomass prediction models that enable the determination of total tree above-ground biomass for a savanna woodland area in Niger State, Nigeria. Based on data collected through biometric measurements of 1,816 trees and destructive sampling of 36 trees, five species-specific models and one site-specific model were developed. The sample size was distributed equally among the five most dominant species in the study site (Vitellaria paradoxa, Irvingia gabonensis, Parkia biglobosa, Anogeissus leiocarpus, Pterocarpus erinaceus). First, equations were developed for the five individual species; second, the five species were pooled to develop a mixed-species allometric equation. Overall, there was a strong positive relationship between total tree biomass and stem diameter. Coefficients of determination (R2) ranging from 0.93 to 0.99 (P < 0.001) were realised for the models, with considerably low standard errors of the estimates (SEE), which confirms that total tree above-ground biomass has a significant relationship with DBH. F-test values for the biomass prediction models were also significant at P < 0.001, which indicates that the models are valid. This study recommends that, for improved biomass estimates in the study site, the site-specific biomass models should preferably be used instead of generic models.
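
The usual log-log allometric fit can be sketched as below; the DBH-biomass pairs are hypothetical stand-ins for the destructive-sampling data, and the fitted coefficients are not those reported in the study.

```python
import numpy as np

# Hypothetical destructive-sampling data: stem diameter DBH [cm] and measured
# above-ground biomass AGB [kg] (the study used 36 destructively sampled trees).
dbh = np.array([8, 12, 15, 20, 25, 30, 38, 45], dtype=float)
agb = np.array([14, 40, 75, 160, 290, 480, 900, 1400], dtype=float)

# Fit the usual log-log allometric form: ln(AGB) = a + b * ln(DBH)
b, a = np.polyfit(np.log(dbh), np.log(agb), 1)

pred = np.exp(a + b * np.log(dbh))
ss_res = np.sum((np.log(agb) - np.log(pred))**2)
ss_tot = np.sum((np.log(agb) - np.log(agb).mean())**2)
print(f"AGB = {np.exp(a):.3f} * DBH^{b:.2f},  R^2 = {1 - ss_res/ss_tot:.3f}")
```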

Keywords: Allometry, biomass, carbon stock, model, regression equation, woodland, inventory.

460 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper utilized Poisson-modulated weighted generalized Laguerre polynomials with controlling parameters under uncorrelated noise assumptions. In this paper, we investigate the statistics of a multi-diversity, stochastically local area fading channel in which the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.

Keywords: Cellular communication, femto- and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process.

459 Investigation of the Effect of Impulse Voltage to Flashover by Using Water Jet

Authors: Harun Gülan, Muhsin Tunay Gencoglu, Mehmet Cebeci

Abstract:

The main function of the insulators used in high voltage (HV) transmission lines is to insulate the energized conductor from the pole and hence from the ground. However, when the insulators fail to perform this insulation function due to various effects, failures occur. The deterioration of the insulation results either from breakdown or from surface flashover. Surface flashover is caused by a pollution layer that makes the surface of the insulator conductive, formed by salt, carbonaceous compounds, rain, moisture, fog, dew, industrial pollution and desert dust. The source of the majority of failures and interruptions in HV lines is surface flashover. This threatens the continuity of supply and causes significant economic losses. Pollution flashover in HV insulators is still a serious problem that has not been fully resolved. In this study, a water jet test system was established in order to investigate the behaviour of insulators under polluted conditions and to determine their flashover performance. The flashover behaviour of the insulators is examined by applying impulse voltages in the test system. This study aims to investigate insulator behaviour under high impulse voltages. For this purpose, a water jet test system was installed, and experimental results were obtained from a real system and analyzed. By using the water jet test system instead of an actual insulator, damage to the insulator as a result of flashover under impulse voltage was prevented. The results from the test system played an important role in determining insulator behaviour and provided predictability.

Keywords: Insulator, pollution flashover, high impulse voltage, water jet model.

458 Single Phase 13-Level D-STATCOM Inverter with Distributed System

Authors: R. Kamalakannan, N. Ravi Kumar

Abstract:

Global energy consumption is increasing persistently, and distributed power generation through renewable energy is essential. To meet consumers' power requirements without voltage fluctuations and losses, the modeling and design of a multilevel inverter with Flexible AC Transmission System (FACTS) capability is presented. The presented inverter uses a 13-level cascaded H-bridge topology of Insulated Gate Bipolar Transistors (IGBTs) together with an inbuilt Distributed Static Synchronous Compensator (DSTATCOM). The DSTATCOM provides power factor control and stability at local feeder lines, and the inverter eliminates Total Harmonic Distortion (THD). The 13-level inverter utilizes 52 switches; each H-bridge is fed with a separate DC source, and the Pulse Width Modulation (PWM) technique is used for switching the IGBTs. The implemented control strategy transmits active power to the grid while maintaining a stable power factor and achieving steady-state power transmission. A significant outcome of this work is improved output voltage quality with steady-state power transmission and low THD. Simulation of the inverter with DSTATCOM is performed in the MATLAB/Simulink environment. A scaled prototype of the proposed inverter was built and its results were validated against the simulated results.

Keywords: FACTS devices, distributed-Static synchronous compensators, DSTATCOM, total harmonics elimination, modular multilevel converter.

457 Design of QFT-Based Self-Tuning Deadbeat Controller

Authors: H. Mansor, S. B. Mohd Noor

Abstract:

This paper presents a design method for self-tuning Quantitative Feedback Theory (QFT) control using an improved deadbeat control algorithm. QFT is a technique to achieve robust control with pre-defined specifications, whereas deadbeat is an algorithm that can bring the output to steady state with a minimum number of steps. Nevertheless, there are usually large peaks in the deadbeat response. By integrating the QFT specifications into the deadbeat algorithm, the large peaks can be tolerated. On the other hand, combining QFT with an adaptive element produces a robust controller with wider coverage of uncertainty. By combining the QFT-based deadbeat algorithm and an adaptive element, a superior controller, called the self-tuning QFT-based deadbeat controller, can be achieved. An output response that is fast, robust and adaptive is expected. Using a grain dryer plant model as a pilot case study, the performance of the proposed method has been evaluated and analyzed. The grain drying process is very complex, with highly nonlinear behaviour, long delay, and sensitivity to environmental changes and disturbances. Performance comparisons have been performed between the proposed self-tuning QFT-based deadbeat, standard QFT and standard deadbeat controllers. The efficiency of the self-tuning QFT-based deadbeat controller has been proven by the test results: the controller's parameters are updated online, and the percentage overshoot and settling time are reduced, especially when there are variations in the plant.

Keywords: Deadbeat control, quantitative feedback theory (QFT), robust control, self-tuning control.

456 Understanding and Designing Situation-Aware Mobile and Ubiquitous Computing Systems

Authors: Kai Häussermann, Christoph Hubig, Paul Levi, Frank Leymann, Oliver Siemoneit, Matthias Wieland, Oliver Zweigle

Abstract:

Using spatial models as a shared common basis of information about the environment for different kinds of context-aware systems has been a heavily researched topic in recent years. That research has focused on how to create, update, and merge spatial models so as to enable highly dynamic, consistent and coherent spatial models at large scale. In this paper, however, we concentrate on how context-aware applications can use this information to adapt their behavior according to the situation they are in. The main idea is to provide the spatial model infrastructure with a situation recognition component based on generic situation templates. A situation template, as part of a much larger situation template library, is an abstract, machine-readable description of a certain basic situation type, which can be used by different applications to evaluate their situation. In this paper, different theoretical and practical issues (technical, ethical and philosophical) important for understanding and developing situation-dependent systems based on situation templates are discussed. A basic system design is presented which allows reasoning with uncertain data using an improved version of a learning algorithm for the automatic adaptation of situation templates. Finally, to support the development of adaptive applications, we present a new situation-aware adaptation concept based on workflows.

Keywords: context-awareness, ethics, facilitation of system use through workflows, situation recognition and learning based on situation templates and situation ontologies, theory of situation-aware systems

455 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using Approach Vague Goal Programming

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment. The vague environment expresses uncertainty, because there is always measurement error in the real world. Therefore, in this study, the conditions are examined in a vague, distance-based environment. The disposable essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four response factors: the averages of the disposable glasses' weights, heights, crater diameters, and volumes. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which accounts for the uncertainty of the initial information when some of the model parameters are vague; in addition, a fuzzy regression model is used to predict the responses of the four described factors. The optimization results show that the process capability index values for the disposable glasses' average weights, heights, crater diameters and volumes were improved. This increases the quality of the products and reduces waste, which lowers the cost of the finished product and ultimately brings customer satisfaction, and this satisfaction will mean increased sales.
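
A minimal (crisp) weighted goal programming sketch is shown below using SciPy's linear programming routine; the two decision variables, the linear response functions and the goal targets are illustrative assumptions, and the study's vague/fuzzy formulation adds distance-based uncertainty on top of this basic structure.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables x1, x2 (illustrative machine settings) plus deviation
# variables (d1-, d1+, d2-, d2+) for two quality goals:
#   Goal 1: 2*x1 +   x2 ~ 10   (e.g. target mean glass weight)
#   Goal 2:   x1 + 3*x2 ~ 12   (e.g. target mean glass volume)
c = np.array([0, 0, 1, 1, 1, 1], dtype=float)        # minimise total goal deviation
A_eq = np.array([[2, 1, 1, -1, 0, 0],
                 [1, 3, 0, 0, 1, -1]], dtype=float)  # goal constraints with deviations
b_eq = np.array([10.0, 12.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6, method="highs")
x1, x2, *devs = res.x
print(f"x1 = {x1:.2f}, x2 = {x2:.2f}, total deviation = {res.fun:.3f}")
```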

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

454 Software Vulnerability Markets: Discoverers and Buyers

Authors: Abdullah M. Algarni, Yashwant K. Malaiya

Abstract:

Some of the key aspects of vulnerability—discovery, dissemination, and disclosure—have received some attention recently. However, the role of interaction among the vulnerability discoverers and vulnerability acquirers has not yet been adequately addressed. Our study suggests that a major percentage of discoverers, a majority in some cases, are unaffiliated with the software developers and thus are free to disseminate the vulnerabilities they discover in any way they like. As a result, multiple vulnerability markets have emerged. In some of these markets, the exchange is regulated, but in others, there is little or no regulation. In recent vulnerability discovery literature, the vulnerability discoverers have remained anonymous individuals. Although there has been an attempt to model the level of their efforts, information regarding their identities, modes of operation, and what they are doing with the discovered vulnerabilities has not been explored.

Reports of buying and selling of the vulnerabilities are now appearing in the press; however, the existence of such markets requires validation, and the natures of the markets need to be analyzed. To address this need, we have attempted to collect detailed information. We have identified the most prolific vulnerability discoverers throughout the past decade and examined their motivation and methods. A large percentage of these discoverers are located in Eastern and Western Europe and in the Far East. We have contacted several of them in order to collect firsthand information regarding their techniques, motivations, and involvement in the vulnerability markets. We examine why many of the discoverers appear to retire after a highly successful vulnerability-finding career. The paper identifies the actual vulnerability markets, rather than the hypothetical ideal markets that are often examined. The emergence of worldwide government agencies as vulnerability buyers has significant implications. We discuss potential factors that can impact the risk to society and the need for detailed exploration.

Keywords: Risk management, software security, vulnerability discoverers, vulnerability markets.

453 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity

Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Mujeeb Ur Rehman, Saifur Rahman Sabuj

Abstract:

This paper examines the relationships between solar activity and earthquakes by applying machine learning techniques: K-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, as well as the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
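
A minimal sketch of the regression comparison is given below, assuming scikit-learn is available; the synthetic feature/target data and hyperparameters are illustrative only, and the LSTM model reported as best in the study is omitted for brevity.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in for the merged catalogue: daily sunspot number, solar wind
# speed [km/s], proton density [1/cm^3] -> daily count of earthquakes in a zone.
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(0, 250, 2000),
                     rng.uniform(250, 800, 2000),
                     rng.uniform(0.5, 20, 2000)])
y = rng.poisson(2 + 0.004 * X[:, 0])     # weak illustrative dependence only

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, model in [("kNN", KNeighborsRegressor(n_neighbors=7)),
                    ("SVR", SVR(C=10.0)),
                    ("Random forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X_tr, y_tr)
    print(f"{name:14s} MAE = {mean_absolute_error(y_te, model.predict(X_te)):.3f}")
```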


Keywords: K-Nearest Neighbour, Support Vector Regression, Random Forest Regression, Long Short-Term Memory Network, earthquakes, solar activity, sunspot number, solar wind, solar flares.

452 Strategic Thinking to Change Behavior and Improve Sanitation in Jodipan and Kesatrian, Malang, East Java, Indonesia

Authors: Prasanti Widyasih Sarli, Prayatni Soewondo

Abstract:

Greater access to sanitation in developing countries is urgent. However, even though sanitation is crucial, the overall budget for sanitation is limited. Given this budget limitation, it is important to (1) allocate resources strategically to maximize impact and (2) take communal agency into account as a potential source of sanitation improvements. The Jodipan and Kesatrian Project in Malang, Indonesia is an interesting alternative for solving the sanitation problem, in which resources were allocated strategically and communal agency was also observed. Although the project's initial goal was only to improve the visual situation in the slums, the area became a new tourist destination, and the economic benefit that came with it also changed the behavior of residents and the government towards sanitation. The project also grew from including only the Kesatrian Village to encompassing the Jodipan Village in the course of less than a year. To investigate the success of this project, a descriptive model is used in this paper, and data are drawn from intensive interviews with the initiators of the project, residents affected by the project, and government officials. In this research it is argued that three points mark the success of the project: (1) the strategic initial impact due to the choice of location, (2) the influx of tourists that triggered behavioral change among residents, and (3) the direct economic impact, which ensured its sustainability and growth by gaining government officials' support and attention for more public spending in the area for slum development and sanitation improvement.

Keywords: Behavior change, sanitation, slum, strategic thinking.

451 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators

Authors: Andrea Bellucci, Martina Tofi

Abstract:

The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance, using its strength in the distribution channel while the market share of independent agents is decreasing. Starting with the main business models of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design business model profiles. The result of the analysis is a representation of the main business models described by their indicator profiles. In this way, an unsupervised analysis is developed; its limitation is its judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
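
A minimal sketch of Ward's hierarchical clustering on standardized balance sheet indicators is shown below using SciPy; the companies, indicators and number of clusters are illustrative assumptions, not the ANIA data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Rows = life bancassurance companies, columns = balance sheet indicators
# (e.g. premium growth, expense ratio, ROE). Values are illustrative only.
indicators = np.array([[0.12, 0.05, 0.08],
                       [0.10, 0.06, 0.07],
                       [0.30, 0.02, 0.15],
                       [0.28, 0.03, 0.14],
                       [0.05, 0.09, 0.03]])

Z = linkage(zscore(indicators), method="ward")   # Ward's minimum-variance criterion
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram into 2 business models
print(labels)                                    # cluster membership of each company
```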

Keywords: Balance sheet indicators, Bancassurance, business models, ward algorithm.

450 Low Resolution Face Recognition Using Mixture of Experts

Authors: Fatemeh Behjati Ardakani, Fatemeh Khademian, Abbas Nowzari Dalini, Reza Ebrahimpour

Abstract:

Human activity is a major concern in a wide variety of applications, such as video surveillance, human-computer interfaces and face image database management. Detecting and recognizing faces is a crucial step in these applications. Furthermore, major advancements and initiatives in security applications in recent years have propelled face recognition technology into the spotlight. The performance of existing face recognition systems declines significantly if the resolution of the face image falls below a certain level. This is especially critical in surveillance imagery where, often and for many reasons, only low-resolution video of faces is available. If these low-resolution images are passed to a face recognition system, the performance is usually unacceptable. Hence, resolution plays a key role in face recognition systems. In this paper we introduce a new low resolution face recognition system based on a mixture of expert neural networks. To produce the low resolution input images, we down-sampled the 48 × 48 ORL images to 12 × 12 using nearest neighbor interpolation and then applied bicubic interpolation to yield enhanced images, which are given to the Principal Component Analysis feature extractor. Comparison with some of the most related methods indicates that the proposed model yields an excellent recognition rate in low resolution face recognition, namely 100% for the training set and 96.5% for the test set.
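
The preprocessing chain (nearest-neighbour down-sampling, bicubic up-sampling, PCA) can be sketched as below; for brevity the mixture-of-experts classifier is replaced here by a simple nearest-neighbour classifier, and the loading of the ORL images is assumed to be done elsewhere.

```python
import numpy as np
from PIL import Image
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def degrade_and_restore(img48):
    """Reproduce the preprocessing: a 48x48 image is reduced to 12x12 by
    nearest-neighbour down-sampling, then restored to 48x48 with bicubic
    interpolation, and finally flattened into a feature vector."""
    small = img48.resize((12, 12), Image.NEAREST)
    return np.asarray(small.resize((48, 48), Image.BICUBIC), dtype=float).ravel()

def train_and_score(faces_train, faces_test, n_components=40):
    """faces_train / faces_test: lists of (48x48 PIL image, subject_id) pairs,
    e.g. loaded from the ORL set elsewhere."""
    X_tr = np.array([degrade_and_restore(im) for im, _ in faces_train])
    X_te = np.array([degrade_and_restore(im) for im, _ in faces_test])
    y_tr = [label for _, label in faces_train]
    y_te = [label for _, label in faces_test]
    pca = PCA(n_components=n_components).fit(X_tr)          # eigenface features
    clf = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X_tr), y_tr)
    return clf.score(pca.transform(X_te), y_te)             # recognition rate
```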

Keywords: Low resolution face recognition, Multilayered neural network, Mixture of experts neural network, Principal component analysis, Bicubic interpolation, Nearest neighbor interpolation.

449 Prediction of Compressive Strength of Concrete from Early Age Test Result Using Design of Experiments (RSM)

Authors: Salem Alsanusi, Loubna Bentaher

Abstract:

Response Surface Methods (RSM) provide statistically validated predictive models that can be manipulated to find optimal process configurations. Variation transmitted to the responses from poorly controlled process factors can be accounted for by the mathematical technique of propagation of error (POE), which facilitates 'finding the flats' on the surfaces generated by RSM. The dual response approach to RSM captures the standard deviation of the output as well as the average, and accounts for unknown sources of variation. Dual response plus propagation of error provides a more useful model of overall response variation. In our case, we implemented this technique to predict the 28-day compressive strength of concrete, since waiting 28 days is time-consuming while quality control must still be ensured. This paper therefore investigates the potential of using design of experiments (DOE-RSM) to predict the compressive strength of concrete at the 28th day. The data used for this study were obtained from experimental schemes at the University of Benghazi, Civil Engineering Department. A total of 114 data sets were used. The ACI mix design method was utilized for the mix design. No admixtures were used; only the main concrete mix constituents, i.e. cement, coarse aggregate, fine aggregate and water, were utilized in all mixes, with different mix proportions of the ingredients and different water-cement ratios. The proposed mathematical models are capable of predicting the required 28-day compressive strength of concrete from early-age results.
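
A minimal sketch of fitting a second-order (quadratic) response surface by least squares is given below; the early-age strength and water-cement ratio data are hypothetical, not the 114 laboratory records used in the study.

```python
import numpy as np

# Hypothetical records: x1 = 7-day strength [MPa], x2 = w/c ratio,
# y = 28-day strength [MPa].
x1 = np.array([18, 22, 25, 28, 31, 35, 20, 27], dtype=float)
x2 = np.array([0.55, 0.50, 0.48, 0.45, 0.42, 0.40, 0.52, 0.46])
y = np.array([27, 32, 36, 41, 45, 50, 30, 39], dtype=float)

# Full second-order response surface: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(x1_new, x2_new):
    return coef @ np.array([1, x1_new, x2_new, x1_new**2, x2_new**2, x1_new * x2_new])

print(f"predicted 28-day strength: {predict(30, 0.45):.1f} MPa")
```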

Keywords: Mix proportioning, response surface methodology, compressive strength, optimal design.

448 Improving the Shunt Active Power Filter Performance Using Synchronous Reference Frame PI Based Controller with Anti-Windup Scheme

Authors: Consalva J. Msigwa, Beda J. Kundy, Bakari M. M. Mwinyiwiwa

Abstract:

In this paper, the reference current for the Voltage Source Converter (VSC) of the Shunt Active Power Filter (SAPF) is generated using the synchronous reference frame method, incorporating a PI controller with an anti-windup scheme. The proposed method improves harmonic filtering by compensating for the windup phenomenon caused by the integral term of the PI controller. Using the reference frame transformation, the current is transformed from the a-b-c stationary frame to the rotating 0-d-q frame. Using the PI controller, the current in the 0-d-q frame is controlled to obtain the desired reference signal. A controller with integral action combined with an actuator that becomes saturated can produce undesirable effects. If the control error is so large that the integrator saturates the actuator, the feedback path becomes ineffective, because the actuator will remain saturated even if the process output changes. The integrator, being an unstable system, may then integrate to a very large value, a phenomenon known as integrator windup. Implementing an integrator anti-windup circuit turns off the integrator action when the actuator saturates, hence improving the performance of the SAPF and dynamically compensating harmonics in the power network. In this paper the system performance is examined with a Shunt Active Power Filter simulation model.
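
A minimal sketch of a discrete PI controller with anti-windup is given below; it uses conditional integration, which is one common anti-windup variant, and the gains, limits and sample time are illustrative assumptions rather than the paper's tuning.

```python
class AntiWindupPI:
    """Discrete PI controller with conditional-integration anti-windup:
    the integrator is frozen whenever the computed output is saturated and the
    error would push it further into saturation."""
    def __init__(self, kp, ki, dt, u_min, u_max):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def update(self, error):
        u_unsat = self.kp * error + self.ki * self.integral
        u = min(max(u_unsat, self.u_min), self.u_max)     # actuator saturation
        # Integrate only if the output is not saturated, or if the error drives
        # the output back toward the linear region (conditional integration).
        if u == u_unsat or error * u_unsat < 0:
            self.integral += error * self.dt
        return u

# Illustrative use on the d-axis current error of the SAPF reference generator:
pi_d = AntiWindupPI(kp=0.8, ki=120.0, dt=1e-4, u_min=-400.0, u_max=400.0)
u = pi_d.update(error=5.0)
```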

Keywords: Phase Locked Loop (PLL), Voltage Source Converter (VSC), Shunt Active Power Filter (SAPF), PI, Pulse Width Modulation (PWM).

447 Preparation of Corn Flour Based Extruded Product and Evaluate Its Physical Characteristics

Authors: C. S. Saini

Abstract:

A composite flour blend consisting of corn, pearl millet, black gram and wheat bran in the ratio 80:5:10:5 was used to prepare the extruded product, and the effect of processing on the physical properties of the extrudate was studied. The extrusion process was conducted in the laboratory using a twin screw extruder. The physical characteristics evaluated include lateral expansion, bulk density, water absorption index, water solubility index, rehydration ratio and moisture retention. A Central Composite Rotatable Design (CCRD) was used to decide the levels of the processing variables, i.e. feed moisture content (%), screw speed (rpm), and barrel temperature (°C), for the experiment. The data obtained after extrusion were analyzed using response surface methodology. A second-order polynomial model for the dependent variables was established to fit the experimental data. The numerical optimization studies resulted in a barrel temperature of 127 °C, a screw speed of 246 rpm, and a feed moisture of 14.5% as the optimum variables to produce an acceptable extruded product. The responses predicted by the software for the optimum process conditions were a lateral expansion of 126%, bulk density of 0.28 g/cm3, water absorption index of 4.10 g/g, water solubility index of 39.90%, rehydration ratio of 544% and moisture retention of 11.90%, with 75% desirability.

Keywords: Black gram, corn flour, extrusion, physical characteristics.
