Search results for: single fundamental layer.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3002


362 Minimizing the Drilling-Induced Damage in Fiber Reinforced Polymeric Composites

Authors: S. D. El Wakil, M. Pladsen

Abstract:

Fiber reinforced polymeric (FRP) composites are finding widespread industrial application because of their exceptionally high specific strength and specific modulus of elasticity. Nevertheless, ready-to-use components or products made of FRP composites are seldom obtained directly. Secondary processing by machining, particularly drilling, is almost always required to make holes for fastening components together to produce assemblies. That creates problems, since FRP composites are neither homogeneous nor isotropic. The problems encountered include damage in the region around the drilled hole and drilling-induced delamination of plies, which occurs at both the entrance and the exit planes of the workpiece. Evidently, the functionality of the workpiece would be detrimentally affected. The current work was carried out with the aim of eliminating, or at least minimizing, the workpiece damage associated with drilling of FRP composites. Each test specimen was a woven-reinforced graphite fiber/epoxy composite with a thickness of 12.5 mm (0.5 inch). A large number of test specimens were subjected to drilling operations with different combinations of feed rates and cutting speeds. The drilling-induced damage was taken as the absolute value of the difference between the drilled hole diameter and the nominal one, expressed as a percentage of the nominal diameter. The latter was determined for each combination of feed rate and cutting speed, and a matrix comprising those values was established, in which the columns indicate varying feed rates and the rows varying cutting speeds. Next, analysis of variance (ANOVA) was employed, using Minitab software, to obtain the combination that would minimize the drilling-induced damage. Experimental results show that low feed rates coupled with low cutting speeds yielded the best results.
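As a rough illustration of the damage metric and the ANOVA step described in the abstract, the sketch below computes the percentage damage from hypothetical hole-diameter measurements over a feed-rate/cutting-speed matrix and runs a two-way ANOVA; the diameter values, factor levels, nominal diameter and the use of Python/statsmodels (rather than Minitab) are all illustrative assumptions.

```python
# A minimal sketch (not the authors' Minitab workflow): percentage damage
# and a two-way ANOVA over feed rate and cutting speed. All numbers are
# hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

NOMINAL_D = 10.0  # nominal hole diameter in mm (assumed)

# Hypothetical measured hole diameters for each (feed rate, cutting speed) pair.
records = [
    # feed [mm/rev], speed [m/min], measured diameter [mm]
    (0.05, 20, 10.04), (0.05, 40, 10.07), (0.05, 60, 10.11),
    (0.10, 20, 10.09), (0.10, 40, 10.13), (0.10, 60, 10.18),
    (0.15, 20, 10.15), (0.15, 40, 10.21), (0.15, 60, 10.27),
]
df = pd.DataFrame(records, columns=["feed", "speed", "measured"])

# Damage metric from the abstract: |measured - nominal| as a percentage of nominal.
df["damage_pct"] = (df["measured"] - NOMINAL_D).abs() / NOMINAL_D * 100.0

# Two-way ANOVA with feed rate and cutting speed as categorical factors.
model = ols("damage_pct ~ C(feed) + C(speed)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```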

Keywords: Drilling of Composites, dimensional accuracy of holes drilled in composites, delamination and charring, graphite-epoxy composites.

361 Partnering with Stakeholders to Secure Digitization of Water

Authors: Sindhu Govardhan, Kenneth G. Crowther

Abstract:

Modernisation of the water sector is leading to increased connectivity and integration of emerging technologies with traditional ones, creating new security risks. The convergence of Information Technology (IT) with Operational Technology (OT) results in solutions that are spread across larger geographic areas, increasingly consist of interconnected Industrial Internet of Things (IIOT) devices and software, rely on the integration of legacy with modern technologies, and use complex supply chain components, leading to complex architectures and communication paths. The result is that multiple parties collectively own and operate these emergent technologies, threat actors find new paths to exploit, and traditional cybersecurity controls are inadequate. Our approach is to explicitly identify and draw data flows that cross trust boundaries between owners and operators of various aspects of these emerging and interconnected technologies. On these data flows, we layer potential attack vectors to create a frame of reference for evaluating possible risks against connected technologies. Finally, we identify where existing controls, mitigations, and other remediations exist across industry partners (e.g., suppliers, product vendors, integrators, water utilities, and regulators). From these, we are able to understand potential gaps in security, the roles in the supply chain that are most likely to effectively remediate those security gaps, and test cases to evaluate and strengthen security across these partners. This informs a “shared responsibility” solution that recognises that security is multi-layered and requires collaboration to be successful. This shared responsibility security framework improves visibility, understanding, and control across the entire supply chain, particularly for those water utilities that are accountable for safe and continuous operations.

Keywords: Cyber security, shared responsibility, IIOT, threat modelling.

360 Exergetic Optimization on Solid Oxide Fuel Cell Systems

Authors: George N. Prodromidis, Frank A. Coutelieris

Abstract:

Biogas can currently be considered an alternative option for electricity production, mainly due to its high energy content (as a hydrocarbon-rich source), its renewable status and its relatively low utilization cost. Solid Oxide Fuel Cell (SOFC) stacks convert the fuel’s chemical energy to electricity with high efficiency and offer significant advantages in fuel flexibility combined with lower emission rates, especially when utilizing biogas. Electricity production from biogas constitutes a composite problem that calls for an extensive parametric analysis over numerous dynamic variables. The main scope of the present study is to propose a detailed thermodynamic model for optimizing the operation of SOFC-based power plants, based on fundamental thermodynamics and on energy and exergy balances. This model, named THERMAS (THERmodynamic MAthematical Simulation model), mathematically simulates each individual process during electricity production for different case studies that represent real-life operational conditions. THERMAS also offers the opportunity to choose from a wide variety of values for each operational parameter individually, thus allowing studies within unexplored and experimentally inaccessible operational ranges. Finally, THERMAS innovatively incorporates a specific criterion, derived from the extensive energy analysis, to identify the optimal scenario per simulated system in exergy terms. Several dynamic parameters as well as several biogas mixture compositions have therefore been taken into account to cover the possible cases. Through the optimization process in terms of an innovative OPF (OPtimization Factor) presented here, this study reveals that systems supplied by low-methane fuels can be comparable to those supplied by pure methane. To conclude, such a simulation model indicates a perspective on the optimal design of a SOFC-stack-based system, in the direction of the commercialization of systems utilizing biogas.

Keywords: Biogas, Exergy, Optimization, SOFC.

359 Effects of Cerium Oxide Nanoparticle Addition in Diesel and Diesel-Biodiesel Blends on the Performance Characteristics of a CI Engine

Authors: Abbas Alli Taghipoor Bafghi, Hosein Bakhoda, Fateme Khodaei Chegeni

Abstract:

An experimental investigation is carried out to establish the performance characteristics of a compression ignition engine using cerium oxide nanoparticles as an additive in neat diesel and diesel-biodiesel blends. In the first phase of the experiments, the stability of neat diesel and diesel-biodiesel fuel blends with the addition of cerium oxide nanoparticles is analyzed. After a series of experiments, it is found that subjecting the blends to high-speed blending followed by ultrasonic bath stabilization improves their stability. In the second phase, performance characteristics are studied using the stable fuel blends in a single-cylinder four-stroke engine coupled with an electrical dynamometer and a data acquisition system. The cerium oxide acts as an oxygen-donating catalyst and provides oxygen for combustion. The activation energy of cerium oxide acts to burn off carbon deposits within the engine cylinder at the wall temperature and prevents the deposition of non-polar compounds on the cylinder wall, resulting in a reduction in HC emissions. The tests revealed that cerium oxide nanoparticles can be used as an additive in diesel and diesel-biodiesel blends to significantly improve complete combustion of the fuel.

Keywords: Diesel engine, cerium oxide, diesel-biodiesel blends, nanoparticles.

358 Numerical Analysis of Flow in the Gap between a Simplified Tractor-Trailer Model and Cross Vortex Trap Device

Authors: Terrance Charles, Zhiyin Yang, Yiling Lu

Abstract:

Heavy trucks are aerodynamically inefficient due to their un-streamlined body shapes, with more than 60% of engine power being required to overcome aerodynamic drag at 60 miles/hr. Many aerodynamic drag reduction devices have been developed, and this paper presents a study of one such device, the Cross Vortex Trap Device (CVTD), deployed in the gap between the tractor and the trailer of a simplified tractor-trailer model. Numerical simulations have been carried out at a Reynolds number of 0.51×10^6, based on the inlet flow velocity and the height of the trailer, using the Reynolds-Averaged Navier-Stokes (RANS) approach. Three different configurations of the CVTD have been studied, ranging from a single slab to three slabs equally spaced on the front face of the trailer. The flow field around the three configurations of the trap device has been analysed and is presented. The results show that a maximum drag reduction of 12.25% can be achieved when a triple vortex trap device is used. Detailed flow field analysis along with pressure contours is presented to elucidate the drag reduction mechanisms of the CVTD and why the triple vortex trap configuration produces the maximum drag reduction among the three configurations tested.

Keywords: Aerodynamic drag, cross vortex trap device, truck, RANS.

357 Navigation of Multiple Mobile Robots using Rule-based-Neuro-Fuzzy Technique

Authors: Saroj Kumar Pradhan, Dayal Ramakrushna Parhi, Anup Kumar Panda

Abstract:

This paper deals with the motion planning of multiple mobile robots. Mobile robots working together to achieve several objectives have many advantages over a single-robot system. However, planning and coordination between the mobile robots is extremely difficult. In the present investigation, rule-based and rule-based neuro-fuzzy techniques are analyzed for the navigation of multiple mobile robots in an unknown or partially known environment. The final aim of the robots is to reach pre-defined goals. Based upon a reference motion direction, the distances between the robots and obstacles, and the distances between the robots and targets, different types of rules are formulated heuristically and refined later to find the steering angle. The control system combines a repelling influence related to the distance between a robot and nearby obstacles with an attracting influence between the robots and targets. A hybrid rule-based neuro-fuzzy technique is then analysed to find the steering angle of the robots. Simulation results show that the proposed rule-based neuro-fuzzy technique can improve navigation performance in complex and unknown environments compared to the simple rule-based technique.
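For readers unfamiliar with the attract/repel formulation mentioned above, the following minimal sketch computes a steering angle from a combined attraction-to-target and repulsion-from-obstacles field; the gains and safety distance are assumed values, and this is only a generic potential-field illustration, not the paper's rule-based neuro-fuzzy controller.

```python
# Minimal sketch of the attraction/repulsion idea behind the steering rules;
# the gains and distance threshold are illustrative placeholders.
import numpy as np

def steering_angle(robot, heading, target, obstacles,
                   k_att=1.0, k_rep=0.5, d_safe=2.0):
    """Return a steering angle (radians) from a combined attract/repel field."""
    robot, target = np.asarray(robot, float), np.asarray(target, float)
    force = k_att * (target - robot)              # attraction towards the target
    for obs in obstacles:
        diff = robot - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 1e-9 < d < d_safe:                     # repel only from nearby obstacles
            force += k_rep * (1.0 / d - 1.0 / d_safe) * diff / d**2
    desired = np.arctan2(force[1], force[0])      # heading of the net force
    # steering angle = difference between desired and current heading, wrapped
    return (desired - heading + np.pi) % (2 * np.pi) - np.pi

print(steering_angle([0, 0], 0.0, [5, 3], obstacles=[[2, 0.5]]))
```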

Keywords: Mobile robots, Navigation, Neuro-fuzzy, Obstacle avoidance, Rule-based, Target seeking

356 Sustainability and Promotion of Inland Waterway Transportation Projects in Colombia: Case of the Magdalena River

Authors: David Julian Bernal Melgarejo

Abstract:

Inland Waterway Transportation (IWT) plays an important role in national transport systems: water transportation is considered a safe, energy-efficient and environmentally friendly mode of transport. These benefits of IWT are raising national awareness; for instance, the Colombian government is planning to restore the navigability of the most important river of the country, the Magdalena River. Embracing waterway transportation in Colombia could strengthen competitiveness while reducing most of the transport externalities. However, the current situation of the Magdalena is deplorable: the most important river of Colombia has been abandoned for decades, and the solution is beyond the reach of any single administrative entity. This paper analyzes the outcomes of the Navigation And Inland Waterway Action and Development in Europe (NAIADES) programme as a reference for developing a similar programme in Colombia with similar objectives and guidelines, considering sustainability and guaranteeing the long-term results and adaptability of the programme. After identifying stakeholders and policy experts, a set of individual interviews was carried out; the findings support the idea that a lack of integration within governmental institutions and a lack of attention to marketing promotion are possible drawbacks in the implementation of IWT projects.

Keywords: Inland waterway transportation, Logistics, Sustainability, Multimodal transport systems, Water transportation.

355 Development and Analysis of a Machine to Equally Apply Mineral Fertilizer to Soil on Slopes

Authors: Qurbanov Huseyn Nuraddin

Abstract:

Reliable food supply for the population of a country is one of the main directions of the state's economic policy. Grain growing, which is the basis of agriculture, is important in this area. In the cultivation of cereals on slopes, the application of an equal amount of mineral fertilizer beneath the soil before sowing is a very important technological process. The low level of technical equipment in this area prevents producers from providing the country with the necessary quality of cereals. Experience in the operation of modern machinery has shown that there is currently a need to apply an equal amount of fertilizer beneath the soil on slopes while fully meeting the agro-technical requirements. No fundamental changes have been made to the industrial machines that place fertilizer beneath the soil, and fertilizer continues to be applied unevenly beneath the soil on slopes. This leads to the destruction of new seedlings and to reduced productivity, because plants sown in the autumn cannot tolerate frost during the winter. In specific climatic conditions, there is an optimal fertilization rate for each agricultural product. The placement of fertilizer in the soil is one of the conditions that increase its efficiency in the field. Consequently, the development of a new technical proposal for fertilizing and ploughing slopes with an equal application rate, with improved technological and design parameters that take into account the physical and mechanical properties of fertilizers, is very important. Taking the above issues into account, a combined plough was developed in our laboratory. The combined plough carries out the pre-sowing technological operation in the cultivation of cereals, providing a smooth, equal application of mineral fertilizer beneath the soil on slopes. Mathematical models of a smooth spreader that distributes fertilizer evenly in the field have been developed. Diagrams and graphs of the distribution over the eight partitions of the smooth spreader were thus constructed for different inclination angles of the slopes. The percentage and productivity of equal distribution in the field were determined by practical and theoretical analysis.

Keywords: Combined plough, mineral fertilizer, equal sowing, fertilizer norm, grain-crops, sowing fertilizer.

354 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis

Authors: Young-Seok Choi

Abstract:

This work proposes a data-driven, multiscale quantitative measure to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than the conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings from a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of the Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlation with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric as a prognostic tool.
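A minimal sketch of the multiscale Tsallis entropy computation is given below: one entropy value per intrinsic mode function (IMF). The entropic index q, the histogram binning, and the synthetic stand-in IMFs are assumptions; in the actual method the IMFs would come from an EMD implementation applied to the EEG.

```python
# Sketch of a multiscale Tsallis entropy: one entropy value per IMF.
# The IMFs are assumed to come from an EMD implementation and are passed
# in here as plain arrays; the toy signals below only stand in for them.
import numpy as np

def tsallis_entropy(x, q=2.0, bins=64):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1) of the amplitude histogram."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    if np.isclose(q, 1.0):                      # q -> 1 recovers Shannon entropy
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def multiscale_tsallis(imfs, q=2.0, bins=64):
    """One Tsallis entropy per oscillatory scale (per IMF)."""
    return [tsallis_entropy(imf, q, bins) for imf in imfs]

# Toy example with two synthetic 'IMFs' standing in for EMD output.
t = np.linspace(0, 1, 1000)
imfs = [np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 5 * t)]
print(multiscale_tsallis(imfs, q=2.0))
```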

Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.

353 Finding Pareto Optimal Front for the Multi-Mode Time, Cost Quality Trade-off in Project Scheduling

Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo

Abstract:

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is to find the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be executed in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to solve multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
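The non-dominated (Pareto) filtering step that underlies such a front can be sketched as below for (time, cost, quality) triples, with time and cost minimised and quality maximised; the candidate schedules are hypothetical and FastPGA itself is not reproduced.

```python
# Sketch of extracting the Pareto optimal front over (time, cost, quality):
# time and cost are minimised, quality is maximised (so it is negated).
import numpy as np

def pareto_front(points):
    """points: rows of objectives, all to be minimised.
    Returns a boolean mask of non-dominated rows."""
    pts = np.asarray(points, float)
    keep = np.ones(len(pts), dtype=bool)
    for i in range(len(pts)):
        # j dominates i if j is <= in every objective and < in at least one
        dominated = np.all(pts <= pts[i], axis=1) & np.any(pts < pts[i], axis=1)
        if dominated.any():
            keep[i] = False
    return keep

# Hypothetical candidate schedules: (time, cost, quality).
cands = np.array([
    [120,  900, 0.80],
    [100, 1100, 0.75],
    [130,  850, 0.70],
    [110,  950, 0.85],
    [125, 1000, 0.78],   # dominated by the first schedule
])
mask = pareto_front(np.column_stack([cands[:, 0], cands[:, 1], -cands[:, 2]]))
print(cands[mask])       # the non-dominated time/cost/quality trade-offs
```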

Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.

352 The Effect of Forest Fires on Physical Properties and Magnetic Susceptibility of Semi-Arid Soils in North-Eastern, Libya

Authors: G. S. Eldiabani, W. H. G. Hale, C. P. Heron

Abstract:

Forest areas are particularly susceptible to fires, which are often man-made. One of the most fire-affected forest regions in the world is the Mediterranean. Libya, in the Mediterranean region, has soils that are considered arid except in a small area called Aljabal Alakhdar (Green Mountain), which is the geographic area covered by this study. Like other forests in the Mediterranean, it has suffered extreme degradation, mainly due to people removing firewood, sometimes converting forested areas to agricultural use, and fires, which may alter several soil chemical and physical properties. The purpose of this study was to evaluate the effects of fires on the physical properties of the soil of the Aljabal Alakhdar forest in the north-east of Libya. The physical properties of soil following fire were determined in two geographic areas, one coastal and one mountain site, with soils subjected to fire compared to those in adjacent unburned areas. The physical properties studied were soil particle size (soil texture), soil water content, soil porosity and soil particle density. For the first time in Libyan soils, the effect of burning on the magnetic susceptibility of the soils was also tested. The results showed that the soils at both study sites, irrespective of burning or depth, fell into the silt loam texture class, had low water content, homogeneous porosity through the soil profiles, and relatively high particle density values, with a much greater soil magnetic susceptibility in the top layer at both sites. Except for soil water content and magnetic susceptibility, fire has not had a clear effect on the soils’ physical properties.

Keywords: Aljabal Alakhdar, the coastal site, the mountain site, fire effect, soil particle size, soil water content, soil porosity, soil particle density, soil magnetic susceptibility.

351 Development of a Support Tool for Cost and Schedule Integration Management at Program Level

Authors: H. J. Yang, R. Z. Jin, I. J. Park, C. T. Hyun

Abstract:

There has been gradual progress of late in construction projects, particularly in large-scale megaprojects. Because of the long construction period, the large-scale budget investment, the lack of construction management technologies, and the increase in incomplete elements of project schedule management, however, a plan for conducting efficient operations and ensuring business safety is required. In particular, as the project management information system (PMIS) is meant for managing a single project, centering on the construction phase, it is limited in managing program-scale businesses like megaprojects. Thus, a program management information system (PgMIS) that includes program-level management technologies is needed to manage multiple projects. In this study, a support tool was developed for managing the cost and schedule information arising in the construction phase at the program level. In addition, a case study of the developed support tool was conducted to verify the usability of the system. With the developed support tool, construction managers can monitor the progress of the entire project and of the individual subprojects in real time.

Keywords: Cost-schedule integration management, support tool, UI, WBS, CBS, PgMIS (Program Management Information System), PMIS (Project Management Information System)

350 Ensemble Approach for Predicting Student's Academic Performance

Authors: L. A. Muhammad, M. S. Argungu

Abstract:

Educational data mining (EDM) has received substantial attention. Data mining techniques have been proposed, in one way or another, to uncover hidden knowledge in educational data. The results of such studies assist academic institutions in further enhancing their learning processes and their methods of passing knowledge to students. Consequently, student performance is boosted and educational products are no doubt enhanced. This study adopted a student performance prediction model premised on data mining techniques with Students' Essential Features (SEF). SEF are linked to the learner's interactivity with the e-learning management system. The performance of the predictive model is assessed with a set of classifiers, viz. Bayes Network, Logistic Regression, and Reduced Error Pruning (REP) Tree. Ensemble methods of Bagging, Boosting, and Random Forest (RF) are then applied to improve the performance of these single classifiers. The study reveals a strong affinity between learners' behaviors and their academic attainment. The results show that the REP Tree and its ensembles record the highest accuracy of 83.33% using SEF. In terms of the Receiver Operating Characteristic (ROC) curve, the boosting ensemble of the REP Tree records 0.903, which is the best. This result further demonstrates the dependability of the proposed model.
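A hedged sketch of the single-classifier versus ensemble comparison is shown below using scikit-learn; a CART decision tree stands in for Weka's REP Tree, and the synthetic features are placeholders for the SEF, so the numbers it prints are not the paper's results.

```python
# Sketch of comparing a single tree with bagging, boosting and random forest.
# Synthetic data stands in for the Students' Essential Features (SEF).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=0)

base = DecisionTreeClassifier(max_depth=5, random_state=0)   # stand-in for Weka's REP Tree
models = {
    "single tree": base,
    # note: scikit-learn versions before 1.2 use `base_estimator=` instead of `estimator=`
    "bagging": BaggingClassifier(estimator=base, n_estimators=50, random_state=0),
    "boosting": AdaBoostClassifier(estimator=base, n_estimators=50, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name:13s} accuracy = {acc:.3f}")
```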

Keywords: Ensemble, bagging, Random Forest, boosting, data mining, classifiers, machine learning.

349 Optimizing of Fuzzy C-Means Clustering Algorithm Using GA

Authors: Mohanad Alata, Mohammad Molhim, Abdullah Ramini

Abstract:

The fuzzy c-means clustering algorithm (FCM) is a method frequently used in pattern recognition. It has the advantage of giving good modeling results in many cases, although it is not capable of specifying the number of clusters by itself. In the FCM algorithm, most researchers fix the weighting exponent (m) at the conventional value of 2, which might not be appropriate for all applications. Consequently, the main objective of this paper is to use the subtractive clustering algorithm to provide the optimal number of clusters needed by the FCM algorithm, by optimizing the parameters of the subtractive clustering algorithm with an iterative search approach, and then to find an optimal weighting exponent (m) for the FCM algorithm. To obtain an optimal number of clusters, the iterative search approach is used to find the optimal single-output Sugeno-type Fuzzy Inference System (FIS) model by optimizing the parameters of the subtractive clustering algorithm so as to minimize the least-squares error between the actual data and the Sugeno fuzzy model. Once the number of clusters is optimized, two approaches are proposed to optimize the weighting exponent (m) in the FCM algorithm, namely the iterative search approach and genetic algorithms. The above-mentioned approach is tested on data generated from the original function, and optimal fuzzy models are obtained with minimum error between the real data and the obtained fuzzy models.
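For reference, the core FCM update equations that the weighting exponent m enters are sketched below in NumPy; the cluster count, m, and the toy data are placeholders, and the subtractive-clustering and genetic-algorithm stages of the paper are not reproduced.

```python
# Minimal sketch of the fuzzy c-means updates (memberships and centres),
# showing where the weighting exponent m appears.
import numpy as np

def fcm(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships of each point sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]          # weighted cluster centres
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy data: three well-separated 2-D blobs.
X = np.vstack([np.random.default_rng(1).normal(loc, 0.3, (50, 2)) for loc in (0, 3, 6)])
centers, U = fcm(X, c=3, m=2.0)
print(centers)
```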

Keywords: Fuzzy clustering, Fuzzy C-Means, Genetic Algorithm, Sugeno fuzzy systems.

348 Kinetic Theory Based CFD Modeling of Particulate Flows in Horizontal Pipes

Authors: Pandaba Patro, Brundaban Patro

Abstract:

The numerical simulation of fully developed gas–solid flow in a horizontal pipe is carried out using the Eulerian-Eulerian approach, also known as two-fluid modeling, as both phases are treated as interpenetrating continua. The solid-phase stresses are modeled using the kinetic theory of granular flow (KTGF). The computed results for velocity profiles and pressure drop are compared with experimental data. We observe that the convection and diffusion terms in the granular temperature equation cannot be neglected in gas-solid flow simulation along a horizontal pipe. Particle-wall collisions and lift also play an important role in Eulerian modeling. We also investigated the effect of flow parameters such as gas velocity, particle properties and particle loading on the pressure drop prediction for different pipe diameters. The pressure drop increases with gas velocity and particle loading. Gas velocity has the same effect (proportional to U^2) on the pressure drop prediction as in single-phase flow. With respect to particle diameter, the pressure drop first increases, reaches a peak and then decreases. The peak is a strong function of the pipe bore.

Keywords: CFD, Eulerian modeling, gas solid flow, KTGF.

347 Tailormade Geometric Properties of Chitosan by Gamma Irradiation

Authors: F. Elashhab, L. Sheha, R. Fawzi Elsupikhe, A. E. A. Youssef, R. M. Sheltami, T. Alfazani

Abstract:

Chitosans (CSs) in solution are increasingly used, across a range of geometric properties, in various academic and industrial sectors, especially in the domain of pharmaceutical and biomedical engineering. In order to provide a tailoring guide for CSs to users, gamma (γ)-irradiation technology and simple viscosity measurements have been used in this study. Accordingly, CS solid discs (0.5 cm thickness and 2.5 cm diameter) were exposed in air to Cobalt-60 γ-radiation, at room temperature and a constant dose of 50 kGy, for different exposure times (tγ). Dilute solutions of native and irradiated CS were then prepared by dissolving 1.25 mg cm^-3 of each polymer in 0.1 M NaCl/0.2 M CH3COOH. Single-concentration relative viscosity (ηr) measurements were employed to obtain the intrinsic viscosity ([η]) values and interrelated parameters: the molar mass (Mη), hydrodynamic radius (RH,η), radius of gyration (RG,η), and second virial coefficient (A2,η) of the CSs in solution. The results show an exponential decrease of ηr, [η], Mη, RH,η and RG,η with increasing tγ. This suggests random chain scission of the CS glycosidic bonds, with rate constant kr ≈ 0.017 min^-1 and corresponding lifetime τr = kr^-1 ≈ 57.14 min. The results also show an exponential decrease of A2,η with increasing tγ, which can be attributed to the growth of the excluded volume effect in CS segments with tγ and, hence, better solution quality. The results are summarized in the following scaling laws as a tailoring guide: RH,η = 6.98 × 10^-3 Mr^0.65; RG,η = 7.09 × 10^-4 Mr^0.83; A2,η = 121.03 Mη,r^-0.19.
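The reported scaling laws can be applied directly, as in the short sketch below, to estimate R_H, R_G and A_2 from a (relative) molar mass; the example molar masses are arbitrary placeholders and the units follow the original correlations.

```python
# Sketch applying the reported scaling laws to estimate geometric properties
# from a relative molar mass M_r; example values are hypothetical.
def chitosan_geometry(M_r):
    R_H = 6.98e-3 * M_r ** 0.65    # hydrodynamic radius
    R_G = 7.09e-4 * M_r ** 0.83    # radius of gyration
    A_2 = 121.03 * M_r ** -0.19    # second virial coefficient
    return R_H, R_G, A_2

for M_r in (5e4, 1e5, 2e5):        # hypothetical molar masses
    R_H, R_G, A_2 = chitosan_geometry(M_r)
    print(f"M_r={M_r:.0e}: R_H={R_H:.2f}, R_G={R_G:.2f}, A_2={A_2:.3f}")
```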

Keywords: Gamma irradiation, geometric properties, kinetic model, scaling laws, viscosity measurement.

346 Finding Pareto Optimal Front for the Multi-Mode Time, Cost Quality Trade-off in Project Scheduling

Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo

Abstract:

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is to find the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be executed in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to solve multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.

Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.

345 A Robust Approach to the Load Frequency Control Problem with Speed Regulation Uncertainty

Authors: S. Z. Sayed Hassen

Abstract:

The load frequency control problem of power systems has attracted a lot of attention from engineers and researchers over the years. Increasing and quickly changing load demand, coupled with the inclusion on the network of more generators with high variability (solar and wind power generators), is making power systems more difficult to regulate. Frequency changes are unavoidable, but regulatory authorities require that these changes remain within a certain bound. Engineers are required to perform the tricky task of adjusting the control system to maintain the frequency within the tolerated bounds. It is well known that to minimize frequency variations, a large proportional feedback gain (speed regulation constant) is desirable. However, this improvement in performance using proportional feedback comes at the expense of a reduced stability margin and also allows some steady-state error. A conventional PI controller is then included as a secondary control loop to drive the steady-state error to zero. In this paper, we propose a robust controller to replace the conventional PI controller, which guarantees performance and stability of the power system over the range of variation of the speed regulation constant. Simulation results are shown to validate the superiority of the proposed approach on a simple single-area power system model.
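As context for the single-area model mentioned above, the hedged sketch below wires up a governor, turbine, rotating mass, droop feedback and a conventional secondary PI loop using the python-control package; all time constants, the droop value and the PI gains are assumed, and this is not the paper's minimax LQG design.

```python
# Sketch of a conventional single-area load-frequency-control loop; all
# parameter values are illustrative assumptions.
import control as ct
import numpy as np

Tg, Tt = 0.08, 0.3            # governor and turbine time constants [s] (assumed)
H, D = 5.0, 0.8               # inertia constant and load damping [pu] (assumed)
R = 0.05                      # speed regulation (droop) constant [pu] (assumed)
Kp, Ki = 0.5, 0.3             # secondary PI gains (assumed)

governor = ct.tf([1], [Tg, 1])
turbine = ct.tf([1], [Tt, 1])
rotating_mass = ct.tf([1], [2 * H, D])          # frequency response to power imbalance
droop = ct.tf([1], [R])                         # primary frequency feedback 1/R
pi = ct.tf([Kp, Ki], [1, 0])                    # secondary PI controller Kp + Ki/s

# Closed-loop transfer function from a step load disturbance to frequency deviation.
load_to_freq = ct.feedback(rotating_mass, governor * turbine * (droop + pi))
t, freq_dev = ct.step_response(-load_to_freq, np.linspace(0, 30, 600))
print(f"peak deviation {freq_dev.min():.4f} pu, final deviation {freq_dev[-1]:.5f} pu")
```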

Keywords: Robust control, power system, integral action, minimax LQG control.

344 UPFC Supplementary Controller Design Using Real-Coded Genetic Algorithm for Damping Low Frequency Oscillations in Power Systems

Authors: A.K. Baliarsingh, S. Panda, A.K. Mohanty, C. Ardil

Abstract:

This paper presents a systematic approach for designing Unified Power Flow Controller (UPFC) based supplementary damping controllers for damping low frequency oscillations in a single-machine infinite-bus power system. Detailed investigations have been carried out considering the four alternative UPFC-based damping controllers, namely the modulating index of the series inverter (mB), the modulating index of the shunt inverter (mE), the phase angle of the series inverter (δB) and the phase angle of the shunt inverter (δE). The design problem of the proposed controllers is formulated as an optimization problem, and a Real-Coded Genetic Algorithm (RCGA) is employed to optimize the damping controller parameters. Simulation results are presented and compared with a conventional method of tuning the damping controller parameters to show the effectiveness and robustness of the proposed design approach.

Keywords: Power System Oscillations, Real-Coded Genetic Algorithm (RCGA), Flexible AC Transmission Systems (FACTS), Unified Power Flow Controller (UPFC), Damping Controller.

343 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium- to longer-term liabilities. The relative strengths and weaknesses among the various alternative approaches revolve around: stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good trade-off between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produce more stable loss forecasts for reserving purposes as compared to the traditional CL and BF methods.
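The parametric idea can be sketched as fitting a logistic (sigmoidal) curve to one cohort's cumulative loss development and reading off the ultimate loss, as below; the development data are synthetic placeholders rather than MI claims experience, and the logistic-regression machinery of the paper is not reproduced.

```python
# Sketch: fit a sigmoidal development curve to one cohort's cumulative
# reported losses and derive an ultimate-loss and reserve estimate.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, ultimate, k, t0):
    """Cumulative losses approaching 'ultimate' along a sigmoidal pattern."""
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

dev_quarters = np.arange(1, 13)                                # development age
reported = np.array([5, 12, 25, 44, 63, 78, 88, 94, 97, 99, 100, 100], float)

params, _ = curve_fit(logistic, dev_quarters, reported, p0=[100.0, 0.8, 5.0])
ultimate, k, t0 = params
print(f"fitted ultimate loss ~ {ultimate:.1f}, growth rate {k:.2f}, midpoint {t0:.1f}")
print("indicated reserve = ultimate - reported to date =", round(ultimate - reported[-1], 1))
```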

Keywords: Actuarial loss reserving techniques, logistic regression, parametric function, volatility.

342 Load Discontinuity in Shock Response and Its Remedies

Authors: Shuenn-Yih Chang, Chiu-Li Huang

Abstract:

It has been shown that a load discontinuity at the end of an impulse will result in an extra impulse, and hence an extra amplitude distortion, if a step-by-step integration method is employed to yield the shock response. In order to overcome this difficulty, three remedies are proposed to reduce the extra amplitude distortion. The first remedy is to solve the momentum equation of motion instead of the force equation of motion in the step-by-step solution of the shock response, where an external momentum is used in the solution of the momentum equation of motion. Since the external momentum is the resultant of the time integration of the external force, the problem of load discontinuity automatically disappears. The second remedy is to perform a single small time step immediately upon termination of the applied impulse, while the other time steps can still be conducted with the step size determined from general considerations. This is because the extra impulse caused by a load discontinuity at the end of an impulse is almost linearly proportional to the step size. Finally, the third remedy is to use the average of the two different values at the integration point of the load discontinuity, instead of one of them, as the loading input. The basic motivation of this remedy originates from the concept of no loading input error associated with the integration point of the load discontinuity. The feasibility of the three remedies is analytically explained and numerically illustrated.

Keywords: Dynamic analysis, load discontinuity, shock response, step-by-step integration

341 A Mathematical Representation for Mechanical Model Assessment: Numerical Model Qualification Method

Authors: Keny Ordaz-Hernandez, Xavier Fischer, Fouad Bennis

Abstract:

This article illustrates a model selection management approach for virtual prototypes in interactive simulations. In those numerical simulations, the virtual prototype and its environment are modelled as a multiagent system, where every entity (prototype, human, etc.) is modelled as an agent. In particular, virtual prototyping agents that provide mathematical models of mechanical behaviour in the form of computational methods are considered. This work argues that the selection of an appropriate model in a changing environment, supported by the models' characteristics, can be managed through the a priori determination of specific exploitation and performance measures of virtual prototype models. As different models exist to represent a single phenomenon, it is not always possible to select the best one under all possible circumstances of the environment. Instead, the most appropriate one shall be selected according to the use case. The proposed approach consists in identifying relevant metrics or indicators for each group of models (e.g. entity models, global model), formulating their qualification, analysing the performance, and applying the qualification criteria. A model can then be selected based on the performance prediction obtained from its qualification. The authors hope that this approach will not only inform engineers and researchers about another approach for selecting virtual prototype models, but also assist virtual prototype engineers in systematic or automatic model selection.

Keywords: Virtual prototype models, domain, qualification criterion, model qualification, model assessment, environmental modelling.

340 A Renovated Cook's Distance Based On The Buckley-James Estimate In Censored Regression

Authors: Nazrina Aziz, Dong Q. Wang

Abstract:

Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for the Buckley-James method thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's Distance (RD*_i) and has been developed based on Cook's idea. The renovated Cook's Distance (RD*_i) has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the estimate of the coefficients when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a diagnostic measure such as RD*_i, since information from the p variables can be considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
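For orientation, the classical (uncensored) Cook's distance that RD*_i generalises can be computed as in the sketch below using statsmodels; the data are synthetic and the censored Buckley-James machinery of the paper is not reproduced.

```python
# Sketch of the classical Cook's distance on an ordinary least-squares fit,
# with one planted outlier; synthetic data, not the paper's censored setting.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=40)
y = 2.0 * x + rng.normal(scale=0.5, size=40)
y[5] += 6.0                                    # plant one influential outlier

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()
cooks_d, _ = fit.get_influence().cooks_distance
print("most influential case:", int(np.argmax(cooks_d)), "D =", round(cooks_d.max(), 3))
```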

Keywords: Buckley-James estimators, censored regression, censored data, diagnostic analysis, product-limit estimator, renovated Cook's Distance.

339 Agreement Options in Multi-person Decision on Optimizing High-Rise Building Columns

Authors: Christiono Utomo, Arazi Idrus, Madzlan Napiah, Mohd. Faris Khamidi

Abstract:

This paper presents a conceptual model of agreement options for negotiation support in multi-person decisions on optimizing high-rise building columns. The decision is complicated, since many parties are involved in choosing a single alternative from a set of solutions. There are different concerns caused by differing preferences, experiences, and backgrounds. The building columns serving as alternatives are referred to as agreement options, which are determined by identifying the possible decision-maker groups, followed by determining the optimal solution for each group. The group in this paper is based on the preferences of three decision makers: the designer, the programmer, and the construction manager. Decision techniques are applied to determine the relative value of the alternative solutions for performing the function. The Analytical Hierarchy Process (AHP) was applied for the decision process and a game-theory-based agent system for coalition formation. An n-person cooperative game is represented by the set of all players. The proposed coalition formation model enables each agent to individually select its allies or coalition. It further emphasizes the importance of performance evaluation in the design process and of value-based decisions.
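The AHP step can be illustrated as below: priority weights are obtained as the principal eigenvector of a pairwise comparison matrix, together with a consistency ratio; the pairwise judgements among the three column alternatives are hypothetical.

```python
# Sketch of the AHP priority-weight computation for one decision maker.
import numpy as np

def ahp_weights(A):
    """Principal-eigenvector priority weights and consistency ratio of A."""
    A = np.asarray(A, float)
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)                           # principal eigenvalue
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                                       # normalised priority weights
    n = len(A)
    ci = (vals.real[i] - n) / (n - 1)                  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)       # Saaty's random index
    return w, ci / ri

# Hypothetical pairwise judgements among three column design alternatives.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print("priority weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```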

Keywords: Agreement options, coalition, group choice, game theory, building columns selection.

338 Medical Image Watermark and Tamper Detection Using Constant Correlation Spread Spectrum Watermarking

Authors: Peter U. Eze, P. Udaya, Robin J. Evans

Abstract:

Data hiding can be achieved by Steganography or invisible digital watermarking. For digital watermarking, both accurate retrieval of the embedded watermark and the integrity of the cover image are important. Medical image security in Teleradiology is one of the applications where the embedded patient record needs to be extracted with accuracy as well as the medical image integrity verified. In this research paper, the Constant Correlation Spread Spectrum digital watermarking for medical image tamper detection and accurate embedded watermark retrieval is introduced. In the proposed method, a watermark bit from a patient record is spread in a medical image sub-block such that the correlation of all watermarked sub-blocks with a spreading code, W, would have a constant value, p. The constant correlation p, spreading code, W and the size of the sub-blocks constitute the secret key. Tamper detection is achieved by flagging any sub-block whose correlation value deviates by more than a small value, ℇ, from p. The major features of our new scheme include: (1) Improving watermark detection accuracy for high-pixel depth medical images by reducing the Bit Error Rate (BER) to Zero and (2) block-level tamper detection in a single computational process with simultaneous watermark detection, thereby increasing utility with the same computational cost.
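A minimal sketch of the constant-correlation idea is given below: each sub-block is adjusted so that its correlation with the spreading code W equals +p or -p depending on the embedded bit, and at detection a block whose |correlation| drifts from p by more than ε is flagged as tampered. The block size, p and ε are placeholders and the scheme is simplified relative to the paper.

```python
# Minimal sketch of constant-correlation spread spectrum embedding/detection.
import numpy as np

rng = np.random.default_rng(0)
N, p, eps = 64, 4.0, 0.5
W = rng.choice([-1.0, 1.0], size=N)                   # secret spreading code

def embed(block, bit):
    target = p if bit else -p
    c = float(block @ W) / N                          # current correlation
    return block + (target - c) * W                   # <W, W>/N == 1, so corr -> target

def detect(block):
    c = float(block @ W) / N
    bit = int(c > 0)
    tampered = abs(abs(c) - p) > eps                  # deviation from the constant p
    return bit, tampered

cover = rng.normal(128, 20, size=N)                   # one sub-block of pixel values
wm = embed(cover, bit=1)
print(detect(wm))                                     # -> (1, False)
print(detect(wm * 0.5))                               # intensity scaling -> flagged tampered
```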

Keywords: Constant correlation, medical image, spread spectrum, tamper detection, watermarking.

337 Optimization of Kinematics for Birds and UAVs Using Evolutionary Algorithms

Authors: Mohamed Hamdaoui, Jean-Baptiste Mouret, Stephane Doncieux, Pierre Sagaut

Abstract:

The aim of this work is to present a multi-objective optimization method to find maximum-efficiency kinematics for a flapping-wing unmanned aerial vehicle. We restrict our study to rectangular wings with the same profile along the span and to harmonic dihedral motion. It is assumed that the bird-like aerial vehicle (whose span and surface area were fixed at 1 m and 0.15 m^2, respectively) is in horizontal, mechanically balanced motion at fixed speed. We used two flight physics models to describe the vehicle's aerodynamic performance, namely DeLaurier's model, which has been used in many studies dealing with flapping wings, and the model proposed by Dae-Kwan et al. Then, a constrained multi-objective optimization of the propulsive efficiency is performed using a recent evolutionary multi-objective algorithm called ε-MOEA. Firstly, we show that feasible solutions (i.e. solutions that fulfil the imposed constraints) can be obtained using Dae-Kwan et al.'s model. Secondly, we highlight that a single-objective optimization approach (the weighted sum method, for example) can also give optimal solutions as good as the multi-objective one, which nevertheless offers the advantage of directly generating the set of best trade-offs. Finally, we show that DeLaurier's model does not yield feasible solutions.

Keywords: Flight physics, evolutionary algorithm, optimization, Pareto surface.

336 Network Reconfiguration of Distribution System Using Artificial Bee Colony Algorithm

Authors: S. Ganesh

Abstract:

Power distribution systems typically have tie and sectionalizing switches whose states determine the topological configuration of the network. The aim of network reconfiguration of the distribution network is to minimize the losses for a given load arrangement at a particular time. Thus, the objective is to minimize the losses of the network while satisfying the distribution network constraints. The constraints are radiality, voltage limits and the power balance condition. In this paper, the status of the switches is obtained by using the Artificial Bee Colony (ABC) algorithm. ABC is based on a particular intelligent behavior of honeybee swarms; it was developed by observing how real bees find nectar and share information about food sources with the bees in the hive. The proposed methodology has three stages. In stage one, ABC is used to find the tie switches; in stage two, the identified tie switches are checked against the radiality constraint, and if the radiality constraint is satisfied the procedure proceeds to stage three, otherwise the process is repeated. In stage three, load flow analysis is performed. The process is repeated until the losses are minimized. ABC is implemented to find the power flow path, and the Forward Sweeper algorithm is used to calculate the power flow parameters. The proposed methodology is applied to a 33-bus single-feeder distribution network using MATLAB.

Keywords: Artificial Bee Colony (ABC) algorithm, Distribution system, Loss reduction, Network reconfiguration.

335 Contaminant Transport in Soil from a Point Source

Authors: S. A. Nta, M. J. Ayotamuno, A. H. Igoni, R. N. Okparanma

Abstract:

This work sought to understand the pattern of movement of a contaminant from a continuous point source through soil. The soil used was sandy loam in texture. The contaminant used was municipal solid waste landfill leachate, introduced as a point source through an entry point located at the center of the top layer of the soil tank. Analyses were conducted after maturity periods of 50 and 80 days. The maximum change in chemical concentration was observed in soil samples at a radial distance of 0.25 m. A finite element approximation based model was used to assess future prediction, management and remediation in the polluted area. The actual field data collected for the case study were used to calibrate the model and thus simulate the flow pattern of the pollutants through the soil. MATLAB R2015a was used to visualize the flow of the pollutant through the soil. The dispersion coefficient at 0.25 and 0.50 m radial distance from the point of application of the leachate is a measure of the spreading of the flowing leachate due to the nature of the soil medium, with its interconnected channels distributed at random in all directions. Surface plots of metals in the soil after a maturity period of 80 days show a functional relationship between a designated dependent variable (Y) and two independent variables (X and Z). Comparison of measured and predicted transport profiles along the depth after 50 and 80 days of leachate application and at the end of the experiment shows that there was little difference between the predicted and measured concentrations, as they all lay close to each other. For the analysis of contaminant transport, the finite difference approximation based model was very effective in assessing future prediction, management and remediation in the polluted area. The experiment gave insight into the most likely pattern of movement of the contaminant as a result of continuous percolation of the leachate through the soil. This is important for predicting contaminant movement and for the subsequent remediation of such soils.
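As an illustration of the kind of scheme referred to above, the sketch below solves the one-dimensional advection-dispersion equation with an explicit finite-difference method and a constant-concentration source at one end; the grid, seepage velocity and dispersion coefficient are assumed values, not the calibrated field parameters.

```python
# Hedged sketch of an explicit finite-difference solution of
# dC/dt = D d2C/dx2 - v dC/dx with a constant source at x = 0.
import numpy as np

L, nx = 1.0, 101                      # domain length [m] and grid points
dx = L / (nx - 1)
D, v = 1e-4, 5e-4                     # dispersion [m^2/day] and seepage velocity [m/day] (assumed)
dt = 0.4 * dx**2 / D                  # keep the explicit scheme stable (dt <= dx^2 / 2D)
C = np.zeros(nx)
C[0] = 1.0                            # continuous point source (normalised concentration)

t_end, t = 80.0, 0.0                  # simulate an 80-day maturity period
while t < t_end:
    adv = -v * (C[1:-1] - C[:-2]) / dx                      # upwind advection
    dif = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2        # central dispersion
    C[1:-1] += dt * (adv + dif)
    C[0], C[-1] = 1.0, C[-2]          # fixed source, zero-gradient outflow boundary
    t += dt

print("concentration at 0.25 m and 0.50 m:", round(C[25], 3), round(C[50], 3))
```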

Keywords: Contaminant, dispersion, point or leaky source, surface plot, soil.

334 Efficient Program Slicing Algorithms for Measuring Functional Cohesion and Parallelism

Authors: Jehad Al Dallal

Abstract:

Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. In this paper, algorithms are introduced to compute all backward and forward static slices of a computer program by traversing the program representation graph once. The program representation graph used in this paper is the Program Dependence Graph (PDG). We have conducted an experimental comparison study using 25 software modules to show the effectiveness of the introduced algorithm for computing all backward static slices over single-point slicing approaches in computing the parallelism and functional cohesion of program modules. The effectiveness of the algorithm is measured in terms of execution time and the number of traversed PDG edges. The comparison study results indicate that using the introduced algorithm considerably reduces the slicing time and effort required to measure module parallelism and functional cohesion.
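The backward-slicing step can be sketched as reverse reachability over a dependence graph, as below; the tiny PDG is hypothetical, and the paper's single-traversal computation of all slices is not reproduced.

```python
# Sketch of one backward static slice as reverse reachability over a PDG.
from collections import deque

def backward_slice(pdg, criterion):
    """pdg maps each node to the nodes it depends on (reverse dependence edges)."""
    seen, work = {criterion}, deque([criterion])
    while work:
        node = work.popleft()
        for dep in pdg.get(node, ()):          # statements that influence `node`
            if dep not in seen:
                seen.add(dep)
                work.append(dep)
    return seen

# Hypothetical example: node -> statements it is data/control dependent on.
pdg = {
    "s4: print(z)": ["s3: z = x + y"],
    "s3: z = x + y": ["s1: x = input()", "s2: y = 2"],
    "s5: w = 7": [],
}
print(sorted(backward_slice(pdg, "s4: print(z)")))   # s5 is outside the slice
```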

Keywords: Backward slicing, cohesion measure, forward slicing, parallelism measure, program dependence graph, program slicing, static slicing.

333 Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project

Authors: V. Balanica, W. I. D. Rae, M. Caramihai, S. Acho, C. P. Herbst

Abstract:

Mammographic image and data analysis to facilitate modelling or computer-aided diagnosis (CAD) software development is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced extraction of research information could be performed from the single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the possible future application of the data in CAD processing. Future developments envisaged include the creation of an advanced search function that selects image files based on descriptor combinations. The results can be further used for specific CAD processing and other research. A user-friendly configuration utility for importing the required fields from the DICOM files must also be designed.
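A hedged sketch of pulling DICOM header fields into a relational table, so that image files and patient data can be queried together, is given below; the file paths, the chosen columns and the use of pydicom/SQLite are illustrative assumptions rather than the system's actual design.

```python
# Sketch: index a few DICOM header fields into a small SQLite table.
import sqlite3
import pydicom

def index_dicom(paths, db_path="mammo_research.db"):
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS images
                   (path TEXT PRIMARY KEY, patient_id TEXT, study_date TEXT,
                    modality TEXT, laterality TEXT, rows INTEGER, cols INTEGER)""")
    for path in paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)   # header only, no pixel data
        con.execute("INSERT OR REPLACE INTO images VALUES (?,?,?,?,?,?,?)",
                    (path,
                     str(ds.get("PatientID", "")),
                     str(ds.get("StudyDate", "")),
                     str(ds.get("Modality", "")),
                     str(ds.get("ImageLaterality", "")),
                     int(ds.get("Rows", 0)),
                     int(ds.get("Columns", 0))))
    con.commit()
    return con

# Example query once some hypothetical files have been indexed:
# con = index_dicom(["case001.dcm", "case002.dcm"])
# for row in con.execute("SELECT path, patient_id FROM images WHERE modality='MG'"):
#     print(row)
```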

Keywords: Database Integration, Mammogram Classification, Tumour Classification, Computer Aided Diagnosis.
