Search results for: implicit Kirk-type iterative schemes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1181

551 Optimization of Spatial Light Modulator to Generate Aberration Free Optical Traps

Authors: Deepak K. Gupta, T. R. Ravindran

Abstract:

Holographic Optical Tweezers (HOTs) in general use iterative algorithms such as weighted Gerchberg-Saxton (WGS) to generate multiple traps, which produce traps with 99% uniformity theoretically. But in experiments, it is the phase response of the spatial light modulator (SLM) which ultimately determines the efficiency, uniformity, and quality of the trap spots. In general, SLMs show a nonlinear phase response behavior, and they may even have asymmetric phase modulation depth before and after π. This affects the resolution with which the gray levels are addressed before and after π, leading to a degraded trap performance. We present a method to optimize the SLM for a linear phase response behavior along with a symmetric phase modulation depth around π. Further, we optimize the SLM for its varying phase response over different spatial regions by optimizing the brightness/contrast and gamma of the hologram in different subsections. We show the effect of the optimization on an array of trap spots resulting in improved efficiency and uniformity. We also calculate the spot sharpness metric and trap performance metric and show a tightly focused spot with reduced aberration. The trap performance is compared by calculating the trap stiffness of a trapped particle in a given trap spot before and after aberration correction. The trap stiffness is found to improve by 200% after the optimization.
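
As an illustration of the kind of iterative hologram computation the abstract refers to, here is a minimal weighted Gerchberg-Saxton sketch in Python. The grid size, iteration count, and trap positions are arbitrary assumptions for illustration; the actual method additionally folds in the SLM phase-response calibration described above.

```python
import numpy as np

def weighted_gs(trap_mask, n_iter=30):
    """Weighted Gerchberg-Saxton: compute a phase-only hologram that places
    traps of (ideally) uniform intensity at the pixels where trap_mask is True."""
    ny, nx = trap_mask.shape
    weights = np.ones(trap_mask.sum())
    phase = 2 * np.pi * np.random.default_rng(0).random((ny, nx))
    for _ in range(n_iter):
        # propagate from SLM plane (unit amplitude, current phase) to trap plane
        field = np.fft.fft2(np.exp(1j * phase))
        amps = np.abs(field[trap_mask])
        # boost the weights of weak traps to enforce uniformity
        weights *= amps.mean() / np.maximum(amps, 1e-12)
        # keep the trap-plane phase, impose weighted target amplitudes
        target = np.zeros_like(field)
        target[trap_mask] = weights * np.exp(1j * np.angle(field[trap_mask]))
        # back-propagate and keep only the phase (phase-only SLM)
        phase = np.angle(np.fft.ifft2(target))
    return phase, amps

# four traps on a 64x64 grid (illustrative)
mask = np.zeros((64, 64), dtype=bool)
mask[16, 16] = mask[16, 48] = mask[48, 16] = mask[48, 48] = True
phase, amps = weighted_gs(mask)
uniformity = 1 - (amps.max() - amps.min()) / (amps.max() + amps.min())
```

The weight update is what distinguishes WGS from plain Gerchberg-Saxton: traps that come out dim are given larger target amplitudes on the next round.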

Keywords: spatial light modulator, optical trapping, aberration, phase modulation

Procedia PDF Downloads 170
550 Finite Difference Modelling of Temperature Distribution around Fire Generated Heat Source in an Enclosure

Authors: A. A. Dare, E. U. Iniegbedion

Abstract:

Industrial furnaces generally involve enclosures of fire, typically initiated by the combustion of gases. The fire leads to a temperature distribution inside the enclosure. A proper understanding of the temperature and velocity distributions within the enclosure is often required for optimal design and use of the furnace. This study was therefore directed at numerical modelling of the temperature distribution inside an enclosure, as is typical in a furnace. A mathematical model was developed from the conservation of mass, momentum and energy. The stream function-vorticity formulation of the governing equations was solved by an alternating direction implicit (ADI) finite difference technique. The finite difference formulation obtained was then developed into a computer code. This was used to determine the temperature, velocities, stream function and vorticity. The effect of wall heat conduction was also considered by assuming a one-dimensional heat flow through the wall. The computer code (a MATLAB program) developed was used to determine the aforementioned variables. The results obtained showed that the transient temperature distribution assumed a uniform profile which becomes more chaotic with increasing time. The vertical velocity showed increasingly turbulent behaviour with time, while the horizontal velocity assumed a decreasing laminar behaviour with time. All of these behaviours have also been reported in the literature. The developed model has provided an understanding of the heat transfer process in an industrial furnace.
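
The ADI idea can be sketched on the simpler 2D heat equation (the paper applies it to the stream function-vorticity equations; the grid, diffusivity, and boundary conditions below are illustrative assumptions). Each half step is implicit in one direction only, so each row or column reduces to a tridiagonal solve:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(T, alpha, dt, dx):
    """One Peaceman-Rachford ADI step for T_t = alpha*(T_xx + T_yy),
    with Dirichlet boundaries held fixed."""
    r = alpha * dt / (2 * dx ** 2)
    ny, nx = T.shape
    Th = T.copy()
    # half step: implicit in x, explicit in y
    for j in range(1, ny - 1):
        n = nx - 2
        a, b, c = np.full(n, -r), np.full(n, 1 + 2 * r), np.full(n, -r)
        d = T[j, 1:-1] + r * (T[j + 1, 1:-1] - 2 * T[j, 1:-1] + T[j - 1, 1:-1])
        d[0] += r * T[j, 0]
        d[-1] += r * T[j, -1]
        Th[j, 1:-1] = thomas(a, b, c, d)
    Tn = Th.copy()
    # half step: implicit in y, explicit in x
    for i in range(1, nx - 1):
        n = ny - 2
        a, b, c = np.full(n, -r), np.full(n, 1 + 2 * r), np.full(n, -r)
        d = Th[1:-1, i] + r * (Th[1:-1, i + 1] - 2 * Th[1:-1, i] + Th[1:-1, i - 1])
        d[0] += r * Tn[0, i]
        d[-1] += r * Tn[-1, i]
        Tn[1:-1, i] = thomas(a, b, c, d)
    return Tn

# hot spot in an otherwise cold enclosure (illustrative values)
T0 = np.zeros((21, 21))
T0[10, 10] = 100.0
T1 = adi_step(T0, alpha=1.0, dt=0.01, dx=0.1)
```

The attraction of ADI is that it keeps the unconditional stability of a fully implicit scheme while only ever solving cheap one-dimensional tridiagonal systems.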

Keywords: heat source, modelling, enclosure, furnace

Procedia PDF Downloads 249
549 Impact of Hard Limited Clipping Crest Factor Reduction Technique on Bit Error Rate in OFDM Based Systems

Authors: Theodore Grosch, Felipe Koji Godinho Hoshino

Abstract:

In wireless communications, 3GPP LTE is one of the solutions to meet the demand for greater transmission data rates. One issue inherent to this technology is the high PAPR (Peak-to-Average Power Ratio) of OFDM (Orthogonal Frequency Division Multiplexing) modulation. This high PAPR affects the efficiency of power amplifiers. One approach to mitigate this effect is the Crest Factor Reduction (CFR) technique. In this work, we simulate the impact of the Hard Limited Clipping Crest Factor Reduction technique on BER (Bit Error Rate) in OFDM-based systems. In general, the results showed that CFR has a greater effect on higher-order digital modulation schemes, as expected. More importantly, we show the worst-case degradation due to CFR on QPSK, 16-QAM, and 64-QAM signals in a linear system. For example, hard clipping of 9 dB results in a 2 dB increase in the required signal-to-noise ratio at a 1% BER for 64-QAM modulation.
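
A minimal sketch of hard-limited clipping, assuming a baseband OFDM-like signal and an illustrative 6 dB PAPR target (this is not the paper's exact simulation chain):

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

def hard_clip(x, target_papr_db):
    """Hard-limited clipping: cap the envelope at the amplitude implied by
    the target PAPR, preserving each sample's phase."""
    a_max = np.sqrt(np.mean(np.abs(x) ** 2) * 10 ** (target_papr_db / 10))
    return np.where(np.abs(x) > a_max, a_max * np.exp(1j * np.angle(x)), x)

# OFDM-like signal: IFFT of 256 random QPSK subcarriers (illustrative)
rng = np.random.default_rng(1)
symbols = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
x = np.fft.ifft(symbols) * np.sqrt(256)
y = hard_clip(x, 6.0)
```

Because clipping caps the peak while leaving the average power nearly unchanged, the PAPR of `y` sits close to the 6 dB target; the distortion this introduces is what drives the BER degradation studied above.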

Keywords: bit error rate, crest factor reduction, OFDM, physical layer simulation

Procedia PDF Downloads 354
548 Design Optimization of Chevron Nozzles for Jet Noise Reduction

Authors: E. Manikandan, C. Chilambarasan, M. Sulthan Ariff Rahman, S. Kanagaraj, V. R. Sanal Kumar

Abstract:

The noise regulations around major airports and rocket launching stations, driven by environmental concerns, have made jet noise a crucial problem in present-day aero-acoustics research. The three main acoustic sources in jet nozzles are aerodynamic noise, noise from aircraft systems, and engine and mechanical noise. Note that the majority of engine noise is due to the jet noise coming out of the exhaust nozzle. Previous studies reveal that the potential of chevron nozzles for aircraft engine noise reduction is promising, owing to the fact that jet noise continues to be the dominant noise component, especially during take-off. In this paper, parametric analytical studies have been carried out to optimize the number of chevron lobes, the lobe length and tip shape, and the level of penetration of the chevrons into the flow, over a variety of flow conditions for various aerospace applications. The numerical studies have been carried out using a validated steady 3D density-based solver with the SST k-ω turbulence model and enhanced wall functions. In the numerical study, a fully implicit finite volume scheme for the compressible Navier–Stokes equations is employed. We infer that geometry optimization of an environmentally friendly chevron nozzle, with a suitable number of chevron lobes and aerodynamically efficient tip contours that facilitate silent exit flow, will enable a commendable sound reduction without much thrust penalty compared with conventional supersonic nozzles of the same area ratio.

Keywords: chevron nozzle, jet acoustic level, jet noise suppression, shape optimization of chevron nozzles

Procedia PDF Downloads 305
547 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the ten or fewer most predictive questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an ensemble of individual decision trees that together yield a predicted segment with robust precision and recall scores compared to a single tree. A random 70-30 stratified split was used for training the algorithm, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
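
A rough sketch of the pipeline on synthetic data: fit a full-feature forest, keep the 20 most important features, and refit at depth 10. The dataset sizes and scikit-learn usage are illustrative assumptions, not the study's actual code.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# synthetic stand-in for the survey data (sizes are illustrative)
X, y = make_classification(n_samples=2000, n_features=254, n_informative=20,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

# 70-30 stratified split, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

# full-feature forest, then keep only the 20 most important features
full = RandomForestClassifier(max_depth=10, random_state=0).fit(X_tr, y_tr)
top20 = np.argsort(full.feature_importances_)[-20:]

slim = RandomForestClassifier(max_depth=10, random_state=0).fit(X_tr[:, top20], y_tr)
acc = slim.score(X_te[:, top20], y_te)
```

The `feature_importances_` ranking is what makes the question-reduction step possible: the slim model trades a little accuracy for a survey an order of magnitude shorter.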

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 84
546 Identification and Force Control of a Two Chambers Pneumatic Soft Actuator

Authors: Najib K. Dankadai, Ahmad 'Athif Mohd Faudzi, Khairuddin Osman, Muhammad Rusydi Muhammad Razif, IIi Najaa Aimi Mohd Nordin

Abstract:

Research in soft actuators is growing rapidly because of their suitability for application in sectors such as medicine, agriculture, biology and welfare. This paper presents system identification (SI) and control of the force generated by a two-chamber pneumatic soft actuator (PSA). A mathematical model of the actuator's force was identified experimentally using a data acquisition card and the MATLAB SI toolbox. Two control techniques, predictive functional control (PFC) and conventional proportional-integral-derivative (PID) control, are proposed and compared based on the identified model of the soft actuator's flexible mechanism. The results of this study showed that both of the proposed controllers ensure accurate tracking when the closed-loop system is tested with step, sinusoidal and multi-step reference inputs in MATLAB simulation, although PFC provides a better response than PID.
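
A minimal discrete PID loop on an assumed first-order actuator model gives the flavor of the PID side of the comparison. The gains, time constant, and plant are illustrative placeholders; the study's experimentally identified force model and its PFC controller are not reproduced here.

```python
def simulate_pid(kp, ki, kd, tau=0.5, dt=0.01, steps=1000, setpoint=1.0):
    """Discrete PID tracking a step reference on a first-order plant
    tau*y' + y = u (a stand-in for the identified force model)."""
    y, integ, e_prev = 0.0, 0.0, setpoint
    ys = []
    for _ in range(steps):
        e = setpoint - y
        integ += e * dt                    # integral term
        deriv = (e - e_prev) / dt          # derivative term
        u = kp * e + ki * integ + kd * deriv
        e_prev = e
        y += dt * (u - y) / tau            # forward-Euler plant update
        ys.append(y)
    return ys

ys = simulate_pid(kp=2.0, ki=5.0, kd=0.0)
```

The integral term drives the steady-state tracking error to zero, which is why both controllers in the study achieve accurate tracking; PFC's advantage lies in its model-based anticipation of the reference.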

Keywords: predictive functional control (PFC), proportional integral and derivative (PID), soft actuator, system identification

Procedia PDF Downloads 312
545 Agro-Forestry Expansion in Middle Gangetic Basin: Adopters' Motivations and Experiences in Bihar, India

Authors: Rakesh Tiwary, D. M. Diwakar, Sandhya Mahapatro

Abstract:

Agro-forestry offers huge opportunities for the diversification of agriculture in the middle Gangetic Basin of India, particularly in the state of Bihar, as the region is identified with traditional and stagnant agriculture, low productivity, high population pressure, rural poverty and a lack of agro-industrial development. The region is endowed with favourable agro-climatic, soil and drainage conditions; interestingly, there has been an age-old tradition of agro-forestry in the state. However, due to demographic pressures, declining land holdings and other socio-economic factors, agro-forestry practices have declined in recent decades. The government of Bihar has initiated a special program for the expansion of agro-forestry based on modern practices, with the aim of raising the income level of farmers, making raw material available for wood-based industries and increasing green cover in the state. The Agro-forestry Schemes (Poplar & Other Species) are the key components of the program, implemented by the Department of Environment & Forest, Govt. of Bihar. The paper is based on a fieldwork-based evaluation study of the experiences of implementing the agro-forestry schemes. Understanding adoption patterns, identifying the key motives for practising agro-forestry and documenting the experiences of farmers, as well as analysing the barriers to expansion, constituted the major themes of the research study. This paper is based on primary as well as secondary data. The primary data consist of a beneficiary household survey, Focus Group Discussions among beneficiary communities, dialogue and multi-stakeholder meetings, and field visits to the sites. The secondary information was collected and analysed from official records, policy documents and reports. Primary data were collected from about 500 beneficiary households of Muzaffarpur & Saharsa, two populous, large and agriculture-dominated districts of the middle Gangetic basin of North Bihar. The survey also covers 100 non-beneficiary households.
The Probability Proportionate to Size method was used to determine the number of samples to be covered in the different blocks of the two districts. Qualitative tools were also employed to gain better insights into the key research questions. The present paper discusses the socio-economic background of farmers practising agro-forestry; the adoption patterns of agro-forestry (choice of plants, methods of plantation and others); the motivation behind the adoption of agro-forestry; and the comparative benefits of agro-forestry (vis-a-vis traditional agriculture). The experience of beneficiary farmers with agro-forestry under government programs and promotional campaigns (in terms of awareness, ease of access, knowhow and others) has been covered in the paper. Different aspects of plant survival have been closely examined. Non-beneficiaries who are potential adopters were also interviewed to understand the barriers to adoption of agro-forestry. The paper provides policy recommendations and the interventions required for effective expansion of agro-forestry and the realisation of its future prospects for agricultural diversification in the region.

Keywords: agro-forestry adoption patterns, farmers’ motivations & experiences, Indian middle Gangetic plains, strategies for expansion

Procedia PDF Downloads 193
544 A Heuristic Based Decomposition Approach for a Hierarchical Production Planning Problem

Authors: Nusrat T. Chowdhury, M. F. Baki, A. Azab

Abstract:

The production planning problem is concerned with specifying the optimal quantities to produce in order to meet demand over a prespecified planning horizon with the least possible expenditure. Making the right decisions in production planning directly affects the performance and productivity of a manufacturing firm, which is important for its ability to compete in the market. Therefore, developing and improving solution procedures for production planning problems is very significant. In this paper, we develop a Dantzig-Wolfe decomposition of a multi-item hierarchical production planning problem with a capacity constraint and present a column generation approach to solve it. The original Mixed Integer Linear Programming model of the problem is decomposed item by item into a master problem and a number of subproblems, with the capacity constraint acting as the linking constraint between them. The subproblems are solved using dynamic programming. We also propose a multi-step iterative capacity allocation heuristic to handle any infeasibility that arises while solving the problem. We compare the computational performance of the developed solution approach against the state-of-the-art heuristic procedure available in the literature. The results show that the proposed heuristic-based decomposition approach improves solution quality by 20% compared to the literature.
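
Uncapacitated single-item subproblems of this kind are classically solved by the Wagner-Whitin dynamic program; a minimal sketch follows (the setup and holding costs are illustrative, and the paper's capacity linking and column generation machinery are omitted):

```python
def wagner_whitin(d, K, h):
    """Min-cost uncapacitated lot sizing: d[t] = demand in period t+1,
    K = fixed setup cost per production period, h = per-period holding cost.
    f[t] = min cost to cover periods 1..t; a setup in period j covers j..t."""
    n = len(d)
    f = [0.0] + [float("inf")] * n
    for t in range(1, n + 1):
        for j in range(1, t + 1):
            # holding cost of producing demand for periods j..t in period j
            hold = sum(h * (k - j) * d[k - 1] for k in range(j, t + 1))
            f[t] = min(f[t], f[j - 1] + K + hold)
    return f[n]
```

With demands `[10, 10]`, a cheap setup (`K=5`) makes two separate lots optimal, while an expensive setup (`K=20`) makes one combined lot (paying holding cost) optimal; the DP picks the cheaper option in each case.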

Keywords: inventory, multi-level capacitated lot-sizing, emission control, setup carryover

Procedia PDF Downloads 132
543 Evaluation Framework for Investments in Rail Infrastructure Projects

Authors: Dimitrios J. Dimitriou, Maria F. Sartzetaki

Abstract:

Transport infrastructures are high-cost, long-term investments that serve as vital foundations for the operation of a region or nation; they are essential to a country's or business's economic development and prosperity, improving well-being and generating jobs and income. The development of appropriate financing options is of key importance in the decision-making process in order to develop viable transport infrastructures. The financing of transport infrastructure has increasingly been shifting toward alternative methods such as Public-Private Partnerships (PPPs) and hybrid forms. In this paper, a methodological decision-making framework is presented, based on evaluating the financial viability of transport infrastructure under different financing schemes. The framework leads to an assessment of financial viability achieved by performing various financing scenario analyses. To illustrate the application of the proposed methodology, a case study of rail transport infrastructure financing scenario analysis in Greece is developed.
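
At its core, this kind of scenario analysis compares discounted cash flows under different financing assumptions; a toy sketch follows (all figures, horizons, and discount rates are hypothetical and are not taken from the Greek case study):

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# hypothetical 30-year rail concession cash flows (EUR m), purely illustrative
capex = [-120.0]
scenarios = {
    "public": npv(0.04, capex + [9.0] * 30),    # lower public discount rate
    "ppp":    npv(0.08, capex + [11.0] * 30),   # higher private rate, user fees
}
viable = {name: value > 0 for name, value in scenarios.items()}
```

Running the same project through several such scenarios (varying rates, revenues, and risk allocation) is what turns the framework into a financing decision tool.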

Keywords: rail transport infrastructure, financial viability, scenario analysis, rail project feasibility

Procedia PDF Downloads 265
542 An Inspection of Two Layer Model of Agency: An fMRI Study

Authors: Keyvan Kashkouli Nejad, Motoaki Sugiura, Atsushi Sato, Takayuki Nozawa, Hyeonjeong Jeong, Sugiko Hanawa , Yuka Kotozaki, Ryuta Kawashima

Abstract:

The perception of agency/control is altered in the presence of discrepancies in the environment: when predictions (of possible results) and actual results mismatch, the sense of agency might become altered. Synofzik et al. proposed a two-layer model of agency. In the first layer, the Feeling of Agency (FoA) is not directly available to awareness; a slight mismatch in the environment/outcome might cause alterations in FoA while the agent still feels in control. If the discrepancy passes a threshold, it becomes available to consciousness and alters the Judgment of Agency (JoA), which is directly available in the person's awareness. Most experiments so far have investigated only the subjects' rather conscious JoA, while FoA has been neglected. In this experiment, we target FoA by using subliminal discrepancies that cannot be consciously detected by the subjects. Here, we explore whether we can detect this two-level model in the subjects' behavior and then try to map it onto their brain activity. To do this, in an fMRI study, we incorporated both consciously detectable mismatches between action and result and subliminal discrepancies in the environment. Also, unlike previous experiments, where subjective questions to the participants mainly trigger the rather conscious JoA, we tried to measure the rather implicit FoA by asking participants to rate their performance. We compared behavioral results and brain activation when there were conscious discrepancies and when there were subliminal discrepancies against trials with no discrepancies, and against each other. In line with our expectations, conditions with consciously detectable incongruencies triggered lower JoA ratings than conditions without. Also, conditions with any type of discrepancy had lower FoA ratings compared to conditions without. Additionally, we found the TPJ, and the angular gyrus in particular, to have a role in the coding of JoA and also FoA.

Keywords: agency, fMRI, TPJ, two layer model

Procedia PDF Downloads 462
541 Civilization and Violence: Islam, the West, and the Rest

Authors: Imbesat Daudi

Abstract:

One of the most discussed topics of the last century is whether Islamic civilization is violent. Many Western intellectuals have promoted the notion that it is. Citing 9/11, in which 3000 civilians were killed, they argue that Muslims are prone to violence because Islam promotes violence. Muslims, however, reject this notion as nonsense. This topic has not been properly addressed. First, the violence of civilizations cannot be proven by citing religious texts, which have nevertheless been used in discussions of civilizational violence. Secondly, the question of whether Muslims are violent is inappropriate, as it carries an implicit bias suggesting that Islamic civilization is violent; a proper question is which civilization is more violent. Third, whether Islamic civilization is indeed violent can only be established if more war-related casualties can be documented within the borders of Islamic civilization than within those of its cohorts. This has never been done. Finally, the violent behavior of Muslim countries can be examined by comparing acts of violence committed by Muslim countries with those of groups of nations belonging to other civilizations, using appropriate parameters of violence. Therefore, parameters reflecting group violence were defined; violent conflicts of various civilizations over the last two centuries were documented, quantified by the number of conflicts and the number of victims, and compared with each other following established principles of statistics. The results show that whereas 80% of genocides and massacres were conducted by Western nations, less than 5% of acts of violence were committed by Muslim countries. Furthermore, the West has the highest incidence (new) and prevalence (new and old) of violent conflicts among all groups of nations. The result is unambiguous and statistically significant. Becoming informed can only be achieved by a methodical collection of relevant data, objective analysis, and unbiased information, a process which this paper follows.

Keywords: Islam and violence, demonization of Muslims, violence and the West, comparison of civilizational violence

Procedia PDF Downloads 41
540 A General Variable Neighborhood Search Algorithm to Minimize Makespan of the Distributed Permutation Flowshop Scheduling Problem

Authors: G. M. Komaki, S. Mobin, E. Teymourian, S. Sheikh

Abstract:

This paper addresses minimizing the makespan of the distributed permutation flow shop scheduling problem. In this problem, there are several parallel identical factories, or flowshops, each with a series of similar machines. Each job should be allocated to one of the factories, and all of the job's operations should be performed in the allocated factory. This problem has recently gained attention, and due to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The majority of the proposed algorithms require large computational time, which is their main drawback. In this study, a general variable neighborhood search (GVNS) algorithm is proposed into which several time-saving schemes have been incorporated. The GVNS also uses a sophisticated method to change the shaking procedure, or perturbation, depending on the progress of the incumbent solution, in order to prevent stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
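
A stripped-down version of the search skeleton, with job relocation across factories as shaking and adjacent-swap local search within factories (the paper's time-saving schemes and adaptive shaking are omitted, and the instance data is random):

```python
import random

def makespan(perm, p):
    """Flowshop completion time of one factory's job permutation;
    p[j][k] = processing time of job j on machine k."""
    if not perm:
        return 0
    m = len(p[0])
    c = [0] * m
    for j in perm:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def dist_makespan(factories, p):
    """Distributed makespan = slowest factory's makespan."""
    return max(makespan(f, p) for f in factories)

def gvns(p, n_factories, iters=300, seed=0):
    """Shake by relocating a random job to a random factory position,
    then improve with first-improvement adjacent swaps inside factories."""
    rng = random.Random(seed)
    jobs = list(range(len(p)))
    factories = [jobs[i::n_factories] for i in range(n_factories)]
    best_val = dist_makespan(factories, p)
    for _ in range(iters):
        cand = [f[:] for f in factories]
        src = rng.randrange(n_factories)
        if not cand[src]:
            continue
        job = cand[src].pop(rng.randrange(len(cand[src])))
        dst = rng.randrange(n_factories)
        cand[dst].insert(rng.randrange(len(cand[dst]) + 1), job)
        improved = True
        while improved:
            improved = False
            for f in cand:
                for i in range(len(f) - 1):
                    before = dist_makespan(cand, p)
                    f[i], f[i + 1] = f[i + 1], f[i]
                    if dist_makespan(cand, p) < before:
                        improved = True
                    else:
                        f[i], f[i + 1] = f[i + 1], f[i]  # undo the swap
        val = dist_makespan(cand, p)
        if val < best_val:
            best_val, factories = val, cand
    return best_val

rng = random.Random(42)
p = [[rng.randint(1, 9) for _ in range(3)] for _ in range(10)]  # 10 jobs, 3 machines
init = dist_makespan([list(range(0, 10, 2)), list(range(1, 10, 2))], p)
best = gvns(p, n_factories=2)
```

Since the search only ever accepts improving moves from the round-robin start, the returned makespan can never exceed the initial one; the paper's contribution lies in making each of these steps cheap and in adapting the shaking strength over time.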

Keywords: distributed permutation flow shop, scheduling, makespan, general variable neighborhood search algorithm

Procedia PDF Downloads 344
539 Optimizing Fire Tube Boiler Design for Efficient Saturated Steam Production at 2000kg/h

Authors: Yoftahe Nigussie Worku

Abstract:

This study focused on designing a fire tube boiler to generate saturated steam with a capacity of 2000 kg/h at a design pressure of 12 bar. The primary project goal is to achieve efficient steam production while minimizing costs. This involves selecting suitable materials for component parts, employing cost-effective construction methods, and optimizing various parameters. The analysis phase employs iterative processes and relevant formulas to determine key design parameters. This includes optimizing the tube diameter for the overall heat transfer coefficient, considering a two-pass configuration due to tube and shell size, and using heavy fuel oil no. 6 with specific heating values. The designed boiler consumes 140.37 kg/h of fuel, producing 1610 kW of heat at an efficiency of 85.25%. The fluid flow is configured as cross flow, leveraging its inherent advantages. The tube arrangement involves welding the tubes inside the shell, which is connected to the tube sheet using a combination of gaskets and welding. The design of the shell adheres to the European Standard code for pressure vessels, accounting for weight and supplementary accessories and providing detailed drawings for components such as lifting lugs, openings, ends, manholes, and supports.
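
A back-of-the-envelope heat balance is consistent with the stated capacity. The steam-table enthalpies, feedwater temperature, and fuel heating value below are assumed round values, so the implied fuel rate lands near, but not exactly at, the reported 140.37 kg/h:

```python
# heat duty for 2000 kg/h of saturated steam at ~12 bar (assumed property values)
m_dot = 2000.0 / 3600.0      # steam mass flow, kg/s
h_steam = 2784.0             # kJ/kg, saturated vapour enthalpy near 12 bar (approx.)
h_feed = 84.0                # kJ/kg, feedwater at ~20 C (assumed)
lhv = 42500.0                # kJ/kg, heavy fuel oil no. 6 lower heating value (approx.)
eff = 0.8525                 # boiler efficiency reported in the study

q_out = m_dot * (h_steam - h_feed)          # useful heat duty, kW
fuel_kg_h = q_out / (eff * lhv) * 3600.0    # implied fuel consumption, kg/h
```

The computed duty (about 1500 kW) and fuel rate (about 149 kg/h) sit in the same range as the study's 1610 kW and 140.37 kg/h; the residual gap reflects the assumed feedwater temperature and property values.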

Keywords: efficiency, coefficient, saturated steam, fire tube

Procedia PDF Downloads 48
538 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants

Authors: Balgaisha Mukanova, Natalya Glazyrina, Sergey Glazyrin

Abstract:

This article considers the problem of optimizing the water treatment process for thermal power plants. The problem is multiparametric in nature. To optimize the process, namely to reduce the amount of waste water, a new technology was developed to reuse such water, and a mathematical model of this wastewater-reuse technology was constructed. Optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange for the non-equilibrium case, and an equation for the ion exchange isotherm. The material balance equation includes a nonlinear term that depends on the kinetics of ion exchange. The direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically, approximated by an implicit point-to-point computational difference scheme. The inverse problem was formulated as the determination of the parameters of the mathematical model of a water treatment plant operating in non-equilibrium conditions, and was solved. From the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multiparametric optimization of the water treatment process for thermal power plants reduced the amount of wastewater by 15%.
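
An implicit point-to-point (backward-Euler upwind) scheme of the kind mentioned can be sketched for a simplified 1D transport equation with a first-order exchange sink; the paper's full non-equilibrium kinetics and isotherm are not modeled, and all numbers are illustrative:

```python
import numpy as np

def implicit_upwind(c0, v, k, dx, dt, steps, c_in=1.0):
    """Backward-Euler upwind scheme for c_t + v*c_x = -k*c.
    Because the stencil only looks upstream, each new time level is solved
    point-to-point in the flow direction, with no matrix inversion, and the
    scheme is unconditionally stable."""
    lam = v * dt / dx
    c = np.asarray(c0, dtype=float).copy()
    for _ in range(steps):
        new = np.empty_like(c)
        prev = c_in                      # inlet boundary at the new time level
        for i in range(len(c)):
            new[i] = (c[i] + lam * prev) / (1.0 + lam + k * dt)
            prev = new[i]
        c = new
    return c

# breakthrough of a unit inlet concentration through a clean column
breakthrough = implicit_upwind(np.zeros(50), v=1.0, k=0.0, dx=0.1, dt=0.1, steps=200)
# same column with an exchange sink removing impurity along the way
with_sink = implicit_upwind(np.zeros(50), v=1.0, k=0.5, dx=0.1, dt=0.1, steps=400)
```

Watching when the outlet concentration in such a simulation crosses a threshold is, in spirit, how the start time of filter regeneration is determined from the direct problem.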

Keywords: direct problem, multiparametric optimization, optimization parameters, water treatment

Procedia PDF Downloads 378
537 Presuppositions and Implicatures in Four Selected Speeches of Osama Bin Laden's Legitimisation of 'Jihad'

Authors: Sawsan Al-Saaidi, Ghayth K. Shaker Al-Shaibani

Abstract:

This paper investigates certain linguistic properties of four selected speeches by Al-Qaeda's former leader Osama bin Laden, who, while alive, legitimated the use of jihad by Muslims in various countries. The researchers adopt van Dijk's (2009; 1998) Socio-Cognitive approach and Ideological Square theory, respectively. The Socio-Cognitive approach revolves around the various cognitive, socio-political, and discursive aspects that can be found in political discourse such as Osama bin Laden's. Political discourse can be defined in terms of textual properties and contextual models. The ideological square refers to positive self-presentation and negative other-presentation, which help to enhance the textual and contextual analyses. Among the most significant properties in Osama bin Laden's discourse is the use of presuppositions and implicatures, which are based on background knowledge and contextual models. Thus, the paper concludes that Osama bin Laden used a number of manipulative strategies which augmented and embellished the use of 'jihad' in order to develop a more effective discourse for his audience. In addition, the findings reveal that bin Laden used different implicit and embedded interpretations of different topics, presented as taken-for-granted truths, to legitimate jihad against his enemies. Many presuppositions in the speeches analysed result in particular common-sense assumptions and a world-view about the selected speeches. More importantly, the assumptions in the analysed speeches help consolidate the ideological analysis in terms of in-group and out-group members.

Keywords: Al-Qaeda, cognition, critical discourse analysis, Osama Bin Laden, jihad, implicature, legitimisation, presupposition, political discourse

Procedia PDF Downloads 227
536 Requirements Management in Agile

Authors: Ravneet Kaur

Abstract:

The concept of Agile Requirements Engineering and Management is not new. However, figuring out how a traditional Requirements Management process fits within an Agile framework remains complex. This paper describes a process that can merge an organization's traditional Requirements Management process neatly into the Agile software development process. This process provides traceability of the Product Backlog to the external documents on one hand and to User Stories on the other. It also gives sufficient evidence, in the form of various statistics and reports, that the system will deliver the right functionality with good quality. In a nutshell, by overlaying a process on top of Agile without disturbing the agility, we are able to get synergistic benefits in terms of productivity, profitability, reporting, and end-to-end visibility for all stakeholders. The framework can be used for just-in-time requirements definition or to build a repository of requirements for future use. The goal is to make sure that the business (specifically, the product owner) can clearly articulate what needs to be built and define what constitutes high quality. To accomplish this, the requirements cycle follows a Scrum-like process that mirrors the development cycle but stays two to three steps ahead. The goal is to create a process by which requirements can be thoroughly vetted, organized, and communicated in a manner that is iterative, timely, and quality-focused. Agile is quickly becoming the most popular way of developing software because it fosters continuous improvement, time-boxed development cycles, and delivering value to the end users more quickly. That value will be driven to a large extent by the quality and clarity of requirements that feed the software development process. An agile, lean, and timely approach to requirements as the starting point will help ensure that the process is optimized.

Keywords: requirements management, Agile

Procedia PDF Downloads 360
535 Partnerships between Public Administration and Private Social Investment for Territorial Development: Lessons after 15 Brazilian Cases

Authors: Graziela D. de Azevedo, Livia M. Pagotto, Mario P. Monzoni, Neto

Abstract:

This article discusses partnerships between public administration and private social investment aimed at territorial development. In Brazil, private social investors have been moving closer to territorial development policies, both in highly vulnerable territories and in places where the business sector operates. This is the paper's main justification: to advance the academic debate about how businesses, institutes, and foundations have been working alongside local governments, taking the territory as the reference for joint action. The research was based on the literature on governance and territorial development and adopted a mixed iterative approach (inductive and deductive) through an interpretative lens, so as to develop an analysis structure that complements and expands knowledge about the contribution of public policies and private social investments to territorial development in Brazil. The analysis of 15 cases, grouped into three distinct blocks (territorial development plans, articulation for education, and thematic approaches), has made it possible to identify common elements regarding the motivations for the partnerships, the specific needs of the actors involved, and the priority drivers for stimulating development. Findings include discussion of the leading role of territories in their own development paths, of the institutionalization and strengthening of capacities, and of long-term perspectives in development strategies.

Keywords: private social investment, public administration, territorial governance, territorial development

Procedia PDF Downloads 197
534 A Comparative Analysis of Residential Quality of Public and Private Estates in Lagos

Authors: S. Akinde, Jubril Olatunbosun

Abstract:

In recent years, most urban centers in Nigeria have been experiencing housing problems such as unaffordable housing and environmental challenges, all of which determine the nature of housing quality. The population continues to increase, and the demand for quality housing increases at probably the same rate. Several kinds of houses serve various purposes; the objective of the low cost housing schemes, as the name suggests, is to make quality housing available to both the middle and lower classes of people in Lagos. A casual look into the study areas of the Iba Low Cost Housing Estate and the Unity Low Cost Housing Estate, in Ojo and Alimosho respectively in Lagos State, shows a huge demand for houses. The study area boasts a large population, all engaged in various commercial activities with income at various levels. It would be fair to say that these people are mainly of the middle and lower classes, which means the low cost housing scheme truly serves these purposes. The Iba Low Cost Housing Scheme is publicly owned, while the Unity Estate (UE) Low Cost Housing Scheme is privately owned.

Keywords: housing, residential quality, low cost housing scheme, public, private estates

Procedia PDF Downloads 543
533 Representative Concentration Pathways Approach on Wolbachia Controlling Dengue Virus in Aedes aegypti

Authors: Ida Bagus Mandhara Brasika, I Dewa Gde Sathya Deva

Abstract:

Wolbachia has recently been developed as a natural enemy of the dengue virus (DENV): it inhibits the replication of DENV in Aedes aegypti. Both DENV and its vector, Aedes aegypti, are sensitive to climate factors, especially temperature. Climate change has a direct impact on temperature, and thus on vector transmission. Temperature is known to affect Wolbachia density, as the bacterium has an ideal temperature range for growth. Scenarios known as Representative Concentration Pathways (RCPs) have been developed by the Intergovernmental Panel on Climate Change (IPCC) to predict future climate based on greenhouse gas concentrations. These scenarios are applied here to anticipate future changes in Aedes aegypti migration and in how Wolbachia could control the virus. The prediction will determine the schemes for releasing Wolbachia-injected Aedes aegypti to reduce DENV transmission.

Keywords: Aedes aegypti, climate change, dengue virus, Intergovernmental Panel on Climate Change, representative concentration pathways, Wolbachia

Procedia PDF Downloads 293
532 Graph-Oriented Summary for Optimized Resource Description Framework Graphs Streams Processing

Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais

Abstract:

Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains, such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before the entire session finishes, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of RDF data streams, as the most relevant and pertinent information in the data is often not used in due time and is almost impossible to exploit in further analyses. It would be better to keep the most informative part of the data within streams while minimizing memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs, through three main approaches: (1) an approach for user queries (SPARQL) that extracts their needs and groups them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and of the expected approximate query results on summarized graphs compared to the source ones.
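The authors' extended centrality measure is not specified in the abstract; as a rough illustration of approach (2) only, the classic closeness centrality from SNA can be computed with BFS over a toy RDF-like graph. All node names below are hypothetical.

```python
from collections import deque

def closeness_centrality(graph):
    """Classic closeness centrality: inverse of the average shortest-path
    distance from a node to the nodes it can reach (BFS, unit edge weights),
    with the usual normalization for possibly disconnected graphs."""
    scores = {}
    n = len(graph)
    for source in graph:
        # BFS from source to get shortest-path distances
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        reachable = len(dist) - 1
        total = sum(dist.values())
        scores[source] = (reachable / total) * (reachable / (n - 1)) if total else 0.0
    return scores

# Toy RDF-like graph: nodes are resources, edges come from triples (predicates dropped)
graph = {
    "ex:sensor1": ["ex:station"],
    "ex:sensor2": ["ex:station"],
    "ex:station": ["ex:sensor1", "ex:sensor2", "ex:city"],
    "ex:city": ["ex:station"],
}
scores = closeness_centrality(graph)
top = max(scores, key=scores.get)  # most central node, a candidate for the summary
```

Under such a measure, hub resources like `ex:station` score highest and would be retained in the summarized graph.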

Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query

Procedia PDF Downloads 183
531 A Family of Second Derivative Methods for Numerical Integration of Stiff Initial Value Problems in Ordinary Differential Equations

Authors: Luke Ukpebor, C. E. Abhulimen

Abstract:

Stiff initial value problems in ordinary differential equations are problems whose typical solutions decay rapidly and exponentially, and whose numerical investigation is very tedious. Conventional numerical integration solvers cannot cope effectively with stiff problems because they lack adequate stability characteristics. In this article, we develop a new family of four-step second derivative exponentially fitted methods of order six for the numerical integration of stiff initial value problems for general first order differential equations. In deriving our method, we employed the idea of breaking the general multi-derivative multistep method into predictor and corrector schemes possessing free parameters that allow automatic fitting to exponential functions. The stability analysis of the method is discussed, and the method is implemented on numerical examples. The results show that the method is A-stable and competes favorably with existing methods in terms of efficiency and accuracy.
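The order-six exponentially fitted pair itself is not reproduced in the abstract; the predictor-corrector idea it builds on can be illustrated with the classic Heun scheme (explicit Euler predicts, the trapezoidal rule corrects) on a mildly stiff test problem. This is a generic sketch, not the authors' method.

```python
import math

def heun_step(f, t, y, h):
    """One predictor-corrector step: explicit Euler predicts,
    the trapezoidal rule corrects (a second-order accurate pairing)."""
    y_pred = y + h * f(t, y)                         # predictor
    return y + h / 2 * (f(t, y) + f(t + h, y_pred))  # corrector

# Mildly stiff test problem y' = -5y, exact solution y(t) = exp(-5t)
lam = 5.0
f = lambda t, y: -lam * y
t, y, h = 0.0, 1.0, 0.01
for _ in range(100):  # integrate from t = 0 to t = 1
    y = heun_step(f, t, y, h)
    t += h
exact = math.exp(-lam)
```

For genuinely stiff problems this explicit pairing forces tiny step sizes, which is exactly the limitation that A-stable second derivative methods like the one above are designed to remove.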

Keywords: A-stable, exponentially fitted, four step, predictor-corrector, second derivative, stiff initial value problems

Procedia PDF Downloads 244
530 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for these models, because the payoffs are nonsmooth or have discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to avoid the cost of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
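The O(M log M) matrix-vector product rests on the convolution theorem: a circulant matrix (into which the Toeplitz matrices arising from jump integrals can be embedded) acts on a vector via three FFTs. A stdlib-only sketch of that core trick, independent of the authors' implementation:

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two.
    With invert=True it computes the unnormalized inverse transform."""
    n = len(a)
    if n == 1:
        return a[:]
    sign = 1 if invert else -1
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by vector x in
    O(M log M) via the convolution theorem, instead of the O(M^2) direct product."""
    n = len(c)
    C, X = fft(c), fft(x)
    y = fft([a * b for a, b in zip(C, X)], invert=True)
    return [v.real / n for v in y]  # normalize the inverse transform

# 4x4 circulant matrix with first column [4, 1, 0, 1] applied to x
c = [4.0, 1.0, 0.0, 1.0]
x = [1.0, 2.0, 3.0, 4.0]
y = circulant_matvec(c, x)  # equals the direct matrix-vector product [10, 12, 18, 20]
```

The same three-transform pattern scales to the large M needed for fine spatial grids, which is where the O(M²) direct product becomes prohibitive.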

Keywords: integral differential equations, jump–diffusion model, American options, rational approximation

Procedia PDF Downloads 106
529 Using Photo-Elicitation to Explore the Cosmology of Personal Training

Authors: John Gray, Andy Smith, Hazel James

Abstract:

With the introduction of projects such as GP referral and other medical exercise schemes, there has been a shift in the cosmology underpinning exercise leadership. That is, the knowledge base of exercise leaders, specifically personal trainers, has moved from a cosmology based on aesthetic and physical fitness demands to one requiring interaction with the dominant biomedical model underpinning contemporary medicine. In line with this shift research has demonstrated that personal trainer education has aligned itself to a biotechnological model. However, whilst there is a need to examine exercise as medicine, and consider the role of personal trainers as prescribers of these interventions, the possible issues surrounding the growing medicalization of the exercise cosmology have not been explored. Using a phenomenological methodology, and the novel approach of photo-elicitation, this research examined the practices of successful personal trainers. The findings highlight that a growing focus on an iatro-biological based scientific process of exercise prescription may prove problematical. Through the development of a model of practitioner-based knowledge, it is argued there is a possible growing disconnection between the theoretical basis of exercise science and the working cosmology of exercise practitioners.

Keywords: biomedicine, cosmology, personal training, photo-elicitation

Procedia PDF Downloads 370
528 Influence of Intelligence and Failure Mindsets on Parent's Failure Feedback

Authors: Sarah Kalaouze, Maxine Iannucelli, Kristen Dunfield

Abstract:

Children’s implicit beliefs regarding intelligence (i.e., intelligence mindsets) influence their motivation, perseverance, and success. Previous research suggests that the way parents perceive failure influences the development of their child’s intelligence mindsets. We invited 151 child-parent dyads (age 5–6 years) to complete a series of difficult puzzles over Zoom. We assessed parents’ intelligence and failure mindsets using questionnaires and recorded parents’ person/performance-oriented (e.g., “you are smart” or “you were almost able to complete that one”) and process-oriented (e.g., “you are trying really hard” or “maybe if you place the bigger pieces first”) failure feedback. We were interested in the relation between parental mindsets and the type of feedback provided. We found that parents’ intelligence mindsets were not predictive of the feedback they provided to children. Failure mindsets, on the other hand, were predictive of failure feedback. Parents who view failure as debilitating provided more person-oriented feedback, focusing on performance and personal ability, whereas parents who view failure as enhancing provided process-oriented feedback, focusing on effort and strategies. Taken together, our results show that although parents might already hold a growth intelligence mindset, they do not necessarily hold a failure-as-enhancing mindset. Parents adopting a failure-as-enhancing mindset would influence their children to view failure as a learning opportunity, further promoting practice, effort, and perseverance during challenging tasks. The focus placed on a child’s learning, rather than their performance, encourages them to perceive intelligence as malleable (growth mindset) rather than fixed (fixed mindset). This implies that parents should not only hold a growth mindset but also thoroughly understand their role in the transmission of intelligence beliefs.

Keywords: mindset(s), failure, intelligence, parental feedback, parents

Procedia PDF Downloads 131
527 Comparison of Computer Software for Swept Path Analysis on Example of Special Paved Areas

Authors: Ivana Cestar, Ivica Stančerić, Saša Ahac, Vesna Dragčević, Tamara Džambas

Abstract:

On special paved areas, such as road intersections, vehicles usually move through horizontal curves with smaller radii and occupy a considerably greater area than on open road segments. The planning procedure for these areas is mainly an iterative process that consists of designing project elements, assembling those elements into a design project, and analyzing swept paths for the design vehicle. If the applied elements do not fulfill the swept path requirements for the design vehicle, the process must be carried out again. Specialized computer software for swept path analysis significantly facilitates the planning of special paved areas. Various software packages of this kind are available on the global market, each with different specifications. In this paper, two packages commonly used in Croatia (AutoTURN and Vehicle Tracking) are compared: their advantages and disadvantages are described, and their applicability to a particular paved area is discussed. In order to reveal which of the analyzed packages is more favorable in terms of swept path widths, which includes input parameters more relevant for this kind of analysis, and which is more suitable for application to a certain special paved area, the analysis shown in this paper was conducted on a number of different intersection types.

Keywords: software comparison, special paved areas, swept path analysis, swept path input parameters

Procedia PDF Downloads 310
526 A Clustering Algorithm for Massive Texts

Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen

Abstract:

Internet users face a massive amount of textual data every day. Organizing texts into categories can help users dig useful information out of large-scale text collections. Clustering is, in fact, one of the most promising tools for categorizing texts due to its unsupervised character. Unfortunately, most traditional clustering algorithms lose their effectiveness on large-scale text collections, mainly because of the high-dimensional vectors generated from texts. To cluster large-scale text collections effectively and efficiently, this paper proposes a vector reconstruction based clustering algorithm in which only the features that can represent a cluster are preserved in the cluster’s representative vector. The algorithm alternates between two sub-processes until it converges. One is the partial tuning sub-process, in which each feature’s weight is fine-tuned iteratively; to accelerate clustering, an intersection-based similarity measurement and its corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, in which features are reallocated among different clusters and features useless for representing a cluster are removed from the cluster’s representative vector. Experimental results on three text collections (two small-scale and one large-scale) demonstrate that our algorithm achieves high quality on both small-scale and large-scale text collections.
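The exact intersection-based similarity measure is not given in the abstract; one plausible reading, offered purely as an assumption, scores a document against a cluster's representative vector by the weighted overlap of their shared features (a generalized Jaccard measure over sparse vectors):

```python
def intersection_similarity(doc, center):
    """Hypothetical intersection-based similarity (NOT the paper's exact measure):
    weighted overlap of shared features between a document vector and a cluster's
    representative vector, both stored as sparse {feature: weight} dicts."""
    shared = set(doc) & set(center)
    if not shared:
        return 0.0
    inter = sum(min(doc[f], center[f]) for f in shared)
    union = sum(max(doc.get(f, 0.0), center.get(f, 0.0))
                for f in set(doc) | set(center))
    return inter / union

# Toy sparse vectors; feature names are illustrative
doc = {"text": 2.0, "cluster": 1.0, "vector": 1.0}
center = {"text": 1.0, "cluster": 3.0, "graph": 1.0}
sim = intersection_similarity(doc, center)  # 2.0 / 7.0
```

Because only the features kept in the representative vector ever enter the intersection, such a measure is cheap to evaluate even when the raw vocabulary is huge, which fits the paper's emphasis on pruned representative vectors.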

Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process

Procedia PDF Downloads 417
525 Negative Sequence-Based Protection Techniques for Microgrid Connected Power Systems

Authors: Isabelle Snyder, Travis Smith

Abstract:

Microgrid protection presents challenges to conventional protection techniques due to the low induced fault current. Protection relays in microgrid applications require a combination of settings groups that adjust based on the architecture of the microgrid in islanded and grid-connected modes. In a radial system where the microgrid is at the other end of the feeder, directional elements can be used to identify the direction of the fault current and switch settings groups accordingly (grid-connected or microgrid-connected). However, with multiple microgrid connections, this concept becomes more challenging, and the direction of the current alone is not sufficient to identify the source of the fault current contribution. ORNL has previously developed adaptive relaying schemes through other DOE-funded research projects, which will be evaluated and used as a baseline for this research. The four protection techniques in this study are labeled as follows: (1) Adaptive Current-only Protection System (ACPS), (2) Intentional Unbalanced Control for Protection Control (IUCPC), (3) Adaptive Protection System with Communication Controller (APSCC), and (4) Adaptive Model-Driven Protective Relay (AMDPR).
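The sequence components these schemes monitor come from the standard Fortescue transform. As a minimal illustration (not ORNL's relay logic), the negative-sequence current I2 = (Ia + a²Ib + aIc)/3 vanishes for a balanced phase set and grows under an unbalanced fault, which is why it is a useful trip quantity even when fault current magnitudes are low:

```python
import cmath

A = cmath.exp(2j * cmath.pi / 3)  # rotation operator a = 1 at 120 degrees

def negative_sequence(ia, ib, ic):
    """Fortescue transform, negative-sequence term: I2 = (Ia + a^2*Ib + a*Ic) / 3."""
    return (ia + A ** 2 * ib + A * ic) / 3

# Balanced (pure positive-sequence) phase currents: I2 vanishes
ia = cmath.rect(100.0, 0.0)
ib = cmath.rect(100.0, -2 * cmath.pi / 3)
ic = cmath.rect(100.0, 2 * cmath.pi / 3)
balanced = abs(negative_sequence(ia, ib, ic))

# Unbalanced condition (phase A collapses to 20% magnitude, an illustrative value):
# I2 becomes significant even though the overall current magnitudes are modest
faulted = abs(negative_sequence(0.2 * ia, ib, ic))
```

A negative-sequence element compares |I2| (or the I2/I1 ratio) against a pickup threshold, which does not depend on large fault current magnitudes the way overcurrent elements do.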

Keywords: adaptive relaying, microgrid protection, sequence components, islanding detection

Procedia PDF Downloads 74
524 Routing and Energy Efficiency through Data Coupled Clustering in Large Scale Wireless Sensor Networks (WSNs)

Authors: Jainendra Singh, Zaheeruddin

Abstract:

A typical wireless sensor network (WSN) consists of several tiny, low-power sensors that use radio frequency to perform distributed sensing tasks. The longevity of WSNs is a major issue that limits the application of such networks. While routing protocols strive to save energy by acting on sensor nodes, recent studies show that network lifetime can be further enhanced by involving sink mobility. A common approach to energy efficiency is partitioning the network into clusters with correlated data, where representative nodes simply transmit or average the measurements inside the cluster. In this paper, we propose an energy-efficient homogeneous clustering (EHC) technique in which each sensor's decision is based on its residual energy and on an estimate of how many of its neighboring cluster heads (CHs) would benefit from it being a CH. We also explore the routing algorithm in clustered WSNs. We show that the proposed schemes significantly outperform current approaches in terms of packet delay, hop count, and energy consumption.

Keywords: wireless sensor network, energy efficiency, clustering, routing

Procedia PDF Downloads 251
523 Numerical Regularization of Ill-Posed Problems via Hybrid Feedback Controls

Authors: Eugene Stepanov, Arkadi Ponossov

Abstract:

Many mathematical models used in biological and other applications are ill-posed. The reason lies in the nature of the differential equations, where the nonlinearities are assumed to be step functions, a simplification made to ease the analysis. Prominent examples are switched systems arising from gene regulatory networks and neural field equations. This simplification leads, however, to theoretical and numerical complications. In the presentation, it is proposed to apply the theory of hybrid feedback controls to regularize the problem. Roughly speaking, one attaches a finite state control (‘automaton’) that follows the trajectories of the original system and governs its dynamics at the points of ill-posedness. The construction of the automaton is based on the classification of the attractors of a specially designed adjoint dynamical system. This ‘hybridization’ is shown to regularize the original switched system and gives rise to efficient hybrid numerical schemes. Several examples are provided in the presentation which support the suggested analysis. The method may be of interest in other applied fields where differential equations contain step-like nonlinearities.
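The hybrid automaton construction is beyond a short sketch, but the step-function right-hand sides it regularizes are easy to exhibit. Below, a Heaviside switch is contrasted with its singular-perturbation smoothing (a steep sigmoid), one classical alternative that the hybrid approach is designed to improve upon; the equation and all parameter values are illustrative, not taken from the presentation:

```python
import math

def step(x):
    """Ill-posed right-hand side: a Heaviside switch, as in gene network models."""
    return 1.0 if x >= 0 else 0.0

def sigmoid(x, eps=1e-2):
    """Singular-perturbation regularization: a steep sigmoid replacing the step."""
    return 1.0 / (1.0 + math.exp(-x / eps))

def integrate(rhs, y0=0.8, h=1e-3, steps=2000):
    """Explicit Euler integration of the toy system y' = rhs(y - 0.5) - y."""
    y = y0
    for _ in range(steps):
        y += h * (rhs(y - 0.5) - y)
    return y

y_step = integrate(step)      # trajectory stays away from the switching threshold,
y_smooth = integrate(sigmoid) # so both right-hand sides agree here
```

Away from the threshold the two formulations coincide; the difficulties (sliding modes, non-uniqueness) arise precisely when trajectories hit the switching manifold, which is where the finite state automaton takes over.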

Keywords: hybrid feedback control, ill-posed problems, singular perturbation analysis, step-like nonlinearities

Procedia PDF Downloads 227
522 Performance Analysis and Comparison of Various 1-D and 2-D Prime Codes for OCDMA Systems

Authors: Gurjit Kaur, Shashank Johri, Arpit Mehrotra

Abstract:

In this paper, we analyze and compare the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D coding techniques are unique not only in their time slots but also in their wavelengths. In this research, we evaluate and compare the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an OCDMA system on a single platform. Results show that the 1D Extended Prime Code (EPC) can support more active users than the other codes, but at the expense of a larger code length, which increases the complexity of the code. The Modified Prime Code (MPC) supports fewer active users at λc=2 but has a shorter code length than the 1D prime code. The analysis shows that 2D prime codes support fewer active users than 1D codes, but they have a large code family and are the most secure of the compared codes. The performance of all these codes is analyzed on the basis of the number of active users supported at a bit error rate (BER) of 10⁻⁹.
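The basic 1D prime sequence construction underlying these families is standard: for a prime p, codeword j places one pulse per group, in time slot i·p + (i·j mod p), giving code length p², weight p, and in-phase cross-correlation exactly 1. A sketch of that base construction (the EPC and MPC variants compared in the paper extend this pattern):

```python
def prime_code(p):
    """1-D prime sequence codes over GF(p): codeword j (j = 0..p-1) places one
    pulse per group, in time slot i*p + (i*j mod p), for groups i = 0..p-1.
    Each codeword has length p**2 and weight p."""
    codes = []
    for j in range(p):
        word = [0] * (p * p)
        for i in range(p):
            word[i * p + (i * j) % p] = 1
        codes.append(word)
    return codes

codes = prime_code(5)      # p = 5: a family of 5 codewords of length 25
weight = sum(codes[2])     # every codeword carries exactly p pulses
# In-phase cross-correlation between distinct codewords is exactly 1, because
# i*(j1 - j2) ≡ 0 (mod p) has only the solution i = 0 when p is prime
xcorr = sum(a & b for a, b in zip(codes[1], codes[3]))
```

The small family size (p codewords) is what motivates the extended and 2D wavelength-time variants, which trade code length or wavelength count for more simultaneous users.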

Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa

Procedia PDF Downloads 497