Search results for: mixed methods approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27870

22800 A Comparison of Income and Fuzzy Index of Multidimensional Poverty in Fourteen Sub-Saharan African Countries

Authors: Joseph Siani

Abstract:

Over the last few decades, dissatisfaction with global indicators of economic performance, such as GDP (Gross Domestic Product) per capita, has shifted attention to what is now referred to as multidimensional poverty. In this framework, poverty goes beyond income to incorporate aspects of well-being not captured by income measures alone. This paper applies the totally fuzzy approach to estimate the fuzzy index of poverty (FIP) in fourteen Sub-Saharan African (SSA) countries using Demographic and Health Survey (DHS) data and explores whether the pictures created by the standard headcount ratio at $1.90 a day and the fuzzy index of poverty tell a similar story. The results suggest that there is indeed a considerable mismatch between the poverty headcount and the fuzzy index of multidimensional poverty, meaning that the majority of the most deprived people (as identified by the fuzzy index of multidimensional poverty) would not be identified by the poverty headcount ratio. Moreover, we find that poverty is distributed differently by colonial heritage (language). In particular, the most deprived countries in SSA are French-speaking.
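
For illustration, the sketch below shows how a totally fuzzy deprivation index can be computed from binary deprivation indicators using Cerioli-Zani style frequency-based weights. The indicator names and data are hypothetical, and the code is only a minimal sketch of the general approach, not the authors' exact specification.

```python
import numpy as np

# Hypothetical binary deprivation matrix: rows = households, columns = indicators
# (1 = deprived, 0 = not deprived). Real analyses would use DHS variables.
indicators = ["water", "sanitation", "electricity", "education"]
X = np.array([
    [1, 1, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 1],
], dtype=float)

# Mean deprivation rate per indicator (fuzzy membership of the "deprived" set).
f = X.mean(axis=0)

# Cerioli-Zani style weights: rarer deprivations carry more weight.
w = np.log(1.0 / f)

# Fuzzy index of poverty per household: weighted average of its deprivations.
fip = X @ w / w.sum()

# Population-level fuzzy index: mean of the household scores.
print("household FIP:", np.round(fip, 3))
print("overall FIP:  ", round(fip.mean(), 3))
```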

Keywords: fuzzy set approach, multidimensional poverty, poverty headcount, overlap, Sub-Saharan Africa

Procedia PDF Downloads 205
22799 Investigating Reservoir Sedimentation Control in the Conservation of Water

Authors: Mosupi Ratshaa

Abstract:

Despite years of diligent study, sedimentation is still undoubtedly the most severe technical problem faced by the dam industry. The build-up of sediment and its removal should therefore be the focus of any approach to remedy this. The world's reservoirs lose about 1% of their storage capacity to sedimentation every year, which means that 1% of the water that could be stored is lost worldwide. As populations grow, the need for water also increases, so the loss due to sedimentation is of great concern, especially for the conservation of water. In reservoir sedimentation, water conservation goes hand in hand with soil conservation, since the sediment that occupies the volume meant for water is being lost from dry land. For this reason, reservoir sediment control focuses on reducing the sediment entering the reservoir and on reducing the sediment already within it. Sediment control faces many problems, such as the difficulty of predicting settling patterns and the inability to greatly reduce the sediment volume entering the river flow, which increases the reservoir's trap efficiency, to mention just a few. Notably, reservoirs are habitats for flora and fauna, and the process of removing sediment from them damages this ecosystem, so there is also an ethical point to be considered. This paper reviews the methods used to control reservoir sedimentation and their effects on the ecosystem, with the aim of reducing water losses due to sedimentation. Various control measures that reduce sediment entering the reservoir, such as Sabo or check dams, along with measures that reduce the built-up settled sediment, such as flushing, are reviewed with conservation in mind.

Keywords: sedimentation, conservation, ecosystem, flushing

Procedia PDF Downloads 336
22798 Morphological Characteristics and Pollination Requirement in Red Pitaya (Hylocereus Spp.)

Authors: Dinh Ha Tran, Chung-Ruey Yen

Abstract:

This study explored the morphological characteristics and the effects of pollination methods on fruit set and fruit characteristics in four red pitaya (Hylocereus spp.) clones. Distinctive morphological recognition and classification among the pitaya clones were confirmed by stem, flower, and fruit features. The fruit production season extended from the beginning of May to the end of August or the beginning of September, with 6-7 flowering cycles per year. The floral stage lasted 15-19 days and fruit development took 30-32 days. VN White, which is fully self-compatible, obtained high fruit set rates (80.0-90.5%) in all pollination treatments and the maximum fruit weight in hand self-pollination (402.6 g) and open pollination (403.4 g). Chaozhou 5 was partially self-compatible, while Orejona and F11 were completely self-incompatible. Hand cross-pollination significantly increased fruit set (95.8, 88.4, and 90.2%) and fruit weight (374.2, 281.8, and 416.3 g) in Chaozhou 5, Orejona, and F11, respectively. TSS contents were not much influenced by the pollination methods.

Keywords: Hylocereus spp., morphology, floral phenology, pollination requirement

Procedia PDF Downloads 304
22797 Effect of Bamboo Chips in Cemented Sand Soil on Permeability and Mechanical Properties in Triaxial Compression

Authors: Sito Ismanti, Noriyuki Yasufuku

Abstract:

The use of cement to improve the properties of soil is a well-known method applied in the field. However, its addition in large quantities must be controlled. This study presents the use of a natural and environmentally friendly material, bamboo chips, mixed with a small amount of cement for soil improvement. The absorbability, elongation, and flatness ratio of the bamboo chips were examined to understand the influence of their characteristics in the mixture. Improvement of the dilation behavior, a problem of loose and poorly graded sandy soils, is discussed. Bamboo chips are able to improve the permeability, which affects the dilation behavior of cemented sandy soil. This is demonstrated by the stress paths obtained from triaxial compression tests in the undrained condition. The effects of variations in bamboo chip size and content, as well as curing time, are presented and discussed.

Keywords: bamboo chips, permeability, mechanical properties, triaxial compression

Procedia PDF Downloads 333
22796 Preference Aggregation and Mechanism Design in the Smart Grid

Authors: Zaid Jamal Saeed Almahmoud

Abstract:

Smart Grid is the vision of the future power system that combines advanced monitoring and communication technologies to provide energy in a smart, efficient, and user-friendly manner. This proposal considers a demand response model in the Smart Grid based on utility maximization. Given a set of consumers with conflicting consumption preferences and a utility company that aims to minimize the peak demand and match demand to supply, we study the problem of aggregating these preferences while modelling the problem as a game. We also investigate whether an equilibrium can be reached to maximize the social benefit. Based on such an equilibrium, we propose a dynamic pricing heuristic that computes the equilibrium and sets the prices accordingly. The developed approach was analysed theoretically and evaluated experimentally using real appliance data. The results show that our proposed approach achieves a substantial reduction in the overall energy consumption.
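
As a rough illustration of how such a dynamic pricing heuristic can operate, the sketch below iterates between a utility that prices each time slot in proportion to aggregate demand and consumers that shift their flexible load to the cheapest slot. The demands, price rule, and convergence criterion are hypothetical simplifications rather than the mechanism studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_consumers, n_slots = 5, 6

# Hypothetical flexible energy per consumer (kWh) that can be placed in any slot.
flexible = rng.uniform(1.0, 3.0, size=n_consumers)

# Initial schedule: every consumer uses slot 0, creating a worst-case peak.
schedule = np.zeros((n_consumers, n_slots))
schedule[:, 0] = flexible
print("initial peak demand:", schedule.sum(axis=0).max().round(2))

# Sequential best-response dynamics under a price proportional to slot demand.
for it in range(100):
    changed = False
    for i in range(n_consumers):
        demand_others = schedule.sum(axis=0) - schedule[i]
        best = int(np.argmin(1.0 + demand_others))   # cheapest slot given the others
        new_row = np.zeros(n_slots)
        new_row[best] = flexible[i]
        if not np.array_equal(new_row, schedule[i]):
            schedule[i] = new_row
            changed = True
    if not changed:
        break   # equilibrium: no consumer wants to deviate

print("iterations:", it + 1, "final peak demand:", schedule.sum(axis=0).max().round(2))
```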

Keywords: heuristics, smart grid, aggregation, mechanism design, equilibrium

Procedia PDF Downloads 114
22795 Assessment of Fluid Flow Hydrodynamics for Cylindrical and Conical Fluidized Bed Reactor

Authors: N. G. Thangan, A. B. Deoghare, P. M. Padole

Abstract:

Computational Fluid Dynamics (CFD) aids in modeling prototypes of real-world processes. The CFD approach is useful for predicting fluid flow, heat transfer, mass transfer, and other flow-related phenomena. In the present study, the hydrodynamic characteristics of a gas-solid cylindrical fluidized bed are compared with those of conical fluidized beds. A 2D fluidized bed is modeled with different configurations of iron oxide particle size, bed height, and superficial nitrogen velocity. Simulations are performed to capture the complex physics associated with it. An Eulerian multiphase model is prepared in ANSYS FLUENT v.14 to simulate the fluidization process, with nitrogen as the primary phase and iron oxide as the secondary phase. The bed hydrodynamics are assessed to examine the effects on fluidization time, pressure drop, minimum fluidization velocity, and gas holdup in the system.
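
As a quick back-of-the-envelope companion to such simulations, the minimum fluidization velocity can be estimated from the standard Wen-Yu correlation. The sketch below uses hypothetical particle and gas properties (fine iron oxide in nitrogen at ambient conditions); it is only an order-of-magnitude check, not part of the authors' CFD workflow.

```python
import math

# Hypothetical properties: iron oxide particles fluidized by nitrogen at ~25 C.
d_p   = 200e-6    # particle diameter, m
rho_p = 5200.0    # particle density, kg/m^3
rho_g = 1.16      # nitrogen density, kg/m^3
mu_g  = 1.76e-5   # nitrogen dynamic viscosity, Pa.s
g     = 9.81      # gravitational acceleration, m/s^2

# Archimedes number.
Ar = rho_g * (rho_p - rho_g) * g * d_p**3 / mu_g**2

# Wen-Yu correlation for the Reynolds number at minimum fluidization.
Re_mf = math.sqrt(33.7**2 + 0.0408 * Ar) - 33.7

# Minimum fluidization velocity.
U_mf = Re_mf * mu_g / (rho_g * d_p)
print(f"Ar = {Ar:.1f}, Re_mf = {Re_mf:.3f}, U_mf = {U_mf*100:.2f} cm/s")
```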

Keywords: fluidized bed, bed hydrodynamics, Eulerian multiphase approach, computational fluid dynamics

Procedia PDF Downloads 452
22794 The Mechanical Properties of a Small-Size Seismic Isolation Rubber Bearing for Bridges

Authors: Yi F. Wu, Ai Q. Li, Hao Wang

Abstract:

Taking a novel type of bridge bearing with a diameter of 100 mm as an example, theoretical analysis, experimental research, and numerical simulation of the bearing were conducted. Since normal compression-shear machines cannot be applied to the small-size bearing, an improved device to test the properties of the bearing was proposed and fabricated. In addition, the simulation of the bearing was conducted with the explicit finite element software ANSYS/LS-DYNA, and some parameters of the bearing were modified in the finite element model to effectively reduce the computation cost. Results show that all the research methods are capable of revealing the fundamental properties of small-size bearings, and a combined use of these methods can better capture both the overall properties and the detailed internal mechanical behavior of the bearing.

Keywords: ANSYS/LS-DYNA, compression shear, contact analysis, explicit algorithm, small-size

Procedia PDF Downloads 181
22793 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals

Authors: Bahareh Ansari

Abstract:

Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual’s cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users’ performance and experience in working with a visualization tool. This list can be used in evaluating OGD visualization practices and informing future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications from top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results are complemented with a search in the references of the identified articles, and with a search for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence from controlled user experiments, or provide a review of such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are found to positively affect the visualization outcomes. Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently find positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of positive effects on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, appear less frequently in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and, ultimately, take part in government policy-making procedures.

Keywords: best practices, data visualization, literature review, open government data

Procedia PDF Downloads 107
22792 Machine Learning Facing the Behavioral Noise Problem in Imbalanced Data Using One Side Behavioral Noise Reduction: Application to Fraud Detection

Authors: Salma El Hajjami, Jamal Malki, Alain Bouju, Mohammed Berrada

Abstract:

With the expansion of machine learning and data mining in the context of Big Data analytics, a common problem that affects data is class imbalance. It refers to an imbalanced distribution of instances among the classes. This problem is present in many real-world applications such as fraud detection, network intrusion detection, medical diagnostics, etc. In these cases, data instances labeled negatively are significantly more numerous than the instances labeled positively. When this difference is too large, the learning system may face difficulties, since it is initially designed to work in relatively balanced class distribution scenarios. Another important problem, which usually accompanies such imbalanced data, is the overlap of instances between the two classes, commonly referred to as noise or overlapping data. In this article, we propose an approach called One Side Behavioral Noise Reduction (OSBNR). This approach presents a way to deal with the problem of class imbalance in the presence of a high noise level. OSBNR is based on two steps. First, a cluster analysis is applied to group similar instances from the minority class into several behavior clusters. Second, we select and eliminate the instances of the majority class, considered as behavioral noise, that overlap with the behavior clusters of the minority class. The results of experiments carried out on a representative public dataset confirm that the proposed approach is efficient for the treatment of class imbalance in the presence of noise.
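
A rough sketch of this overlap-removal idea is shown below: minority instances are clustered, and majority instances that fall inside the radius of any minority cluster are dropped before training. The clustering method, radius rule, and synthetic data are simplifications for illustration and are not the exact OSBNR procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification

# Synthetic imbalanced, overlapping two-class data (stand-in for fraud data).
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.95, 0.05],
                           flip_y=0.05, class_sep=0.8, random_state=42)
X_min, X_maj = X[y == 1], X[y == 0]

# Step 1: group similar minority instances into behavior clusters.
km = KMeans(n_clusters=5, n_init=10, random_state=42).fit(X_min)
centers = km.cluster_centers_

# Radius of each cluster = max distance from its center to its own members.
radii = np.array([
    np.linalg.norm(X_min[km.labels_ == k] - centers[k], axis=1).max()
    for k in range(len(centers))
])

# Step 2: drop majority instances lying inside any minority cluster (overlap noise).
d = np.linalg.norm(X_maj[:, None, :] - centers[None, :, :], axis=2)
keep = (d > radii).all(axis=1)
X_maj_clean = X_maj[keep]

print(f"majority kept: {len(X_maj_clean)} / {len(X_maj)} (overlap noise removed)")
X_clean = np.vstack([X_maj_clean, X_min])
y_clean = np.hstack([np.zeros(len(X_maj_clean)), np.ones(len(X_min))])
```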

Keywords: machine learning, imbalanced data, data mining, big data

Procedia PDF Downloads 131
22791 Mining Multicity Urban Data for Sustainable Population Relocation

Authors: Xu Du, Aparna S. Varde

Abstract:

In this research, we propose to conduct diagnostic and predictive analysis of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends as land use change patterns from a variety of data sources. The results are treated as part of urban big data together with other information such as population change and economic conditions. Multiple data mining methods are deployed on this data to analyze nonlinear relationships between parameters. The result determines the driving forces of population relocation with respect to urban sprawl and urban sustainability and their related parameters. Experiments so far reveal that the data mining methods discover useful knowledge from the multicity urban data. This work sets the stage for developing a comprehensive urban simulation model that caters to specific questions from targeted users. It contributes towards achieving sustainability as a whole.

Keywords: data mining, environmental modeling, sustainability, urban planning

Procedia PDF Downloads 308
22790 Effect of Coal on Engineering Properties in Building Materials: Opportunity to Manufacturing Insulating Bricks

Authors: Bachir Chemani, Halima Chemani

Abstract:

The objective of this study is to investigate the effect of adding coal in order to obtain an insulating ceramic product. Mixtures are prepared with four different mass compositions consisting of gray clay, yellow clay, and coal. Analyses are performed on local raw materials with coal added as an additive. The coal content varies from 5 to 20% by weight, with coal particle sizes ranging from 0.25 mm to 1.60 mm. Initially, the natural moisture content of each raw material was determined at 105°C in a laboratory oven. The influence of low coal content on water absorption, apparent density, contraction, and compressive strength was evaluated. The experimental results showed that an optimized composition could be obtained by adding 10% coal by weight, leading to insulating ceramic products with a water absorption of 9.40%, a density of 1.88 g/cm3, and a compressive strength of 35.46 MPa. The results show that coal, when mixed with traditional raw materials, offers the conditions for use as an additive in the production of lightweight ceramic products.

Keywords: clay, coal, resistance to compression, insulating bricks

Procedia PDF Downloads 329
22789 The Problems of Women over 65 with Incontinence Diagnosis: A Case Study in Turkey

Authors: Birsel Canan Demirbag, Kıymet Yesilcicek Calik, Hacer Kobya Bulut

Abstract:

Objective: This study was conducted to evaluate the problems of women over 65 with an incontinence diagnosis. Methods: This descriptive study was conducted with women over 65 with an incontinence diagnosis in four Family Health Centers in a city in the Eastern Black Sea region between November 1 and December 20, 2015. A total of 203, 107, 178, and 180 women over 65 were registered in these centers, respectively; 262 had been diagnosed with incontinence at least once and had an ongoing complaint, and 177 women volunteered for the study. During home visits, using a face-to-face survey methodology, participants were given a socio-demographic characteristics survey, the Sandvik severity scale, the Incontinence Quality of Life Scale, the Urogenital Distress Inventory, and a researcher-developed questionnaire on the challenges experienced due to incontinence. Data were analyzed with the SPSS program using percentages, numbers, Chi-square, Mann-Whitney U, and t-tests, with a 95% confidence interval and a significance level of p < 0.05. Findings: The mean age was 67 ± 1.4 years, mean parity was 2.05 ± 0.04, and mean age at menopause was 44.5 ± 2.12 years; 66.3% were primary school graduates, 45.7% had a deceased spouse, 44.4% lived in a large family, 67.2% had their own room, 77.8% had an income, and 89.2% could manage their own self-care. Of the participants, 73.2% had a diagnosis of mixed incontinence, 87.5% had suffered for 6-20 years, 78.2% used diuretics, antidepressants, or heart medicines, 20.5% had both urinary and fecal incontinence, 80.5% had received bladder training at least once, 90.1% had no bladder diary/calendar or control training program, 31.1% had undergone hysterectomy for prolapse, 97.1% had been treated for a lower urinary tract infection at least once, 66.3% had seen a doctor to obtain medication in the last three months, 76.2% could not go out alone, 99.2% had at least one chronic disease, 87.6% complained of constipation, 2.9% had a chronic cough, and 45.1% had fallen because of rushing to the toilet. The mean Incontinence Quality of Life score was 54.3 ± 21.1, the Sandvik severity score was 12.1 ± 2.5, and the Urogenital Distress Inventory score was 47.7 ± 9.2. Difficulties experienced due to incontinence included feelings of unhappiness (99.5%), a constant feeling of urine smell due to being unable to change briefs frequently (67.1%), withdrawal from social life (87.2%), being unable to use pads (89.7%), the feeling of disturbing household members or other individuals (99.2%), dizziness or falls due to rising suddenly (87.5%), the feeling that others did not understand the situation (87.4%), insomnia (94.3%), lack of assistance (78.2%), and being unable to afford protective briefs (84.7%). Results: This study found many unresolved issues at the individual and community levels that affect the quality of life of women with incontinence. Given how common this problem is among women, regular institutional home-care training programs in our country would clearly help to ease their daily life.

Keywords: health problems, incontinence, incontinence quality of life questionnaire, old age, urinary urogenital distress inventory, Sandviken severity, women

Procedia PDF Downloads 321
22788 Innovative Technologies: Functional Methods of Dental Research

Authors: Sergey N. Ermoliev, Margarita A. Belousova, Aida D. Goncharenko

Abstract:

Application of a diagnostic complex of highly informative functional methods (electromyography, reodentography, laser Doppler flowmetry, reoperiodontography, vital computer capillaroscopy, optical tissue oximetry, laser fluorescence diagnosis) makes it possible to perform a multifactorial analysis of the dental status and to prescribe complex etiopathogenetic treatment. Introduction: It is necessary to create a complex of innovative, highly informative, and safe functional diagnostic methods to improve the quality of patient treatment through the early detection of stomatologic diseases. The purpose of the present study was to investigate the etiology and pathogenesis of functional disorders identified in the pathology of hard tissue, dental pulp, periodontal tissues, oral mucosa, and chewing function, and to create new approaches to the diagnosis of dental diseases. Material and methods: 172 patients were examined. The density of the hard tissues of the teeth and jaw bone was studied by intraoral ultrasonic densitometry (USD). The electromyographic activity of the masticatory muscles was assessed by electromyography (EMG). The functional state of the dental pulp vessels was assessed by reodentography (RDG) and laser Doppler flowmetry (LDF). Regional blood flow in the periodontal tissues was studied by reoperiodontography (RPG). The periodontal microcirculation was studied by vital computer capillaroscopy (VCC) and laser Doppler flowmetry (LDF). The metabolic level of the mucous membrane was determined by optical tissue oximetry (OTO) and laser fluorescence diagnosis (LFD). Results and discussion: The results obtained revealed changes in the mineral density of the hard tissues of the teeth and jaw bone, the bioelectric activity of the masticatory muscles, and the regional blood flow and microcirculation in the dental pulp and periodontal tissues. The LDF and OTO methods estimated fluctuations in the saturation level and oxygen transport in the microvasculature of the periodontal tissues. LFD identified changes in the concentrations of enzymes (nicotinamide, flavins, lipofuscin, porphyrins) involved in metabolic processes. Conclusion: Our preliminary results confirmed the feasibility and safety of the intraoral ultrasound densitometry technique for assessing the density of periodontal bone tissue. Application of the diagnostic complex of the above-mentioned highly informative functional methods makes it possible to perform a multifactorial analysis of the dental status and to prescribe complex etiopathogenetic treatment.

Keywords: electromyography (EMG), reodentography (RDG), laser Doppler flowmetry (LDF), reoperiodontography method (RPG), vital computer capillaroscopy (VCC), optical tissue oximetry (OTO), laser fluorescence diagnosis (LFD)

Procedia PDF Downloads 280
22787 The Two Question Challenge: Embedding the Serious Illness Conversation in Acute Care Workflows

Authors: D. M. Lewis, L. Frisby, U. Stead

Abstract:

Objective: Many patients receive invasive treatments in acute care or die in hospital without having had comprehensive goals of care conversations. Some of these treatments may not align with the patient’s wishes, may be futile, and may cause unnecessary suffering. While many staff may recognize the benefits of engaging patients and families in Serious Illness Conversations (a goals of care framework developed by Ariadne Labs in Boston), few staff feel confident and/or competent in having these conversations in acute care. Another barrier may be that these conversations are not incorporated into the current workflow. An educational exercise, titled the Two Question Challenge, was initiated on four medical units across two Vancouver Coastal Health (VCH) hospitals in an attempt to engage the entire interdisciplinary team in asking patients and families questions about goals of care and to improve the documentation of these expressed wishes and preferences. Methods: Four acute care units across two separate hospitals participated in the Two Question Challenge. On each unit, over the course of two eight-hour shifts, all members of the interdisciplinary team were asked to select at least two questions from a selection of nine goals of care questions. They were asked to pose these questions to a patient or family member during their shift and then to document the conversations in a centralized Advance Care Planning/Goals of Care discussion record in the patient’s chart. A visual representation of conversation outcomes was created to demonstrate to staff and patients the breadth of conversations that took place throughout the challenge. Staff and patients were interviewed about their experiences throughout the challenge. Two palliative approach leads remained present on the units throughout the challenge to support, guide, or role-model these conversations. Results: Across the four acute care medical units, 47 interdisciplinary staff participated in the Two Question Challenge, including nursing, allied health, and a physician. A total of 88 questions about goals of care were asked of patients or their families, and 50 newly documented goals of care conversations were charted. Two code statuses were changed as a result of the conversations. Patients voiced an appreciation for these conversations, and staff were able to successfully incorporate the questions into their daily care. Conclusion: The Two Question Challenge proved to be an effective way of having teams explore the goals of care of patients and families in an acute care setting. Staff felt that they gained confidence and competence. Both staff and patients found these conversations to be meaningful and impactful and felt they were notably different from their usual interactions. Documentation of these conversations in a centralized location that is easily accessible to all care providers increased significantly. Application of the Two Question Challenge in non-medical units or other care settings, such as long-term care facilities or community health units, should be explored in the future.

Keywords: advance care planning, goals of care, interdisciplinary, palliative approach, serious illness conversations

Procedia PDF Downloads 101
22786 Methodology: A Review of Modelling and Predictability of Embankments on Soft Ground

Authors: Bhim Kumar Dahal

Abstract:

Transportation network development in developing countries is proceeding at a rapid pace. The majority of these networks consist of railways and expressways, which pass through diverse topography, landforms, and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses mainly on the various advanced ground improvement techniques used to improve soft soil, the modelling approaches, and their predictability for embankment construction. The ground improvement techniques can be broadly classified into three groups, i.e., the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with some case studies. Various methods have been used in modelling the embankments, from simple one-dimensional to complex three-dimensional models, using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and sometimes the predictions deviate by more than 60% from the monitored values despite using the same level of sophistication. This deviation is found to be mainly due to the selection of the constitutive model, assumptions made during different stages, deviations in the selection of model parameters, and simplifications in the physical modelling of the ground conditions. The deviation can be reduced by using optimization processes, optimization tools, and sensitivity analysis of the model parameters, which guide the selection of appropriate model parameters.

Keywords: cement, improvement, physical properties, strength

Procedia PDF Downloads 174
22785 Lean Implementation: Manufacturing vs. Construction, a Roadmap for Success

Authors: Patrick Ahern, David Collery

Abstract:

The implementation of lean thinking in the manufacturing industry revolutionized the traditional approach to large-scale production through the process of identifying the waste in each task and putting in place mitigation measures to eliminate the waste in all its forms. The Irish construction industry, however, has been much slower to adopt the principles of lean, opting instead to stick with the traditional approach to construction project delivery which is inherently wasteful. Lean thinking holds the potential to revolutionize the construction industry in a similar manner to the adoption of lean manufacturing. Lean principles present opportunities for reduced project duration, reduced project cost, improved quality, and elimination of re-works and non-value-added activities. The following research has been designed to accumulate research data through available literature, electronic surveys, and interviews. The results show an industry reluctant to accept change and an undefined path to successful lean construction implementation.

Keywords: barriers, lean construction, lean implementation, lean manufacturing, lean philosophy

Procedia PDF Downloads 73
22784 Forced Vibration of a Planar Curved Beam on Pasternak Foundation

Authors: Akif Kutlu, Merve Ermis, Nihal Eratlı, Mehmet H. Omurtag

Abstract:

The objective of this study is to investigate the forced vibration of a planar curved beam resting on an elastic foundation by using the mixed finite element method. The finite element formulation is based on the Timoshenko beam theory. In order to solve the problem in the frequency domain, the element matrices of the two-noded curvilinear elements are transformed into the Laplace domain. The results are transformed back to the time domain by the well-known modified Durbin numerical inverse transformation algorithm. First, the presented finite element formulation is verified through the forced vibration analysis of a planar curved Timoshenko beam resting on a Winkler foundation, and the finite element results are compared with results available in the literature. Then, the forced vibration analysis of a planar curved beam resting on a Winkler-Pasternak foundation is conducted.
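
To give a feel for the Laplace-domain workflow, the sketch below inverts a simple damped-oscillator transfer function back to the time domain numerically and compares it with the exact solution. It uses mpmath's Talbot-type inversion as a readily available stand-in for the modified Durbin algorithm; the transfer function and parameters are illustrative only.

```python
import math
import mpmath as mp

# Laplace-domain response of a damped oscillator (illustrative stand-in for a
# frequency-domain FEM solution): F(s) = 1 / ((s + c)^2 + w^2).
c, w = 0.5, 2.0
F = lambda s: 1.0 / ((s + c)**2 + w**2)

# Exact time-domain solution for comparison: f(t) = exp(-c t) * sin(w t) / w.
exact = lambda t: math.exp(-c * t) * math.sin(w * t) / w

# Numerical inversion back to the time domain (Talbot method in place of Durbin).
for t in [0.5, 1.0, 2.0, 4.0]:
    approx = mp.invertlaplace(F, t, method='talbot')
    print(f"t = {t:3.1f}  numerical = {float(approx): .6f}  exact = {exact(t): .6f}")
```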

Keywords: curved beam, dynamic analysis, elastic foundation, finite element method

Procedia PDF Downloads 345
22783 Multiscale Modelling of Textile Reinforced Concrete: A Literature Review

Authors: Anicet Dansou

Abstract:

Textile reinforced concrete (TRC) is increasingly used nowadays in various fields, in particular civil engineering, where it is mainly used for the reinforcement of damaged reinforced concrete structures. TRC is a composite material composed of multi- or uni-axial textile reinforcements coupled with a fine-grained cementitious matrix. The TRC composite is an alternative solution to the traditional Fiber Reinforced Polymer (FRP) composite. It has good mechanical performance and better temperature stability, and it also makes it possible to better meet the criteria of sustainable development. TRCs are highly anisotropic composite materials with nonlinear hardening behavior; their macroscopic behavior depends on multi-scale mechanisms. The characterization of these materials through numerical simulation has been the subject of many studies. Since TRCs are multiscale materials by definition, numerical multi-scale approaches have emerged as one of the most suitable methods for their simulation. They aim to incorporate information pertaining to microscale constituent behavior, mesoscale behavior, and macro-scale structural response within a unified model that enables rapid simulation of structures. The computational costs are hence significantly reduced compared to standard simulation at a fine scale. The fine-scale information can be introduced implicitly in the macro-scale model: approaches of this type are called non-classical. A representative volume element is defined, and the fine-scale information is homogenized over it. Analytical and computational homogenization and nested mesh methods belong to these approaches. On the other hand, in classical approaches, the fine-scale information is introduced explicitly in the macro-scale model. Such approaches include adaptive mesh refinement strategies, sub-modelling, domain decomposition, and multigrid methods. This research presents the main principles of numerical multiscale approaches. Advantages and limitations are identified according to several criteria: the assumptions made (fidelity), the number of input parameters required, the calculation costs (efficiency), etc. A bibliographic study of recent results and advances, and of the scientific obstacles to be overcome in order to achieve an effective simulation of textile reinforced concrete in civil engineering, is presented. A comparative study is further carried out between several methods for the simulation of TRCs used for the structural reinforcement of reinforced concrete structures.
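
As a very small illustration of the analytical homogenization idea mentioned above, the sketch below computes Voigt and Reuss rule-of-mixtures bounds for the effective stiffness of a textile/matrix representative volume element. The fiber and matrix moduli and the volume fraction are hypothetical values chosen only for illustration.

```python
# Rule-of-mixtures bounds for the effective Young's modulus of an RVE.
# Hypothetical properties: AR-glass textile yarns in a fine-grained mortar matrix.
E_f, E_m = 72.0e9, 30.0e9   # Young's moduli, Pa
V_f = 0.03                  # textile volume fraction
V_m = 1.0 - V_f

# Voigt (iso-strain) upper bound: loading parallel to the yarns.
E_voigt = V_f * E_f + V_m * E_m

# Reuss (iso-stress) lower bound: loading transverse to the yarns.
E_reuss = 1.0 / (V_f / E_f + V_m / E_m)

print(f"Voigt bound: {E_voigt/1e9:.2f} GPa, Reuss bound: {E_reuss/1e9:.2f} GPa")
```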

Keywords: composites structures, multiscale methods, numerical modeling, textile reinforced concrete

Procedia PDF Downloads 109
22782 An A-Star Approach for the Quickest Path Problem with Time Windows

Authors: Christofas Stergianos, Jason Atkin, Herve Morvan

Abstract:

As air traffic increases, more airports are interested in utilizing optimization methods. Many processes happen in parallel at an airport, and complex models are needed in order to have a reliable solution that can be implemented for ground movement operations. The ground movement of aircraft at an airport, allocating a path for each aircraft to follow in order to reach its destination (e.g., runway or gate), is one process that could be optimized. The Quickest Path Problem with Time Windows (QPPTW) algorithm has been developed to provide conflict-free routing of vehicles and has been applied to routing aircraft around an airport. It was subsequently modified to increase its accuracy for airport applications. These modifications take into consideration specific characteristics of the problem, such as: the pushback process, which accounts for the extra time that is needed for pushing back an aircraft and turning its engines on; stand holding, where any waiting should be allocated to the stand; and runway sequencing, where the sequence of the aircraft that take off is optimized and has to be respected. QPPTW involves searching for the quickest path by expanding the search in all directions, similarly to Dijkstra’s algorithm. Finding a way to direct the expansion can potentially assist the search and achieve better performance. We have further modified the QPPTW algorithm to use a heuristic approach in order to guide the search. This new algorithm is based on the A-star search method but estimates the remaining time (instead of distance) in order to assess how far the target is. It is important to consider the remaining time that is needed to reach the target, so that delays caused by other aircraft can be part of the optimization method. All of the other characteristics are still considered, and time windows are still used in order to route multiple aircraft rather than a single aircraft. In this way, the quickest path is found for each aircraft while taking into account the movements of the previously routed aircraft. After running experiments using a week of real aircraft data from Zurich Airport, the new algorithm (A-star QPPTW) was found to route aircraft much more quickly, being especially fast in routing the departing aircraft, where pushback delays are significant. On average, A-star QPPTW could route a full day (755 to 837 aircraft movements) 56% faster than the original algorithm. In total, the routing of a full week of aircraft took only 12 seconds with the new algorithm, 15 seconds faster than the original algorithm. For real-time application, the algorithm needs to be very fast, and this speed increase will allow us to add additional features and complexity, allowing further integration with other processes at airports and leading to more optimized and environmentally friendly airports.
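
The sketch below shows the core idea of a time-based A-star search on a small taxiway-like graph: edge costs are traversal times, and the heuristic is a lower bound on the remaining time (straight-line distance divided by maximum taxi speed). The graph, speeds, and coordinates are hypothetical, and the per-edge time-window bookkeeping of the full QPPTW algorithm is omitted for brevity.

```python
import heapq
import math

# Hypothetical taxiway graph: node -> (x, y) in metres.
pos = {"gate": (0, 0), "A": (300, 0), "B": (300, 400), "C": (700, 100), "rwy": (900, 400)}
# Edges: node -> list of (neighbour, traversal time in seconds).
edges = {
    "gate": [("A", 40)],
    "A": [("B", 55), ("C", 60)],
    "B": [("rwy", 80)],
    "C": [("rwy", 50)],
    "rwy": [],
}
V_MAX = 10.0  # maximum taxi speed, m/s


def h(node, goal):
    """Admissible heuristic: remaining straight-line distance / max speed."""
    (x1, y1), (x2, y2) = pos[node], pos[goal]
    return math.hypot(x2 - x1, y2 - y1) / V_MAX


def astar_time(start, goal):
    frontier = [(h(start, goal), 0.0, start, [start])]  # (f, g, node, path)
    best_g = {start: 0.0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        for nxt, dt in edges[node]:
            g2 = g + dt
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h(nxt, goal), g2, nxt, path + [nxt]))
    return None


print(astar_time("gate", "rwy"))  # e.g. (150.0, ['gate', 'A', 'C', 'rwy'])
```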

Keywords: a-star search, airport operations, ground movement optimization, routing and scheduling

Procedia PDF Downloads 231
22781 Analyzing Energy Consumption Behavior of Migrated Population in Turkey Using Bayesian Belief Approach

Authors: Ebru Acuner, Gulgun Kayakutlu, M. Ozgur Kayalica, Sermin Onaygil

Abstract:

In Turkey, immigration, especially from Syria, has been continuously increasing together with rapid urbanization. In parallel to this, total energy consumption has been growing rapidly. Unfortunately, domestic energy sources cannot meet this energy demand. Hence, there is a need for reliable predictions. For this reason, before conducting a survey of the migrated population, an informative questionnaire was prepared to gather the opinions of experts on the main drivers that shape the energy consumption behavior of the migrated people. In total, 17 experts responded, and their answers were analyzed with the Netica program using the Bayesian belief network method. In the analysis, the factors affecting energy consumption behaviors, as well as the strategies, institutions, tools, and financing methods to change these behaviors towards efficient consumption, were investigated. On the basis of the results, it can be concluded that changing the energy consumption behavior of the migrated people is crucial. In order to be successful, electricity and natural gas prices and tariffs in the market should be arranged considering energy efficiency. In addition, support mechanisms by not only the government but also municipalities should be taken into account while preparing related policies. Also, electric appliance producers should develop and implement strategies and actions in favor of the use of more efficient appliances. Last but not least, non-governmental organizations should support the migrated people to improve their awareness of efficient consumption for a sustainable future.
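
For readers unfamiliar with Bayesian belief networks, the sketch below builds a toy network of the kind that could encode expert judgments about energy-saving behavior and queries it. It assumes the open-source pgmpy library rather than Netica, and the variables, structure, and probabilities are entirely hypothetical.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy structure: income level and awareness both influence efficient behavior.
model = BayesianNetwork([("Income", "EfficientBehavior"),
                         ("Awareness", "EfficientBehavior")])

cpd_income = TabularCPD("Income", 2, [[0.6], [0.4]])          # 0 = low, 1 = high
cpd_aware = TabularCPD("Awareness", 2, [[0.7], [0.3]])        # 0 = low, 1 = high
cpd_behav = TabularCPD(
    "EfficientBehavior", 2,
    # P(behavior | income, awareness); columns: (I=0,A=0), (I=0,A=1), (I=1,A=0), (I=1,A=1)
    [[0.9, 0.5, 0.7, 0.2],   # 0 = not efficient
     [0.1, 0.5, 0.3, 0.8]],  # 1 = efficient
    evidence=["Income", "Awareness"], evidence_card=[2, 2],
)
model.add_cpds(cpd_income, cpd_aware, cpd_behav)
assert model.check_model()

# Query: how likely is efficient behavior if an awareness campaign succeeds?
infer = VariableElimination(model)
print(infer.query(["EfficientBehavior"], evidence={"Awareness": 1}))
```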

Keywords: Bayesian belief, behavior, energy consumption, energy efficiency, migrated people

Procedia PDF Downloads 111
22780 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios using Monte Carlo Methods

Authors: Vinayak Bassi, Rajpreet Singh

Abstract:

Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational prowess. Asset valuation has been one of the key components of quantitative finance. In fact, it has become one of the embryonic steps in determining the risk related to a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation outputs generated by two stochastic dynamic models, namely the Black-Scholes model and Dupire's bi-dimensional model. Both of these models are formulated for computing the valuation function for a portfolio of European options using Monte Carlo simulation methods. Although Monte Carlo algorithms have a slower convergence rate than calculus-based simulation techniques (like FDM), they work quite effectively for high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. In order to enhance the performance efficiency of the model, the study emphasized the use of variance reduction methods and customized random number generators to implement parallelization. An attempt has been made to further implement Dupire's model on a GPU to achieve higher computational performance. Furthermore, ideas have been discussed around performance enhancement and bottleneck identification related to the implementation of option-pricing models on GPUs.
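
As a concrete illustration of the Monte Carlo pricing step, the sketch below values a single European call under Black-Scholes dynamics with antithetic variates as a simple variance reduction technique, and compares the estimate with the closed-form price. The market parameters are hypothetical, and the sketch does not reproduce the paper's portfolio setup, Dupire local-volatility model, or GPU parallelization.

```python
import math
import numpy as np

# Hypothetical market/contract parameters.
S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0
n_paths = 200_000

rng = np.random.default_rng(7)
z = rng.standard_normal(n_paths)
z = np.concatenate([z, -z])                      # antithetic variates

# Terminal price under risk-neutral GBM and discounted call payoff.
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)

# Estimate and standard error computed over the antithetic pair averages.
pair_avg = 0.5 * (payoff[:n_paths] + payoff[n_paths:])
mc_price = pair_avg.mean()
mc_stderr = pair_avg.std(ddof=1) / math.sqrt(n_paths)

# Closed-form Black-Scholes price for comparison.
N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)
bs_price = S0 * N(d1) - K * math.exp(-r * T) * N(d2)

print(f"Monte Carlo: {mc_price:.4f} +/- {mc_stderr:.4f}   Black-Scholes: {bs_price:.4f}")
```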

Keywords: Monte Carlo, stochastic models, computational finance, parallel programming, scientific computing

Procedia PDF Downloads 162
22779 Evaluation of Quasi-Newton Strategy for Algorithmic Acceleration

Authors: T. Martini, J. M. Martínez

Abstract:

An algorithmic acceleration strategy based on quasi-Newton (or secant) methods is presented to address the practical problem of accelerating the convergence of the Newton-Lagrange method in the case of convergence to critical multipliers. Since the Newton-Lagrange iteration converges locally at a linear rate, it is natural to conjecture that quasi-Newton methods, based on the so-called secant equation and some minimal variation principle, could converge superlinearly, thus restoring the convergence properties of Newton's method. This strategy can also be applied to accelerate the convergence of algorithms applied to fixed-point problems. Computational experience is reported illustrating the efficiency of this strategy in solving fixed-point problems with a linear convergence rate.
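
To illustrate the flavor of secant-type acceleration of a linearly convergent fixed-point iteration, the sketch below compares plain iteration with Steffensen's method (a simple secant-based scheme) on the toy problem x = cos(x). This is only a generic illustration of the idea, not the quasi-Newton strategy evaluated in the paper.

```python
import math

g = math.cos                    # fixed-point map; the solution of x = cos(x) is ~0.739085
x_star = 0.7390851332151607

def plain(x, n):
    for _ in range(n):
        x = g(x)
    return x

def steffensen(x, n):
    # Secant-style update built from two successive fixed-point evaluations.
    for _ in range(n):
        gx, ggx = g(x), g(g(x))
        denom = ggx - 2.0 * gx + x
        if denom == 0.0:
            return gx
        x = x - (gx - x) ** 2 / denom
    return x

for n in (2, 4, 6):
    print(f"n={n}: plain error {abs(plain(1.0, n) - x_star):.2e}, "
          f"Steffensen error {abs(steffensen(1.0, n) - x_star):.2e}")
```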

Keywords: algorithmic acceleration, fixed-point problems, nonlinear programming, quasi-newton method

Procedia PDF Downloads 489
22778 Electroencephalogram Based Alzheimer Disease Classification using Machine and Deep Learning Methods

Authors: Carlos Roncero-Parra, Alfonso Parreño-Torres, Jorge Mateo Sotos, Alejandro L. Borja

Abstract:

In this research, different methods based on machine/deep learning algorithms are presented for the classification and diagnosis of patients with mental disorders such as Alzheimer's disease. For this purpose, the signals obtained from 32 unipolar electrodes in non-invasive EEG recordings were examined, and their basic properties were obtained. More specifically, different well-known machine learning based classifiers have been used, i.e., support vector machine (SVM), Bayesian linear discriminant analysis (BLDA), decision tree (DT), Gaussian Naïve Bayes (GNB), K-nearest neighbor (KNN), and Convolutional Neural Network (CNN). A total of 668 patients from five different hospitals were studied over the period from 2011 to 2021. The best accuracy obtained was around 93% in both ADM and ADA classifications. It can be concluded that such a classification will enable the training of algorithms that can be used to identify and classify different mental disorders with high accuracy.
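
A minimal sketch of one such classifier is shown below: an SVM with feature scaling evaluated by cross-validation. The feature matrix here is random and stands in for per-channel EEG features (e.g., band powers from the 32 electrodes); it is only meant to show the pipeline shape, not to reproduce the paper's results.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_features = 668, 32 * 4       # e.g. 4 band-power features per electrode

# Placeholder feature matrix and labels (0 = control, 1 = Alzheimer's).
X = rng.standard_normal((n_patients, n_features))
y = rng.integers(0, 2, size=n_patients)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```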

Keywords: Alzheimer's disease, machine learning, deep learning, EEG

Procedia PDF Downloads 127
22777 Safety Validation of Black-Box Autonomous Systems: A Multi-Fidelity Reinforcement Learning Approach

Authors: Jared Beard, Ali Baheri

Abstract:

As autonomous systems become more prominent in society, ensuring their safe application becomes increasingly important. This is clearly demonstrated by autonomous cars traveling through a crowded city or robots traversing a warehouse with heavy equipment. Human environments can be complex, having high-dimensional state and action spaces. This gives rise to two problems. One is that analytic solutions may not be possible. The other is that, in simulation-based approaches, searching the entirety of the problem space could be computationally intractable, ruling out formal methods. To overcome this, approximate solutions may seek to find failures or estimate their likelihood of occurrence. One such approach is adaptive stress testing (AST), which uses reinforcement learning to induce failures in the system. Its premise is that a learned model can be used to help find new failure scenarios, making better use of simulations. Despite the failures it does find, AST fails to find particularly sparse failures and can be inclined to find solutions similar to those found previously. To help overcome this, multi-fidelity learning can be used to alleviate this overuse of information. That is, information from lower-fidelity simulations can be used to build up samples less expensively and to cover the solution space more effectively, finding a broader set of failures. Recent work in multi-fidelity learning has passed information bidirectionally using "knows what it knows" (KWIK) reinforcement learners to minimize the number of samples in high-fidelity simulators (thereby reducing computation time and load). The contribution of this work, then, is the development of a bidirectional multi-fidelity AST framework. Such an algorithm uses multi-fidelity KWIK learners in an adversarial context to find failure modes. Thus far, a KWIK learner has been used to train an adversary in a grid world to prevent an agent from reaching its goal, thus demonstrating the utility of KWIK learners in an AST framework. The next step is the implementation of the bidirectional multi-fidelity AST framework described. Testing will be conducted in a grid world containing an agent attempting to reach a goal position and an adversary tasked with intercepting the agent, as demonstrated previously. Fidelities will be modified by adjusting the size of a time-step, with higher fidelity effectively allowing for more responsive closed-loop feedback. Results will compare the single KWIK AST learner with the multi-fidelity algorithm with respect to the number of samples, distinct failure modes found, and the relative effect of learning after a number of trials.

Keywords: multi-fidelity reinforcement learning, multi-fidelity simulation, safety validation, falsification

Procedia PDF Downloads 157
22776 Modular Data and Calculation Framework for a Technology-based Mapping of the Manufacturing Process According to the Value Stream Management Approach

Authors: Tim Wollert, Fabian Behrendt

Abstract:

Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company’s perspective. Whereas the design principles, e.g., pull, value-adding, and customer orientation, are still valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activities. The digitalization of processes in the context of Industry 4.0 enables new opportunities to reduce these manual efforts and make the VSM approach more agile. The paper at hand aims at providing a modular data and calculation framework that utilizes the available business data provided by information and communication technologies to automate the value stream mapping process, with a focus on the manufacturing process.

Keywords: lean management 4.0, value stream management (VSM) 4.0, dynamic value stream mapping, enterprise resource planning (ERP)

Procedia PDF Downloads 150
22775 Bearing Capacity Improvement in a Silty Clay Soil with Crushed Polyethylene Terephthalate

Authors: Renzo Palomino, Alessandra Trujillo, Lidia Pacheco

Abstract:

This paper presents a study of the increase in bearing capacity of a silty clay soil with the incorporation of crushed PET fibers. For a better understanding of the behavior of the soil, it is necessary to know its origin. The analyzed samples came from the subgrade layer of a highway that connects the cities of Muniches and Yurimaguas in Loreto, Peru. The material in this area usually has properties such as a low support index, medium to high plasticity, and other characteristics that make it a 'problematic' soil due to factors such as climate, humidity, and geographical location. In addition, the PET fibers are obtained from the decomposition of plastic bottles, which are polluting agents with a high production rate in our country; in that sense, their use in a construction process represents a considerable reduction in environmental impact. Moreover, to perform a precise analysis of the behavior of this soil mixed with PET, hydrometer, Proctor, and CBR tests were carried out with 15%, 10%, 5%, 4%, 3%, and 1% of PET with respect to the mass of the natural soil sample. The results show that when a low percentage of PET is used, the support index increases.

Keywords: environmental impact, geotechnics, PET, silty clay soil

Procedia PDF Downloads 237
22774 Talent Management in Small and Medium Sized Companies: A Multilevel Approach Contextualized in France

Authors: Kousay Abid

Abstract:

The aim of this paper is to better understand talent and talent management (TM) in small and medium-sized French companies (SMEs). While previous empirical investigations have largely focused on multinationals and big companies and concentrated on the Anglo-Saxon context, we focus on the pressing need to implement TM strategies and practices, not only on the new ground of SMEs but also within a new European context, namely France. This study also aims at understanding the strategies adopted by those firms as means to attract, retain, maintain, and develop talents. We contribute to TM issues by adopting a multilevel approach, with the goal of reaching a global, holistic vision of the interactions between the various levels at which TM is applied, to make it more and more familiar to us. A qualitative research methodology based on a multiple-case study design, built first on a qualitative survey and second on two in-depth case studies, both relying on interviews, is used in order to develop an analysis of TM strategies and practices. The findings will be based on data collected from more than 15 French SMEs. Our theoretical contributions are the fruit of contextual considerations and the dynamics of the multilevel approach. Theoretically, we attempt first to clarify how talents and TM are seen and defined in French SMEs and consequently to enrich the literature on TM in SMEs outside the Anglo-Saxon context. Moreover, we seek to understand how SMEs jointly manage their talents and their TM strategies by setting up this contextualized pilot study. We also focus on the systematic TM model emerging from French SMEs. Our primary managerial goal is to shed light on the need for TM to achieve better management of these organizations by directing leaders to better identify the talented people they hold at all levels. In addition, our systematic TM model strengthens our analysis grid and provides recommendations for CEOs and Human Resource Development (HRD) to rethink companies' HR business strategies. Therefore, our outputs present multiple levers of action that should be taken into consideration while reviewing HR strategies and systems, as well as their impact beyond organizational boundaries.

Keywords: French context, multilevel approach, small and medium-sized enterprises, talent management

Procedia PDF Downloads 180
22773 Autonomous Strategic Aircraft Deconfliction in a Multi-Vehicle Low Altitude Urban Environment

Authors: Loyd R. Hook, Maryam Moharek

Abstract:

With the envisioned future growth of low-altitude urban aircraft operations for airborne delivery services and advanced air mobility, strategies to coordinate and deconflict aircraft flight paths must be prioritized. Autonomous coordination and planning of flight trajectories is the preferred approach to this future vision in order to increase safety, density, and efficiency over the manual methods employed today. Difficulties arise because any conflict resolution must be constrained by all other aircraft, all airspace restrictions, and all ground-based obstacles in the vicinity. These considerations make pair-wise tactical deconfliction difficult at best and unlikely to find a suitable solution for the entire system of vehicles. In addition, more traditional methods, which rely on long time scales and large protected zones, will artificially limit vehicle density and drastically decrease efficiency. Instead, strategic planning, which is able to respond to highly dynamic conditions and still account for high-density operations, will be required to coordinate multiple vehicles in the highly constrained low-altitude urban environment. This paper develops and evaluates such a planning algorithm, which can be implemented autonomously across multiple aircraft and situations. Data from this evaluation provide promising results, with simulations showing up to 10 aircraft deconflicted through a relatively narrow low-altitude urban canyon without any vehicle-to-vehicle or obstacle conflict. The algorithm achieves this level of coordination beginning with the assumption that each vehicle is controlled to follow an independently constructed flight path, which is itself free of obstacle conflict and restricted airspace. Then, by preferring speed-change deconfliction maneuvers constrained by the vehicle's flight envelope, vehicles can remain as close as possible to the originally planned path and prevent cascading vehicle-to-vehicle conflicts. The search for a set of commands that can simultaneously ensure separation for each pair-wise aircraft interaction and optimize the total velocities of all the aircraft is further complicated by the fact that each aircraft's flight plan could contain multiple segments. This means that relative velocities will change when any aircraft reaches a waypoint and changes course. Additionally, the timing of when an aircraft will reach a waypoint (or, more directly, the order in which all of the aircraft will reach their respective waypoints) will change with the commanded speed. Put together, the continuous relative velocity of each vehicle pair and the discretized change in relative velocity at waypoints resemble a hybrid reachability problem, a form of control reachability. This paper proposes two methods for finding solutions to these multi-body problems. First, an analytical formulation of the continuous problem is developed with an exhaustive search of the combined state space. However, because of its computational complexity, this technique is only computable for pairwise interactions. For more complicated scenarios, including the proposed 10-vehicle example, a discretized search space is used, and a depth-first search with early stopping is employed to find the first solution that satisfies the constraints.
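
The sketch below illustrates the discretized search idea in a heavily simplified form: each aircraft follows a straight-line path at a nominal speed, the only command is a speed factor chosen from a small discrete set, and a depth-first search with early stopping returns the first assignment that keeps every pair separated. The geometry, speed set, and separation distance are hypothetical, and waypoints, flight envelopes, and airspace constraints are omitted.

```python
import math

# Hypothetical straight-line plans: (start_xy, end_xy, nominal speed in m/s).
plans = [
    ((0, 0), (1000, 1000), 50.0),     # climbs NE through the canyon
    ((1000, 0), (0, 1000), 50.0),     # crosses the first path near the centre
    ((0, 800), (1000, 800), 50.0),    # parallel corridor further north
]
SPEED_FACTORS = (1.0, 0.9, 0.8)   # candidate speed commands per aircraft
MIN_SEP = 60.0                    # required separation, m
DT = 1.0                          # time step for checking separation, s
HORIZON = 60.0                    # planning horizon, s


def position(plan, factor, t):
    (x0, y0), (x1, y1), v = plan
    length = math.hypot(x1 - x0, y1 - y0)
    s = min(v * factor * t / length, 1.0)          # fraction of path completed
    return x0 + s * (x1 - x0), y0 + s * (y1 - y0)


def pair_ok(i, j, fi, fj):
    for k in range(int(HORIZON / DT) + 1):
        xi, yi = position(plans[i], fi, k * DT)
        xj, yj = position(plans[j], fj, k * DT)
        if math.hypot(xi - xj, yi - yj) < MIN_SEP:
            return False
    return True


def dfs(assigned):
    """Depth-first search over speed factors; stop at the first feasible assignment."""
    i = len(assigned)
    if i == len(plans):
        return assigned
    for f in SPEED_FACTORS:
        if all(pair_ok(j, i, assigned[j], f) for j in range(i)):
            result = dfs(assigned + [f])
            if result is not None:
                return result                      # early stopping
    return None


print("speed factors:", dfs([]))  # e.g. [1.0, 0.8, 1.0]: the second aircraft slows down
```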

Keywords: strategic planning, autonomous, aircraft, deconfliction

Procedia PDF Downloads 95
22772 Dietary Flaxseed Decreases Central Blood Pressure and the Concentrations of Plasma Oxylipins Associated with Hypertension in Patients with Peripheral Arterial Disease

Authors: Stephanie PB Caligiuri, Harold M Aukema, Delfin Rodriguez-Leyva, Amir Ravandi, Randy Guzman, Grant N. Pierce

Abstract:

Background: Hypertension leads to cardiac and cerebral events and is therefore the leading risk factor attributed to death in the world. Oxylipins may be mediators in these events, as they can regulate vascular tone and inflammation. Oxylipins are derived from fatty acids. Dietary flaxseed is rich in the n-3 fatty acid alpha-linolenic acid and, therefore, may have the ability to change the substrate profile of oxylipins. As a result, this could alter blood pressure. Methods: A randomized, double-blinded, controlled clinical trial, the Flax-PAD trial, was used to assess the impact of dietary flaxseed on blood pressure (BP) and the relationship of plasma oxylipins to BP in 81 patients with peripheral arterial disease (PAD). Patients with PAD were chosen for the clinical trial as they are at increased risk for hypertension and cardiac and cerebral events. Thirty grams of ground flaxseed were added to food products to be consumed daily for 6 months. The control food products contained wheat germ, wheat bran, and mixed dietary oils instead of flaxseed. Central BP, which is more strongly associated with organ damage and cardiac and cerebral events than brachial BP, was measured by pulse wave analysis at baseline and 6 months. A plasma profile of 43 oxylipins was generated using solid phase extraction, HPLC-MS/MS, and stable isotope dilution quantitation. Results: At baseline, the central BP (systolic/diastolic) in the placebo and flaxseed groups was 131/73 ± 2.5/1.4 mmHg and 128/71 ± 2.6/1.4 mmHg, respectively. After 6 months of intervention, the flaxseed group exhibited a decrease in blood pressure of 4.0/1.0 mmHg. The 6-month central BP in the placebo and flaxseed groups was 132/74 ± 2.9/1.8 mmHg and 124/70 ± 2.6/1.6 mmHg (P<0.05). Correlation and logistic regression analyses between central blood pressure and oxylipins were performed. Significant associations were observed between central blood pressure and 17 oxylipins, primarily produced from arachidonic acid. Every 1 nM increase in 16-hydroxyeicosatetraenoic acid (HETE) increased the odds of having high central systolic BP by 15-fold, of having high central diastolic BP by 6-fold, and of having high central mean arterial pressure by 15-fold. In addition, every 1 nM increase in 5,6-dihydroxyeicosatrienoic acid (DHET) and 11,12-DHET increased the odds of having high central mean arterial pressure by 45- and 18-fold, respectively. Flaxseed induced a significant decrease in these as well as 4 other vasoconstrictive oxylipins. Conclusion: Dietary flaxseed significantly lowered blood pressure in patients with PAD and hypertension. Plasma oxylipins were strongly associated with central blood pressure and may have mediated the flaxseed-induced decrease in blood pressure.

Keywords: hypertension, flaxseed, oxylipins, peripheral arterial disease

Procedia PDF Downloads 468
22771 Performance of Bridge Approach Slabs in Bridge Construction: A Case Study

Authors: Aurora Cerri, Niko Pullojani

Abstract:

Long-term differential settlement between the bridge structure and the bridge embankment typically results in an abrupt grade change, causing driver discomfort, impairing driver safety, and exerting a potentially excessive impact traffic loading on the abutment. This paper analyses a case study showing the effect of an approach slab in a bridge constructed on the Tirane-Elbasan Motorway. The layer under the slab is modeled as homogeneous, the slab is a reinforced concrete structure, and the asphaltic layers are placed above it. The analysis indicates that the reinforced concrete approach slab distributes the stresses quite uniformly into the road fill layers, and settlements vary within a range of less than 2.50 cm over the total slab length of 6.00 m, with a maximum slope of 1/240. Results from the analytical analysis are compared with topographic measurements taken in the field, and they show great similarity.
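
As a quick check of the quoted figures, the maximum slope implied by a 2.50 cm settlement change over the 6.00 m slab length works out as follows (a simple consistency check, not an additional result from the paper):

```latex
\[
  \text{slope} \;=\; \frac{\Delta}{L} \;=\; \frac{0.025\ \text{m}}{6.00\ \text{m}} \;\approx\; \frac{1}{240}
\]
```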

Keywords: approach slab, bridge, road pavement, differential settlement

Procedia PDF Downloads 221