Search results for: cluster model approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26806

23986 Ensuring Safe Operation by Providing an End-To-End Field Monitoring and Incident Management Approach for Autonomous Vehicles Based on an ML/DL SW Stack

Authors: Lucas Bublitz, Michael Herdrich

Abstract:

By achieving the first commercialization approval in San Francisco, the Autonomous Driving (AD) industry has proven the technological maturity of SAE L4 AD systems and the corresponding software and hardware stack. This milestone marks the upcoming phase in the industry, where the focus is now on scaling and supervising larger autonomous vehicle (AV) fleets in different operation areas. This requires an operation framework that organizes and assigns responsibilities to the relevant AV technology and operation stakeholders: the AV system provider, the remote intervention operator, the MaaS provider, and the regulatory and approval authorities. This holistic operation framework consists of technological, processual, and organizational activities to ensure the safe operation of fully automated vehicles. Regarding the supervision of large autonomous vehicle fleets, a major focus is continuous field monitoring. The field monitoring approach must reflect the safety and security criticality of incidents in the field during driving operation. This includes an automatic containment approach, with the overall goal of avoiding safety-critical incidents and reducing downtime caused by malfunctions of the AD software stack. An end-to-end (E2E) field monitoring approach detects critical faults in the field, uses a knowledge-based approach to evaluate their safety criticality, and supports the automatic containment of these E/E faults. Applying such an approach ensures the scalability of AV fleets, which is determined by the handling of incidents in the field and by continuous regulatory compliance of the technology after enhancing the Operational Design Domain (ODD) or extending the function scope through Functions on Demand (FoD) over the entire digital product lifecycle.
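To make the detect-evaluate-contain chain concrete, the following is a minimal illustrative sketch, not the authors' implementation: a hypothetical knowledge base maps E/E fault classes to a safety criticality level, which in turn selects an automatic containment action. All fault codes, severity levels, and actions are assumptions introduced for illustration only.

```python
# Illustrative sketch of a knowledge-based incident evaluation and containment decision.
# Fault codes, severity levels, and actions below are hypothetical placeholders.
from dataclasses import dataclass

FAULT_KNOWLEDGE_BASE = {
    "perception_sensor_dropout": "critical",
    "planner_watchdog_timeout": "critical",
    "map_version_mismatch": "major",
    "telemetry_lag": "minor",
}

CONTAINMENT_ACTIONS = {
    "critical": "minimal_risk_maneuver",   # bring the vehicle to a safe stop
    "major": "degraded_mode",              # restrict ODD, notify remote operator
    "minor": "log_and_monitor",            # record for offline root cause analysis
}

@dataclass
class FieldIncident:
    vehicle_id: str
    fault_code: str

def evaluate_incident(incident: FieldIncident) -> dict:
    """Classify an incident's criticality and select a containment action."""
    # Unknown faults are treated conservatively as "major".
    severity = FAULT_KNOWLEDGE_BASE.get(incident.fault_code, "major")
    return {
        "vehicle_id": incident.vehicle_id,
        "severity": severity,
        "action": CONTAINMENT_ACTIONS[severity],
    }

print(evaluate_incident(FieldIncident("AV-042", "perception_sensor_dropout")))
```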

Keywords: field monitoring, incident management, multicompliance management for AI in AD, root cause analysis, database approach

Procedia PDF Downloads 53
23985 A Study of Shigeru Ban's Environmentally-Sensitive Design Approach

Authors: Duygu Merve Bulut, Fehime Yesim Gurani

Abstract:

The Japanese architect Shigeru Ban has succeeded in bringing a different understanding to the modern architectural design approach, both in the materials he selects and in the techniques he uses to combine those materials with the design. Ban, whose designs reflect his respect for people and nature, has advocated that design should rely on economical materials that are easily accessible and understandable for everyone. Because of this, Ban has attracted attention and been appreciated in the architectural world for his environmentally-sensitive design ideology and humanitarian projects. In order to understand Ban's environmentally-sensitive design approach, this article examines and analyzes his projects that use natural materials: the Japanese Pavilion in Germany, the Papertainer Museum in South Korea, the Centre Pompidou-Metz in France, and the Cardboard Cathedral in New Zealand. In the following parts, the 'paper tube' technology developed and applied by Ban, which has raised awareness in the architectural field, is examined in terms of building material and the structure of sustainable space design. As a result of this review, Ban's approach is evaluated in terms of its contribution to the understanding of sustainable design.

Keywords: ecological design, environmentally-sensitive design, paper tube, Shigeru Ban, sustainability

Procedia PDF Downloads 475
23984 Generating Product Description with Generative Pre-Trained Transformer 2

Authors: Minh-Thuan Nguyen, Phuong-Thai Nguyen, Van-Vinh Nguyen, Quang-Minh Nguyen

Abstract:

Research on automatically generating descriptions for e-commerce products has gained increasing attention in recent years. However, the descriptions generated by existing systems are often less informative and attractive because of a lack of training data or the limitations of these approaches, which often rely on templates or statistical methods. In this paper, we explore a method to generate product descriptions using the GPT-2 model. In addition, we apply text paraphrasing and task-adaptive pretraining techniques to improve the quality of the descriptions generated by the GPT-2 model. Experimental results show that our models outperform the baseline model in both automatic and human evaluation. In particular, our methods achieve promising results not only on the seen test set but also on the unseen test set.
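As a hedged sketch of the kind of generation step described above (not the authors' code), the snippet below uses the Hugging Face transformers library to produce a description from a GPT-2 checkpoint; in the paper's setting the model would first be adapted via task-adaptive pretraining on e-commerce data. The prompt format and decoding settings are assumptions.

```python
# Minimal sketch: generating a product description with a GPT-2 language model.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")  # replace with a fine-tuned checkpoint

# Hypothetical prompt format: product attributes followed by a description cue.
prompt = "Product: wireless earbuds | Features: noise cancelling, 24h battery\nDescription:"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,        # sampling tends to yield more varied, attractive text
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```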

Keywords: GPT-2, product description, transformer, task-adaptive, language model, pretraining

Procedia PDF Downloads 186
23983 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify the factors impacting the market share of a healthcare provider facility or hospital (hereafter termed a facility) is therefore of key importance. This pilot study aims at developing a data-driven machine learning regression framework that aids strategists in formulating key decisions to improve the facility's market share, which in turn helps improve the quality of healthcare services. The US (United States) healthcare business is chosen for the study, with data spanning 60 key facilities in Washington State and about 3 years of history. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by a regression approach to evaluate and predict market share. The model-agnostic technique SHAP is leveraged to quantify the relative importance of features impacting the market share. Typical techniques in the literature to quantify the degree of competitiveness among facilities use an empirical method to calculate a competitive factor to interpret the severity of competition. The proposed method identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (e.g., quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify key drivers of market share at an overall level, the permutation feature importance of the attributes is calculated. For relative quantification of features at a facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, is incorporated. This helps identify and rank the attributes impacting market share at each facility. The approach thus amalgamates two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression, to reduce bias and drive strategic business decisions.
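The regression side of such a pipeline can be sketched as follows. This is an illustrative example only (not the authors' code): the feature names and data are synthetic placeholders, while the modelling steps mirror those described above, namely a Random Forest regressor, permutation feature importance for overall drivers, and SHAP values for facility-level attribution.

```python
# Sketch: Random Forest market-share regression with permutation importance and SHAP.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical facility-level features (synthetic data for illustration).
X = pd.DataFrame({
    "bed_count": rng.integers(50, 500, 300),
    "physician_count": rng.integers(10, 200, 300),
    "avg_wait_time": rng.normal(45, 10, 300),
    "population_65_plus": rng.normal(0.18, 0.04, 300),
})
y = 0.3 * X["bed_count"] / 500 + 0.2 * X["physician_count"] / 200 + rng.normal(0, 0.02, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Overall key drivers via permutation feature importance.
perm = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
print(dict(zip(X.columns, perm.importances_mean.round(4))))

# Facility-level attribution via SHAP values for the tree ensemble.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
print(shap_values[0])  # per-feature contributions for the first test facility
```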

Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP

Procedia PDF Downloads 77
23982 The Changing Face of Tourism: Making the Connection through Technological Advancement

Authors: Faduma Ahmed-Ali

Abstract:

The up-and-coming generation of travelers will change how the world achieves global connectivity. The goal is that, through people and technological advancement worldwide, travelers will be able to better explore the culture and beauty of each host country and gain a better understanding of the core values behind its treasures. Through Rika's unique world connection model approach, tourists can explore their destination with the help of local connections. Achieving a complete understanding of the host country while ensuring equal economic prosperity and cultural exchange is key to changing the face of tourism. A recent survey conducted by the author at Portland International Airport shows that over 50% of tourists entering Portland, Oregon are more eager to explore the city through local residents than through a pre-planned itinerary created by travel companies. This new model, Rika, aims to shed light on the importance of connecting tourists with technological tools that increase connectivity to locals for a better travel experience and foster shared economic prosperity throughout a community, with the goal of creating a sustainable, people-driven economy.

Keywords: RIKA, tourism, connection, technology, economic impact, sustainability, hospitality, strategies, tourism development, environment

Procedia PDF Downloads 271
23981 Community Based Tourism and Development in Third World Countries: The Case of the Bamileke Region of Cameroon

Authors: Ngono Mindzeng Terencia

Abstract:

Community-based tourism, as a sustainable tourism approach, has been adopted as a tool for development among local communities in third world countries, with income generation as the main driver. However, an analysis of community-based tourism and development brings to light another driving force that is paramount to development strategies in the difficult conditions of third world countries: 'place revitalization'. This paper seeks to assess the relevance of 'place revitalization' to the enhancement of development within the challenging context of developing countries. The research provides a community-based tourism model for development in third world countries through a three-step process based on awareness, mentoring, and empowerment at the local level. It also examines how effectively this model can address the development problems faced by the local communities of third world countries. The case study for this research is the Bamiléké region of Cameroon, the breeding ground of community-based tourism initiatives and a region facing the difficulties of third world countries that are great impediments to community-based tourism.

Keywords: awareness, empowerment, local communities, mentoring, place revitalization, third world countries

Procedia PDF Downloads 296
23980 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering

Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause

Abstract:

In recent years, object detection has gained much attention and become a very encouraging research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management, and environmental services. Unfortunately, to the best of our knowledge, object detection under varying illumination with shadow consideration has not been well solved yet. Furthermore, this problem is one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination conditions. Under varying image capture conditions, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting, and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies, and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods that reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach is applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
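A minimal OpenCV sketch of this kind of pipeline (not the authors' code) is shown below: convert to YCrCb, equalize the luma channel to suppress uneven illumination, segment without hard-coded colour ranges via Otsu thresholding, clean up with erosion and dilation, and extract object contours. The input file name and the area threshold are placeholders.

```python
# Sketch: illumination-robust segmentation in YCrCb with morphological clean-up.
import cv2
import numpy as np

img = cv2.imread("wood_stack.jpg")  # placeholder image path
if img is None:
    raise SystemExit("provide an input image")

ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)

# Equalize the luma channel to reduce uneven illumination; keep chroma untouched.
y, cr, cb = cv2.split(ycrcb)
y_eq = cv2.equalizeHist(y)

# Threshold the equalized luma with Otsu instead of fixed threshold values.
_, mask = cv2.threshold(y_eq, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Morphological erosion then dilation removes small shadow artifacts.
kernel = np.ones((5, 5), np.uint8)
mask = cv2.dilate(cv2.erode(mask, kernel, iterations=1), kernel, iterations=1)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
detections = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
print(f"{len(detections)} candidate object regions")
```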

Keywords: image processing, illumination equalization, shadow filtering, object detection

Procedia PDF Downloads 203
23979 Predicting Depth of Penetration in Abrasive Waterjet Cutting of Polycrystalline Ceramics

Authors: S. Srinivas, N. Ramesh Babu

Abstract:

This paper presents a model to predict the depth of penetration in polycrystalline ceramic material cut by an abrasive waterjet. The proposed model considers the interaction of the cylindrical jet with the target material in the upper region and neglects the role of threshold velocity in the lower region. The results predicted with the proposed model are validated against experimental results obtained with Silicon Carbide (SiC) blocks.

Keywords: abrasive waterjet cutting, analytical modeling, ceramics, micro-cutting and inter-granular cracking

Procedia PDF Downloads 293
23978 Machine Learning for Feature Selection and Classification of Systemic Lupus Erythematosus

Authors: H. Zidoum, A. AlShareedah, S. Al Sawafi, A. Al-Ansari, B. Al Lawati

Abstract:

Systemic lupus erythematosus (SLE) is an autoimmune disease with genetic and environmental components. SLE is characterized by a wide variability of clinical manifestations and a course frequently subject to unpredictable flares. Despite recent progress in classification tools, the early diagnosis of SLE is still an unmet need for many patients. This study proposes an interpretable disease classification model that combines the high and efficient predictive performance of CatBoost with the model-agnostic interpretation tools of SHapley Additive exPlanations (SHAP). The CatBoost model was trained on a local cohort of 219 Omani patients with SLE as well as other control diseases. Furthermore, the SHAP library was used to generate individual explanations of the model's decisions and to rank clinical features by contribution. Overall, we achieved an AUC score of 0.945 and an F1-score of 0.92, and identified four clinical features (alopecia, renal disorders, cutaneous lupus, and hemolytic anemia), along with the patient's age, that were shown to have the greatest contribution to the prediction.
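The modelling approach described above can be sketched as follows. This is an illustrative example (not the authors' code): the clinical feature names follow the abstract, but the data are synthetic and the hyperparameters are placeholders.

```python
# Sketch: CatBoost classifier with SHAP-based explanation and feature ranking.
import numpy as np
import pandas as pd
import shap
from catboost import CatBoostClassifier
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 219  # cohort size from the abstract; the records below are synthetic
X = pd.DataFrame({
    "alopecia": rng.integers(0, 2, n),
    "renal_disorder": rng.integers(0, 2, n),
    "cutaneous_lupus": rng.integers(0, 2, n),
    "hemolytic_anemia": rng.integers(0, 2, n),
    "age": rng.integers(18, 70, n),
})
y = (X[["alopecia", "renal_disorder", "cutaneous_lupus"]].sum(axis=1) >= 2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
clf = CatBoostClassifier(iterations=300, depth=4, verbose=False).fit(X_tr, y_tr)

print(f"AUC: {roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]):.3f}")
print(f"F1 : {f1_score(y_te, clf.predict(X_te)):.3f}")

# Per-patient explanations and a global ranking of feature contributions.
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_te)
ranking = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns).sort_values(ascending=False)
print(ranking)
```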

Keywords: feature selection, classification, systemic lupus erythematosus, model interpretation, SHAP, CatBoost

Procedia PDF Downloads 66
23977 Zika Virus NS5 Protein Potential Inhibitors: An Enhanced in silico Approach in Drug Discovery

Authors: Pritika Ramharack, Mahmoud E. S. Soliman

Abstract:

The re-emerging Zika virus is an arthropod-borne virus that has been described as having explosive potential as a worldwide pandemic. The initial transmission of the virus was through a mosquito vector; however, evolving modes of transmission have allowed the spread of the disease across continents. The virus has already been linked to irreversible chronic central nervous system (CNS) conditions. The concern of the scientific and clinical community is the consequences of Zika viral mutations, thus suggesting an urgent need for viral inhibitors. There have been large strides in vaccine development against the virus, but there are still no FDA-approved drugs available. Rapid rational drug design and discovery research is fundamental in the production of potent inhibitors against the virus that will not just mask it but destroy it completely. In silico drug design allows for this prompt screening of potential leads, thus decreasing the consumption of precious time and resources. This study demonstrates an optimized and proven screening technique in the discovery of two potential small-molecule inhibitors of the Zika virus methyltransferase and RNA-dependent RNA polymerase. This in silico 'per-residue energy decomposition pharmacophore' virtual screening approach will be critical in aiding scientists in the discovery not only of effective inhibitors of Zika viral targets but also of a wide range of anti-viral agents.

Keywords: NS5 protein inhibitors, per-residue decomposition, pharmacophore model, virtual screening, Zika virus

Procedia PDF Downloads 209
23976 Stochastic Modeling for Parameters of Modified Car-Following Model in Area-Based Traffic Flow

Authors: N. C. Sarkar, A. Bhaskar, Z. Zheng

Abstract:

The driving behavior in area-based (i.e., non-lane-based) traffic is induced by the presence of other individuals in the choice space within the driver's visual perception area. The driving behavior of a subject vehicle is constrained by potential leaders, and leaders frequently change over time. This paper determines stochastic models for the parameters of a modified intelligent driver model (MIDM) in area-based traffic (as in developing countries). Parametric and non-parametric distributions are presented to fit the parameters of the MIDM. The goodness of fit for each parameter is measured in two different ways: graphically and statistically. The quantile-quantile (Q-Q) plot is used as a graphical check of how well a theoretical distribution models a parameter, and the Kolmogorov-Smirnov (K-S) test is used as a statistical measure of fit between a parameter and a theoretical distribution. The distributions are fitted to a set of estimated MIDM parameters, which are estimated from real vehicle trajectory data from India. The fit of each parameter with a stochastic model is well represented. The results support the applicability of the proposed modeling of MIDM parameters in area-based traffic flow simulation.
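A minimal sketch of this fitting procedure (not the authors' code) is given below: fit candidate parametric distributions to a sample of an estimated model parameter, then check the fit graphically with a Q-Q plot and statistically with the K-S test. The parameter sample and the candidate distributions are synthetic assumptions for illustration.

```python
# Sketch: distribution fitting with Q-Q plot and Kolmogorov-Smirnov goodness of fit.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(7)
param_sample = rng.lognormal(mean=0.5, sigma=0.3, size=400)  # hypothetical estimated parameter values

for dist_name in ("lognorm", "gamma", "norm"):
    dist = getattr(stats, dist_name)
    params = dist.fit(param_sample)
    ks_stat, p_value = stats.kstest(param_sample, dist_name, args=params)
    print(f"{dist_name:8s}  KS statistic = {ks_stat:.3f}  p = {p_value:.3f}")

# Q-Q plot against one candidate distribution (here: lognormal).
params = stats.lognorm.fit(param_sample)
stats.probplot(param_sample, dist=stats.lognorm, sparams=params, plot=plt)
plt.title("Q-Q plot of estimated parameter vs. fitted lognormal")
plt.show()
```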

Keywords: area-based traffic, car-following model, micro-simulation, stochastic modeling

Procedia PDF Downloads 133
23975 A New Approach to Increase Consumer Understanding of Meal’s Quality – Food Focus Instead of Nutrient Focus

Authors: Elsa Lamy, Marília Prada, Ada Rocha, Cláudia Viegas

Abstract:

The traditional and widely used nutrition-focused approach to communicating with consumers is reductionist and makes it difficult for consumers to assess their food intake. Without sufficient nutrition knowledge and understanding, it would be difficult to choose a healthful diet based only on nutritional recommendations. This study aimed to evaluate Portuguese consumers' understanding of how food/nutritional information is presented in menus, comparing the nutrient-focused approach (the currently used Nutrition Declaration) and the new food-focused approach (the infographic). For data collection, a questionnaire was distributed online using social media channels. A main effect of format was found on ratings of meal balance (F(1,79) = 18.26, p < .001, ηp² = .188) and completeness (F(1,67) = 27.18, p < .001, ηp² = .289). Overall, dishes paired with the nutritional information were rated as more balanced and complete (M_balance = 3.70, SE = .11; M_completeness = 4.00, SE = .14) than meals with the infographic representation (M_balance = 3.14, SE = .11; M_completeness = 3.29, SE = .13). We also observed a main effect of meal, F(3,237) = 48.90, p < .001, ηp² = .382, such that M1 and M2 were perceived as less balanced than M3 and M4, all p < .001. The use of a food-focused approach (the infographic) helped participants identify the lack of balance in the less healthful meals (dishes M1 and M2), allowing for a better understanding of meals' compliance with recommendations and contributing to better food choices and a healthier lifestyle.
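For readers unfamiliar with the kind of within-subjects analysis reported above (format x meal effects on ratings), the following is a hedged sketch using statsmodels; the data frame layout, column names, and generated ratings are assumptions, not the study's data.

```python
# Sketch: repeated-measures ANOVA (format x meal) on balance ratings.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(3)
rows = [
    {"participant": p, "format": fmt, "meal": meal,
     "balance": rng.normal(3.5 if fmt == "nutrition" else 3.1, 0.8)}
    for p in range(1, 81)                       # hypothetical participants
    for fmt in ("nutrition", "infographic")     # presentation format
    for meal in ("M1", "M2", "M3", "M4")        # evaluated meals
]
df = pd.DataFrame(rows)

anova = AnovaRM(df, depvar="balance", subject="participant", within=["format", "meal"]).fit()
print(anova)  # F tests for the main effects of format and meal, and their interaction
```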

Keywords: food labelling, food and nutritional recommendations, infographics, portions based information

Procedia PDF Downloads 63
23974 The Effect of Different Strength Training Methods on Muscle Strength, Body Composition and Factors Affecting Endurance Performance

Authors: Shaher A. I. Shalfawi, Fredrik Hviding, Bjornar Kjellstadli

Abstract:

The main purpose of this study was to measure the effect of two different strength training methods on muscle strength, muscle mass, fat mass, and endurance factors. Fourteen physical education students agreed to participate in this study. The participants were randomly divided into three groups: a traditional training group (TTG), a cluster training group (CTG), and a control group (CG). The TTG consisted of 4 participants, aged (mean ± SD) 22.3 ± 1.5 years, with body mass 79.2 ± 15.4 kg and height 178.3 ± 11.9 cm. The CTG consisted of 5 participants, aged 22.2 ± 3.5 years, with body mass 81.0 ± 24.0 kg and height 180.2 ± 12.3 cm. The CG consisted of 5 participants, aged 22 ± 2.8 years, with body mass 77 ± 19 kg and height 174 ± 6.7 cm. The participants underwent a hypertrophy strength training program twice a week for 8 weeks, consisting of 4 sets of 10 reps at 70% of one-repetition maximum (1RM) using the barbell squat and barbell bench press. The CTG performed 2 x 5 reps with 10 s recovery between repetitions and 50 s recovery between sets, while the TTG performed 4 sets of 10 reps with 90 s recovery between sets. Pre- and post-tests were administered to assess body composition (weight, muscle mass, and fat mass), 1RM (bench press and barbell squat), and a laboratory endurance test (Bruce protocol). The instruments used to collect the data were a Tanita BC-601 scale (Tanita, Illinois, USA), a Woodway treadmill (Woodway, Wisconsin, USA), and a Vyntus CPX breath-to-breath system (Jaeger, Hoechberg, Germany). Analysis was conducted on all measured variables, including time to peak VO2, peak VO2, heart rate (HR) at peak VO2, respiratory exchange ratio (RER) at peak VO2, and number of breaths per minute. The results indicate an increase in 1RM performance after 8 weeks of training. The change in 1RM squat was 30 ± 3.8 kg for the TTG, 28.6 ± 8.3 kg for the CTG, and 10.3 ± 13.8 kg for the CG. Similarly, the change in 1RM bench press was 9.8 ± 2.8 kg for the TTG, 7.4 ± 3.4 kg for the CTG, and 4.4 ± 3.4 kg for the CG. The within-group analysis of oxygen consumption measured during the incremental exercise indicated that the TTG had only a statistically significant increase in RER, from 1.16 ± 0.04 to 1.23 ± 0.05 (P < 0.05). The CTG had a statistically significant improvement in HR at peak VO2, from 186 ± 24 to 191 ± 12 beats per minute (P < 0.05), and in RER at peak VO2, from 1.11 ± 0.06 to 1.18 ± 0.05 (P < 0.05). Finally, the CG had only a statistically significant increase in RER at peak VO2, from 1.11 ± 0.07 to 1.21 ± 0.05 (P < 0.05). The between-group analysis showed no statistically significant differences between the groups in any of the variables measured during the incremental exercise test, including changes in muscle mass, fat mass, and weight (kg). The results indicate a similar effect of hypertrophy strength training on untrained subjects irrespective of the training method used. Because there were no notable changes in body-composition measures, the results suggest that the improvements in performance observed in all groups are most probably due to neuromuscular adaptation to training.

Keywords: hypertrophy strength training, cluster set, Bruce protocol, peak VO2

Procedia PDF Downloads 238
23973 A Bathtub Curve from Nonparametric Model

Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos

Abstract:

This paper presents a nonparametric method to obtain the hazard rate 'bathtub curve' for power system components. The model is a mixture of the three known phases of a component's life: the decreasing failure rate (DFR), the constant failure rate (CFR), and the increasing failure rate (IFR), each represented by a parametric Weibull model. The parameters are obtained by simultaneously fitting the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime are defined. To demonstrate the model, historical time-to-failure data of distribution transformers are used as an example. The resulting 'bathtub curve' shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
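To illustrate the three-Weibull construction described above, the sketch below sums a decreasing (shape < 1), a constant (shape = 1), and an increasing (shape > 1) Weibull hazard into a bathtub-shaped curve. The shape and scale values are placeholders, not the fitted parameters of the study; in the paper these would come from fitting against a kernel hazard estimate.

```python
# Sketch: additive mixture of three Weibull hazards forming a bathtub curve.
import numpy as np
import matplotlib.pyplot as plt

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

t = np.linspace(0.1, 40.0, 400)                     # years in service (hypothetical)
h_infant = weibull_hazard(t, beta=0.5, eta=2.0)     # DFR phase (early failures)
h_random = weibull_hazard(t, beta=1.0, eta=25.0)    # CFR phase (useful life)
h_wearout = weibull_hazard(t, beta=4.0, eta=35.0)   # IFR phase (wear-out)

bathtub = h_infant + h_random + h_wearout

plt.plot(t, bathtub, label="bathtub hazard")
plt.xlabel("time in service [years]")
plt.ylabel("hazard rate [failures/year]")
plt.legend()
plt.show()
```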

Keywords: bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution

Procedia PDF Downloads 430
23972 Character Development Outcomes: A Predictive Model for Behaviour Analysis in Tertiary Institutions

Authors: Rhoda N. Kayongo

Abstract:

As behavior analysts in education continue to debate how higher education institutions can continue to benefit from their social and academic programs, higher education is facing challenges in the area of character development. This is manifested in college completion rates and in rates of teen pregnancy, drug abuse, sexual abuse, suicide, plagiarism, lack of academic integrity, and violence among students. Attending college is a perceived opportunity to positively influence the actions and behaviors of the next generation of society; thus, colleges and universities have to provide opportunities to develop students' values and behaviors. Prior studies were mainly conducted in private institutions, and more so in developed countries. However, given the complexity of today's student body in a changing world, a multidimensional approach combining multiple factors that enhance character development outcomes is needed. The main purpose of this study was to identify opportunities in colleges and develop a model for predicting character development outcomes. A survey questionnaire composed of seven scales, with in-classroom interaction, out-of-classroom interaction, school climate, personal lifestyle, home environment, and peer influence as independent variables and character development outcomes as the dependent variable, was administered to 501 third- and fourth-year students in selected public colleges and universities in the Philippines and Rwanda. Using structural equation modelling, a predictive model explained 57% of the variance in character development outcomes. The analysis showed that in-classroom interactions have a substantial direct influence on students' character development outcomes (r = .75, p < .05). In addition, out-of-classroom interaction, school climate, and home environment contributed to students' character development outcomes, but indirectly. The study concluded that the classroom offers many opportunities for teachers to teach, model, and integrate character development among their students. Thus, public colleges and universities are encouraged to deliberately foster and implement experiences that cultivate character within the classroom. These may contribute tremendously to students' character development outcomes and hence render effective models of behaviour analysis in higher education.

Keywords: character development, tertiary institutions, predictive model, behavior analysis

Procedia PDF Downloads 121
23971 Prediction of Nonlinear Torsional Behavior of High Strength RC Beams

Authors: Woo-Young Jung, Minho Kwon

Abstract:

Seismic design criteria based on the performance of structures have recently been adopted by practicing engineers in response to destructive earthquakes. A simple but efficient structural-analysis tool capable of predicting both strength and ductility is needed to analyze reinforced concrete (RC) structures under such events. A three-dimensional lattice model is developed in this study to analyze torsion in high-strength RC members. Optimization techniques for determining the optimal variables in each lattice model are introduced. Pure torsion tests of RC members are performed to validate the proposed model. Correlation studies between the numerical and experimental results confirm that the proposed model is well capable of representing the salient features of the experimental results.

Keywords: torsion, non-linear analysis, three-dimensional lattice, high-strength concrete

Procedia PDF Downloads 336
23970 A Predictive Model for Turbulence Evolution and Mixing Using Machine Learning

Authors: Yuhang Wang, Jorg Schluter, Sergiy Shelyag

Abstract:

The high cost associated with high-resolution computational fluid dynamics (CFD) is one of the main challenges that inhibit the design, development, and optimisation of new combustion systems adapted for renewable fuels. In this study, we propose a physics-guided CNN-based model to predict turbulence evolution and mixing without requiring a traditional CFD solver. The model architecture is built upon U-Net and the inception module, while a physics-guided loss function is designed by introducing two additional physical constraints to enforce the conservation of both mass and pressure over the entire predicted flow field. The model is trained on the Large Eddy Simulation (LES) results of a natural turbulent mixing layer for two different Reynolds number cases (Re = 3000 and 30000). The model predictions show excellent agreement with the corresponding CFD solutions in terms of both the spatial distribution and the temporal evolution of turbulent mixing. Such promising predictive performance opens up the possibility of performing accurate high-resolution manifold-based combustion simulations at low computational cost, accelerating the iterative design process of new combustion systems.
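A hedged sketch of a physics-guided loss of the kind described above is given below (PyTorch): a data-fidelity term plus a mass-conservation penalty on the predicted 2D velocity field, with the divergence computed by finite differences. The pressure constraint and the U-Net/inception architecture are omitted; the tensor shapes, grid spacing, and weighting are assumptions for illustration.

```python
# Sketch: data loss + mass-conservation (divergence) penalty for a predicted velocity field.
import torch
import torch.nn.functional as F

def divergence_2d(vel: torch.Tensor, dx: float = 1.0) -> torch.Tensor:
    """vel: (batch, 2, H, W) with channels (u, v); central-difference divergence."""
    u, v = vel[:, 0], vel[:, 1]
    du_dx = (u[:, :, 2:] - u[:, :, :-2]) / (2 * dx)
    dv_dy = (v[:, 2:, :] - v[:, :-2, :]) / (2 * dx)
    return du_dx[:, 1:-1, :] + dv_dy[:, :, 1:-1]   # crop to a common interior region

def physics_guided_loss(pred, target, lambda_mass=0.1):
    data_loss = F.mse_loss(pred, target)            # match the LES reference field
    mass_loss = divergence_2d(pred).pow(2).mean()   # penalize non-zero divergence
    return data_loss + lambda_mass * mass_loss

# Toy usage with random tensors standing in for predicted and reference velocity fields.
pred = torch.randn(4, 2, 64, 64, requires_grad=True)
target = torch.randn(4, 2, 64, 64)
loss = physics_guided_loss(pred, target)
loss.backward()
print(float(loss))
```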

Keywords: computational fluid dynamics, turbulence, machine learning, combustion modelling

Procedia PDF Downloads 73
23969 Community Based Participatory Research in Opioid Use: Design of an Informatics Solution

Authors: Sue S. Feldman, Bradley Tipper, Benjamin Schooley

Abstract:

Nearly every community in the US has been impacted by opioid-related addictions and deaths; it is a national problem that is threatening our social and economic welfare. Most believe that by tackling this problem from a prevention perspective, advances can be made toward breaking the chain of addiction. One mechanism, community-based participatory research, involves the community in the prevention approach. This project combines that approach with a design science approach to develop an integrated solution. Findings suggested accountable care communities, transpersonal psychology, and social exchange theory as product kernel theories. Evaluation was conducted on a prototype.

Keywords: substance use and abuse recovery, community resource centers, accountable care communities, community based participatory research

Procedia PDF Downloads 140
23968 Comparison of Homogeneous and Micro-Mechanical Modelling Approach for Paper Honeycomb Materials

Authors: Yiğit Gürler, Berkay Türkcan İmrağ, Taylan Güçkıran, İbrahim Şimşek, Alper Taşdemirci

Abstract:

Paper honeycomb, which is a sandwich structure, consists of two liner faces and one paper honeycomb core. These materials are widely used in the packaging industry due to their low cost, low weight, good energy absorption capabilities, and easy recycling properties. However, to provide maximum protection to the products in cases such as drops of the packaged goods, the mechanical behavior of these materials should be well known at the packaging design stage. In this study, the input parameters necessary for the modeling work were obtained by performing compression tests in the through-thickness and in-plane directions of paper-based honeycomb sandwich structures. With the obtained parameters, homogeneous and micro-mechanical numerical models were developed in the Ls-Dyna environment. The material card used for the homogeneous model is MAT_MODIFIED_HONEYCOMB, and the material card used for the micromechanical model is MAT_PIECEWISE_LINEAR_PLASTICITY. The effectiveness of the homogeneous and micromechanical modeling approaches for paper-based honeycomb sandwich structures is investigated using force-displacement curves, and the densification points and peak points of these curves are compared.

Keywords: environmental packaging, mechanical characterization, Ls-Dyna, sandwich structure

Procedia PDF Downloads 176
23967 A Review on Disaster Risk Reduction and Sustainable Development in Nigeria

Authors: Kudu Dangana

Abstract:

The occurrence of disasters often calls for the support of both government and non-government organizations. Consequently, disaster relief remains extremely important in disaster management. However, this approach alone does not proactively address the need to reduce the human and environmental impacts of future disasters. Recent thinking in the area of disaster management indicates the need for a new paradigm that focuses on reducing the risk of disasters with the involvement and participation of communities. This paper reviews the need for communities to place more emphasis on a holistic approach to disaster risk reduction. This approach involves risk assessment, risk reduction, early warning, and disaster preparedness in order to effectively reduce the social, economic, and environmental costs of disasters nationally and at the global level.

Keywords: disaster, early warning, management, relief, risk, vulnerability

Procedia PDF Downloads 633
23966 A Model-Based Approach for Energy Performance Assessment of a Spherical Stationary Reflector/Tracking Absorber Solar Concentrator

Authors: Rosa Christodoulaki, Irene Koronaki, Panagiotis Tsekouras

Abstract:

The aim of this study is to analyze the energy performance of a spherical Stationary Reflector / Tracking Absorber (SRTA) solar concentrator. This type of collector consists of a segment of a spherical mirror placed in a stationary position facing the sun and a cylindrical absorber that tracks the sun by a simple pivoting motion about the center of curvature of the reflector. The energy analysis is performed through the development of a dynamic simulation model in the TRNSYS software that calculates the annual heat production and the efficiency of the SRTA solar concentrator. The effects of solar concentrator design features and characteristics, such as the reflector material, the reflector diameter, the receiver type, the solar radiation level, and the concentration ratio, are discussed in detail. Moreover, the energy performance curve of the SRTA solar concentrator is drawn for various differences between the mean fluid temperature and the ambient temperature and for various radiation intensities. The results are shown in diagrams visualizing the effect of solar, optical, and thermal parameters on the overall performance of the SRTA solar concentrator throughout the year. The analysis indicates that the SRTA solar concentrator can operate efficiently under a wide range of operating conditions.
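As an illustration of the kind of performance curve mentioned above, the sketch below evaluates collector efficiency as a function of the difference between mean fluid and ambient temperature for several radiation levels, using the standard quadratic efficiency equation. The coefficients are placeholders and are not the SRTA values obtained from the TRNSYS model.

```python
# Sketch: quasi-steady-state collector efficiency curve for several radiation levels.
import numpy as np

eta0, a1, a2 = 0.72, 2.5, 0.01   # optical efficiency and heat-loss coefficients (assumed)

def collector_efficiency(delta_t: np.ndarray, g: float) -> np.ndarray:
    """eta = eta0 - a1*(dT/G) - a2*(dT^2/G), with dT = Tm - Ta in K and G in W/m^2."""
    return eta0 - a1 * delta_t / g - a2 * delta_t**2 / g

delta_t = np.linspace(0, 100, 11)          # mean fluid minus ambient temperature [K]
for g in (400, 700, 1000):                 # radiation intensities [W/m^2]
    eta = np.clip(collector_efficiency(delta_t, g), 0, None)
    print(f"G = {g:4d} W/m^2 :", np.round(eta, 3))
```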

Keywords: concentrating solar collector, energy analysis, stationary reflector, tracking absorber

Procedia PDF Downloads 186
23965 Numerical Analysis of Real-Scale Polymer Electrolyte Fuel Cells with Cathode Metal Foam Design

Authors: Jaeseung Lee, Muhammad Faizan Chinannai, Mohamed Hassan Gundu, Hyunchul Ju

Abstract:

In this paper, we numerically investigate the effect of metal foams on a real-scale 242.57 cm² (19.1 cm × 12.7 cm) polymer electrolyte membrane fuel cell (PEFC) using a three-dimensional two-phase PEFC model, to substantiate a design approach for PEFCs that uses metal foam as the flow distributor. The simulations were conducted under practical low-humidity hydrogen and air conditions in order to observe the detailed operating behavior of a PEFC with a serpentine flow channel on the anode and a metal foam design on the cathode. Three-dimensional contours of the flow distribution in the channel, the current density distribution in the membrane, and the hydrogen and oxygen concentration distributions are provided. The simulation results reveal that the use of highly porous and permeable metal foam can be beneficial for achieving a more uniform current density distribution and better membrane hydration under low inlet humidity conditions. This study offers basic directions for channel design aimed at optimal water management in PEFCs.

Keywords: polymer electrolyte fuel cells, metal foam, real-scale, numerical model

Procedia PDF Downloads 227
23964 The Triple Nexus: Key Challenges in Shifting from Conceptualization to Operationalization of the Humanitarian-Development-Peacebuilding Nexus

Authors: Sarah M. Bolger

Abstract:

There is a clear recognition that humanitarian and development workers are operating more and more frequently in situations of protracted crises, with conflict and violence undermining long-term development efforts. First coined at the World Humanitarian Summit in 2016, the humanitarian-development-peacebuilding nexus, or 'Triple Nexus', seeks to promote greater cooperation and policy and program coherence amongst organizations working within and across the nexus. However, despite the clear need for such an approach, the Triple Nexus has failed to gain much traction. This is largely due to the lack of conceptual clarity for actors on the ground and the disconnect between the theory of the Triple Nexus and what it means in practice. This paper seeks to identify the key challenges in shifting from the conceptual definition of the Triple Nexus, and what it can look like, particularly for multi-mandated organizations, to the operationalization of the Triple Nexus approach. It adopts a case study approach, examining a selection of organizations and programs and their approaches to the Triple Nexus in order to extract key challenges and lessons learned. Finally, key recommendations are provided on how these challenges can be overcome, allowing for the operationalization of the Triple Nexus and ultimately for a more integrated and sustainable approach to humanitarian, development, and peacebuilding work.

Keywords: development, humanitarian, peacebuilding, triple nexus

Procedia PDF Downloads 130
23963 An Elasto-Viscoplastic Constitutive Model for Unsaturated Soils: Numerical Implementation and Validation

Authors: Maria Lazari, Lorenzo Sanavia

Abstract:

The mechanics of unsaturated soils has been an active field of research in the last decades. Efficient constitutive models that take into account the partial saturation of soil are necessary to solve a number of engineering problems, e.g., the instability of slopes and cuts due to heavy rainfall. A large number of constitutive models can now be found in the literature that consider fundamental issues associated with unsaturated soil behaviour, like the volume change and shear strength behaviour with suction or saturation changes. Partially saturated soils may either expand or collapse upon wetting depending on the stress level, and it is also possible that a soil experiences a reversal in its volumetric behaviour during wetting. The shear strength of soils also changes dramatically with changes in the degree of saturation, and a related engineering problem is slope failure caused by rainfall. Several state-of-the-art reviews of the topic have appeared over the last years, usually providing a thorough discussion of the stress state, the advantages and disadvantages of specific constitutive models, as well as the latest developments in the area of unsaturated soil modelling. However, only a few studies have focused on the coupling between partial saturation states and time effects on the behaviour of geomaterials. Rate dependency is experimentally observed in the mechanical response of granular materials, and a viscoplastic constitutive model is capable of reproducing creep and relaxation processes. Therefore, in this work an elasto-viscoplastic constitutive model for unsaturated soils is proposed and validated on the basis of experimental data. The model constitutes an extension of an existing elastoplastic strain-hardening constitutive model capable of capturing the behaviour of variably saturated soils, based on energy-conjugated stress variables in the framework of superposed continua. The purpose was to develop a model able to deal with possible mechanical instabilities within a consistent energy framework. The model shares the conceptual structure of the elastoplastic laws proposed to deal with bonded geomaterials subject to weathering or diagenesis and is capable of modelling several kinds of instabilities induced by the loss of hydraulic bonding contributions. The novelty of the proposed formulation is the incorporation of density-dependent stiffness and hardening coefficients, which allows the modelling of the pycnotropic behaviour of granular materials with a single set of material constants. The model has been implemented in the commercial FE platform PLAXIS, widely used in Europe for advanced geotechnical design. The algorithmic strategies adopted for the stress-point algorithm had to be revised to take into account the different approach adopted by the PLAXIS developers in the solution of the discrete non-linear equilibrium equations. An extensive comparison of the model with a series of experimental data reported by different authors is presented to validate the model and illustrate its capabilities. After the validation, the effectiveness of the viscoplastic model is demonstrated by numerical simulations of a partially saturated slope failure at the laboratory scale, and the effect of viscosity and degree of saturation on the slope's stability is discussed.

Keywords: PLAXIS software, slope, unsaturated soils, viscoplasticity

Procedia PDF Downloads 209
23962 Prediction of the Tunnel Fire Flame Length by Hybrid Model of Neural Network and Genetic Algorithms

Authors: Behzad Niknam, Kourosh Shahriar, Hassan Madani

Abstract:

This paper demonstrates the applicability of hybrid neural networks that combine back-propagation networks (BPN) and Genetic Algorithms (GAs) for predicting the flame length of tunnel fires. A hybrid neural network model has been developed to predict the flame length of a tunnel fire based on parameters such as fire heat release rate, air velocity, tunnel width, height, and cross-sectional area. The network has been trained with data obtained from experimental work. The hybrid neural network model learned the relationship for predicting the flame length in just 3000 training epochs. After successful learning, the model was used to predict the flame length.
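The snippet below is an illustrative sketch in the spirit of such a hybrid scheme, not the authors' model: a small genetic algorithm searches the hyperparameters of a back-propagation network mapping the listed fire parameters to flame length. The training data, the genome encoding, and the GA settings are all assumptions.

```python
# Sketch: GA-tuned back-propagation network for flame length prediction.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
# Columns: heat release rate, air velocity, width, height, cross-section area (synthetic).
X = rng.uniform([5, 0.5, 4, 4, 20], [100, 6, 12, 8, 90], size=(200, 5))
y = 0.2 * X[:, 0] / (X[:, 1] + 0.5) + 0.05 * X[:, 4] + rng.normal(0, 0.5, 200)  # toy flame length

def fitness(genome):
    hidden, lr = int(genome[0]), genome[1]
    net = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                       max_iter=500, random_state=0)
    return cross_val_score(net, X, y, cv=3, scoring="r2").mean()

# Tiny (mu + lambda) genetic algorithm with Gaussian mutation of the genome.
population = [np.array([rng.integers(4, 32), rng.uniform(1e-3, 1e-1)]) for _ in range(8)]
for generation in range(5):
    parents = sorted(population, key=fitness, reverse=True)[:4]
    children = [p + np.array([rng.integers(-4, 5), rng.normal(0, 0.01)]) for p in parents]
    population = parents + [np.clip(c, [4, 1e-4], [64, 0.2]) for c in children]

best = max(population, key=fitness)
print("best genome (hidden units, learning rate):", best, "R^2 =", round(fitness(best), 3))
```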

Keywords: tunnel fire, flame length, ANN, genetic algorithm

Procedia PDF Downloads 622
23961 Optimization of Feeder Bus Routes at Urban Rail Transit Stations Based on Link Growth Probability

Authors: Yu Song, Yuefei Jin

Abstract:

Urban public transportation can be integrated when there is an efficient connection between urban rail lines; however, no effective or quick solutions for this connection are currently being investigated. This paper analyzes the space-time distribution and travel demand of passenger connection trips based on taxi track data and road network data, identifies potential bus connection stations from the potential connection demand data, and introduces the link growth probability model from complex network theory to derive the basic feeder bus lines, in order to ascertain the directions of the most strongly connected bus lines given the demand characteristics. Then, a constraint-based tree-view exhaustive approach grounded in graph theory is suggested, which can hasten the convergence of results during chain calculations. This study uses WEI QU NAN Station, the terminal station of Xi'an Metro Line 2 in Shaanxi Province, as an illustration to evaluate the efficacy of the model and the solution method. According to the findings, a total of 153 prospective stations were identified, the feeder bus network for the entire line was laid out, and the best route adjustment strategy was found.

Keywords: feeder bus, route optimization, link growth probability, graph theory

Procedia PDF Downloads 60
23960 Impact of Economic Globalization on Ecological Footprint in India: Evidenced with Dynamic ARDL Simulations

Authors: Muhammed Ashiq Villanthenkodath, Shreya Pal

Abstract:

Purpose: This study scrutinizes the impact of economic globalization on the ecological footprint while endogenizing economic growth and energy consumption in India from 1990 to 2018. Design/methodology/approach: The standard unit root test has been employed for the time series analysis to unveil the order of integration. Then, cointegration was confirmed using autoregressive distributed lag (ARDL) analysis. Further, the study executed the dynamic ARDL simulation model to estimate long-run and short-run results along with simulation and robust prediction. Findings: The cointegration analysis confirms the existence of a long-run association among the variables. Further, economic globalization reduces the ecological footprint in the long run. Similarly, energy consumption decreases the ecological footprint. In contrast, economic growth spurs the ecological footprint in India. Originality/value: This study contributes to the literature in many ways. First, unlike studies that examine the nexus between CO2 emissions and globalization, this study employs the ecological footprint to measure environmental quality; since it is a broader measure of environmental quality, it can offer a wide range of climate change mitigation policies for India. Second, the study executes a multivariate framework with an updated series from 1990 to 2018 for India to explore the link between the ecological footprint, economic globalization, energy consumption, and economic growth. Third, the dynamic autoregressive distributed lag (ARDL) model is used to explore the short- and long-run associations between the series. Finally, to our limited knowledge, this is the first study that uses economic globalization in the ecological footprint function of India amid the trade-off between sustainable economic growth and the environment in the era of globalization.
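A hedged sketch of the econometric setup described above follows: an ARDL model of the ecological footprint on economic globalization, energy consumption, and GDP, estimated with statsmodels. The series are randomly generated placeholders, and the dynamic ARDL stochastic simulations and response plots of the study are not reproduced here.

```python
# Sketch: ARDL estimation of ecological footprint on globalization, energy use, and GDP.
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

rng = np.random.default_rng(2)
n_years = 2018 - 1990 + 1  # annual observations, 1990-2018
data = pd.DataFrame({
    "ecological_footprint": rng.normal(0, 1, n_years).cumsum(),
    "economic_globalization": rng.normal(0, 1, n_years).cumsum(),
    "energy_consumption": rng.normal(0, 1, n_years).cumsum(),
    "gdp": rng.normal(0, 1, n_years).cumsum(),
})

model = ARDL(
    data["ecological_footprint"],
    lags=1,
    exog=data[["economic_globalization", "energy_consumption", "gdp"]],
    order=1,
)
result = model.fit()
print(result.summary())  # short- and long-run dynamics follow from the estimated lag structure
```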

Keywords: economic globalization, ecological footprint, India, dynamic ARDL simulation model

Procedia PDF Downloads 108
23959 Biomechanical Performance of the Synovial Capsule of the Glenohumeral Joint with a BANKART Lesion through Finite Element Analysis

Authors: Duvert A. Puentes T., Javier A. Maldonado E., Ivan Quintero., Diego F. Villegas

Abstract:

Mechanical computation is a great tool to study the performance of complex models; an example is the study of the structure of the human body. This paper takes advantage of different types of software to build a 3D model of the glenohumeral joint and apply a finite element analysis. The main objective was to study the change in the biomechanical properties of the joint when it presents an injury, specifically a BANKART lesion, which consists of the detachment of the anteroinferior labrum from the glenoid. The stress and strain distributions of the soft tissues were the focus of this study. First, a 3D model was made of a joint without any pathology, as a control sample, using segmentation software for the bones with the support of medical imagery and a cadaveric model to represent the soft tissue. The joint was prepared with CAD in the adequate position to simulate a compression and external rotation test. When the healthy model was finished, it was submitted to a finite element analysis, and the results were validated with experimental model data. With the validated model, a mesh sensitivity analysis was carried out to obtain the best mesh size. Finally, the geometry of the 3D model was changed to imitate a BANKART lesion: the contact zone of the glenoid with the labrum was slightly separated, simulating a tissue detachment. With this new geometry, the finite element analysis was applied again, and the results were compared with the control sample created initially. The data gathered in this study can be used to improve the understanding of labrum tears. Nevertheless, it is important to remember that computational analyses are approximations and that the initial data were taken from an in vitro assay.

Keywords: biomechanics, computational model, finite elements, glenohumeral joint, bankart lesion, labrum

Procedia PDF Downloads 146
23958 Applied Theory Building to Achieve Success in Iran Municipalities

Authors: Morteza Rahiminejad

Abstract:

There are over 1200 cities and municipalities across Iran, including 30 mega-cities, which are supervised by municipal organizations, the Interior Ministry, and city councils. Even so, there has been no research on either the process of success or performance assessment in municipalities. In this research, an attempt is made to build a comprehensive theory (or model) to explain the reasons for success and the success process among local governments. The present research is based on the contingency approach, in which the relevant circumstances are important, and both the environment and the situation call for their own management methods. The research methodology is grounded theory, using Atlas.ti software as a tool.

Keywords: success, municipality, Iran, theory building

Procedia PDF Downloads 24
23957 Stochastic Matrices and Lp Norms for Ill-Conditioned Linear Systems

Authors: Riadh Zorgati, Thomas Triboulet

Abstract:

In quite diverse application areas such as astronomy, medical imaging, geophysics, or nondestructive evaluation, many problems related to calibration, fitting, or the estimation of a large number of input parameters of a model from a small amount of noisy output data can be cast as inverse problems. Due to noisy data corruption, insufficient data, and model errors, most inverse problems are ill-posed in the Hadamard sense, i.e., the existence, uniqueness, and stability of the solution are not guaranteed. A wide class of inverse problems in physics relates to the Fredholm equation of the first kind. The ill-posedness of such an inverse problem results, after discretization, in a very ill-conditioned linear system of equations; the condition number of the associated matrix can typically range from 10⁹ to 10¹⁸. This condition number acts as an amplifier of the uncertainties in the data during inversion and thus renders the inverse problem difficult to handle numerically. Similar problems appear in other areas such as numerical optimization, where the use of interior point algorithms for solving linear programs leads to ill-conditioned systems of linear equations. Devising efficient solution approaches for such systems of equations is therefore of great practical interest. Efficient iterative algorithms are proposed for solving a system of linear equations. The approach is based on preconditioning the initial matrix of the system with an approximation of a generalized inverse, leading to a stochastic preconditioned matrix. This approach, valid for non-negative matrices, is first extended to Hermitian, positive semi-definite matrices and then generalized to any complex rectangular matrix. The main results obtained are as follows: 1) We are able to build a generalized inverse of any complex rectangular matrix which satisfies the convergence condition required by iterative algorithms for solving a system of linear equations. This completes the (short) list of generalized inverses having this property, after the Kaczmarz and Cimmino matrices. Theoretical results on both the characterization of the type of generalized inverse obtained and the convergence are derived. 2) Thanks to its properties, this matrix can be efficiently used in different solving schemes such as Richardson-Tanabe or preconditioned conjugate gradients. 3) By using Lp norms, we propose generalized Kaczmarz-type matrices. We also show how Cimmino's matrix can be considered as a particular case consisting in choosing the Euclidean norm in an asymmetrical structure. 4) Regarding numerical results obtained on some well-known pathological test cases (Hilbert, Nakasaka, …), some of the proposed algorithms are empirically shown to be more efficient on ill-conditioned problems and more robust to error propagation than the known classical techniques we have tested (Gauss, Moore-Penrose inverse, minimum residue, conjugate gradients, Kaczmarz, Cimmino). We end with a very early prospective application of our approach based on stochastic matrices, aiming at computing some parameters (such as the extreme values, the mean, the variance, …) of the solution of a linear system prior to its resolution. Such an approach, if it were to be efficient, would be a source of information on the solution of a system of linear equations.
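Since the construction above builds on Kaczmarz- and Cimmino-type iterations, the following is a minimal sketch of the classical randomized Kaczmarz method for Ax = b on an ill-conditioned (Hilbert-type) test matrix. It is illustrative only and does not implement the stochastic-preconditioning or Lp-norm generalizations proposed in the abstract; the iteration count and test size are arbitrary.

```python
# Sketch: randomized Kaczmarz iteration on an ill-conditioned Hilbert-type system.
import numpy as np

def randomized_kaczmarz(A, b, iterations=5000, seed=0):
    """Project the iterate onto one row's hyperplane per step, sampling rows
    with probability proportional to their squared Euclidean norm."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = (A ** 2).sum(axis=1)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iterations):
        i = rng.choice(m, p=probs)
        x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
    return x

# Ill-conditioned test system: the n x n Hilbert matrix.
n = 8
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
b = A @ x_true

x = randomized_kaczmarz(A, b)
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```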

Keywords: conditioning, generalized inverse, linear system, norms, stochastic matrix

Procedia PDF Downloads 119