Search results for: randomized response model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21123

16203 Mathematical Modeling for Continuous Reactive Extrusion of Poly Lactic Acid Formation by Ring Opening Polymerization Considering Metal/Organic Catalyst and Alternative Energies

Authors: Satya P. Dubey, Hrushikesh A Abhyankar, Veronica Marchante, James L. Brighton, Björn Bergmann

Abstract:

Aims: To develop a mathematical model that simulates the ROP of PLA, taking into account the effect of alternative energy, to be implemented in a continuous reactive extrusion production process of PLA. Introduction: The production of large amounts of waste is one of the major challenges at the present time, and polymers represent 70% of global waste. PLA has emerged as a promising polymer, as it is a compostable, biodegradable thermoplastic made from renewable sources. However, the main limitation for the application of PLA is the traces of toxic metal catalyst in the final product. Thus, a safe and efficient production process needs to be developed to avoid the potential hazards and toxicity. It has been found that alternative energy sources (laser, ultrasound, microwaves) could be a promising option to facilitate the ROP of PLA via continuous reactive extrusion. This process may allow complete removal of the metal catalysts and facilitate the use of less active organic catalysts. Methodology: Initial investigations were performed using the data available in the literature for the reaction mechanism of the ROP of PLA based on the conventional metal catalyst stannous octoate. A mathematical model has been developed by considering significant parameters such as different initial concentration ratios of catalyst, co-catalyst, and impurity. The effects of temperature variation and alternative energies have been implemented in the model. Results: The mathematical model has been validated using data from the literature as well as actual experiments. Validation of the model including alternative energies is in progress, based on experimental data from partners of the InnoREX project consortium. Conclusion: The model developed accurately reproduces the polymerisation reaction when applying alternative energy. Alternative energies have a strong positive effect on increasing the conversion and molecular weight of the PLA. This model could be a very useful tool to complement the Ludovic® software in predicting the large-scale production process when using reactive extrusion.
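The abstract describes the kinetic model only qualitatively. As a rough illustration of how such an ROP kinetic scheme can be integrated numerically, the sketch below assumes a strongly simplified two-step scheme (catalyst activation by co-catalyst, then chain propagation consuming monomer) with made-up rate constants; it is not the InnoREX/Ludovic® model itself.

```python
# Minimal sketch of a simplified ROP kinetic model (hypothetical scheme and
# rate constants, not the model described in the abstract).
from scipy.integrate import solve_ivp

k_a, k_p = 1e-1, 1.0               # assumed rate constants, L/(mol*s)

def rates(t, y):
    cat, cocat, active, monomer = y
    r_act = k_a * cat * cocat       # catalyst activation by co-catalyst
    r_prop = k_p * active * monomer # chain propagation consumes monomer
    return [-r_act, -r_act, r_act, -r_prop]

y0 = [1e-3, 2e-3, 0.0, 7.0]         # mol/L: catalyst, co-catalyst, active chains, lactide
sol = solve_ivp(rates, (0.0, 600.0), y0)
conversion = 1.0 - sol.y[3, -1] / y0[3]
print(f"monomer conversion after 600 s: {conversion:.2%}")
```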

Keywords: polymer, poly-lactic acid (PLA), ring opening polymerization (ROP), metal-catalyst, bio-degradable, renewable source, alternative energy (AE)

Procedia PDF Downloads 351
16202 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis

Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate

Abstract:

This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product gives insight into its underlying issues. It is often used by reliability engineers to build prediction models to forecast the failure rate of parts. However, there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product and, most of the time, cover only the infant-mortality and useful-life zones of the bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples until failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period, in less time and at lower cost. In this work, the authors propose an Additive Weibull Model, which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets of a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation has been used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate Weibull models' parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
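As a hedged sketch of the additive idea, two Weibull hazards (one representing early/warranty behavior, one representing wear-out from fatigue life data) can be summed, so that the combined reliability is the product of the two Weibull survivals. The shape/scale values below are hypothetical placeholders, not the parameters estimated in the paper.

```python
# Sketch of an additive Weibull hazard: h(t) = h_warranty(t) + h_fatigue(t),
# hence R(t) = exp(-[H1(t) + H2(t)]). Parameter values are illustrative only.
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def additive_reliability(t, params_warranty, params_fea):
    """Reliability with the cumulative hazards of both Weibulls added."""
    b1, e1 = params_warranty
    b2, e2 = params_fea
    H = (t / e1) ** b1 + (t / e2) ** b2
    return np.exp(-H)

t = np.linspace(1, 2000, 200)                     # operating hours
h_total = weibull_hazard(t, 0.8, 900) + weibull_hazard(t, 3.5, 1500)
R_total = additive_reliability(t, (0.8, 900), (3.5, 1500))
print(f"Reliability at 1000 h: {R_total[t.searchsorted(1000)]:.3f}")
```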

Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull

Procedia PDF Downloads 58
16201 Heat and Mass Transfer Modelling of Industrial Sludge Drying at Different Pressures and Temperatures

Authors: L. Al Ahmad, C. Latrille, D. Hainos, D. Blanc, M. Clausse

Abstract:

A two-dimensional finite volume axisymmetric model is developed to predict the simultaneous heat and mass transfers during the drying of industrial sludge. The simulations were run using COMSOL Multiphysics 3.5a. The input parameters of the numerical model were acquired from preliminary experimental work. The results permit the establishment of correlations describing the evolution of the various parameters as a function of the drying temperature and the sludge water content. The selection and coupling of the equations are validated based on the drying kinetics acquired experimentally at a temperature range of 45-65 °C and an absolute pressure range of 200-1000 mbar. The model, incorporating the heat and mass transfer mechanisms at different operating conditions, yields simulated values of temperature and water content. The simulated results are found to be concordant with the experimental values only at the first and last drying stages, where sludge shrinkage is insignificant. Simulated and experimental results show that sludge drying is favored at high temperatures and low pressure. As experimentally observed, the drying time is reduced by 68% for drying at 65 °C compared to 45 °C under 1 atm. At 65 °C, a 200-mbar absolute pressure vacuum leads to an additional reduction in drying time estimated at 61%. However, the drying rate is underestimated in the intermediate stage. This rate underestimation could be improved in the model by considering the shrinkage phenomenon that occurs during sludge drying.
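To illustrate the kind of discretization such a finite-volume model rests on, the sketch below solves only a much-reduced problem: transient radial heat conduction in a sludge cylinder, with assumed geometry, properties, and boundary condition, and with the coupled moisture-transport equation of the paper omitted entirely.

```python
# Minimal 1D axisymmetric finite-volume sketch of transient heat conduction
# (illustrative stand-in only; not the coupled 2D heat-and-mass model of the study).
import numpy as np

R, n = 0.05, 50                    # cylinder radius [m], number of control volumes
alpha = 1.5e-7                     # assumed thermal diffusivity [m^2/s]
dr = R / n
r_faces = np.linspace(0.0, R, n + 1)
r_cent = 0.5 * (r_faces[:-1] + r_faces[1:])

T = np.full(n, 20.0)               # initial sludge temperature [degC]
T_dry = 65.0                       # drying-air temperature [degC]
dt = 0.4 * dr**2 / alpha           # explicit stability limit
steps = 3000

for _ in range(steps):
    flux = np.zeros(n + 1)                                     # alpha * r * dT/dr at faces
    flux[1:-1] = alpha * r_faces[1:-1] * np.diff(T) / dr       # interior faces
    flux[-1] = alpha * r_faces[-1] * (T_dry - T[-1]) / (dr / 2)  # surface held near T_dry
    T += dt * np.diff(flux) / (r_cent * dr)                    # dT/dt = (1/r) d/dr(alpha r dT/dr)

print(f"centre temperature after {steps * dt / 60:.0f} min: {T[0]:.1f} degC")
```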

Keywords: industrial sludge drying, heat transfer, mass transfer, mathematical modelling

Procedia PDF Downloads 114
16200 Improvement in Drying Characteristics of Raisin by Carbonic Maceration – Process Optimization

Authors: Nursac Akyol, Merve S. Turan, Mustafa Ozcelik, Erdogan Kucukoner, Erkan Karacabey

Abstract:

Traditional raisin production is a long drying process under sunlight. During this procedure, grapes are exposed to environmental effects in addition to the adverse effects of the long drying period. Thus, there is a need to develop an alternative method applicable in place of the traditional one. To this end, the combination of a potential pretreatment (carbonic maceration, CM) with conventional oven drying was examined. CM was used in raisin production (grape drying) as a pretreatment process before oven drying. Pressure, temperature, and time were examined as application parameters of CM. In conventional oven drying, the temperature is a process variable. The aim was to find out how the CM and conventional drying processes affect the drying characteristics of grapes as well as their physical and chemical properties. For this purpose, the response surface method was used to determine both the effects of the variables and the optimum pretreatment and drying conditions. The optimum conditions of CM for raisin production were a pressure of 0.3 MPa, an application temperature of 4 °C, and an application time of 8 hours. The optimized drying temperature was 77 °C. The results showed that the application of CM before the drying process improved the drying characteristics. Drying took only 389 minutes for grapes pretreated by CM under optimum conditions and 495 minutes for the control group dried only by the conventional drying process. According to these results, a decrease of 21% was achieved in the time required for raisin production. It was also observed that the samples dried under optimum conditions had physical properties similar to those of the control group. Raisins dried under optimum conditions were also in better condition in terms of some of the bioactive contents compared to the control group. In light of all results, CM shows important potential in the industrial drying of grape samples. The current study was financially supported by TUBITAK, Turkey (Project no: 116R038).

Keywords: drying time, pretreatment, response surface methodology, total phenolic

Procedia PDF Downloads 116
16199 Topical Negative Pressure for Autologous Fat Grafting in Breast Augmentation

Authors: Mohamed Eftal Bin Mohamed Ebrahim, Alexander Varey

Abstract:

Aim: Topical negative pressure has been shown to enhance angiogenesis during wound healing, for both open and closed wounds. Since angiogenesis is a key requirement for successful fat grafting, there may be a role for topical negative pressure as a means of enhancing the take rate during autologous fat grafting to breasts. Here we present a systematic review of the literature on this topic. Methods: Ovid and Embase were utilized, with searches spanning 1960–2019. The terms (“Liposculpting” OR “Fat grafting” OR “Lipofilling” OR “Lipograft” OR “Fat transfer”) AND (“Negative Pressure” OR “Brava” OR “Kiwi”) AND (“Breast”) were combined as keywords. Inclusion criteria were females undergoing autologous fat grafting to the breast with topical negative pressure applied prior to the procedure. Studies were excluded if there was no primary endpoint or if the article was not original. Results: Upon reviewing 219 articles, 2 met the inclusion criteria. Totals of 565 and 46 breasts, respectively, were treated in the two articles using the negative pressure device BRAVA®, with each cohort having different pre- and post-operative pressure settings. The Khouri et al. cohort had higher graft survival (79%) compared to the Del Vecchio et al. cohort (64%); however, the latter had fewer complications than Khouri's cohort, e.g., fat necrosis, pneumothorax, and infection. Conclusion: There is limited evidence regarding the use of topical negative pressure for fat grafting to the breasts. However, in the two studies published, the reported rates of success are high, suggesting there may be a benefit. Consequently, a randomized controlled trial in this area is required.

Keywords: fat grafting, lipograft, negative pressure, breast, breast augmentation, brava

Procedia PDF Downloads 179
16198 Quadriceps Muscle Activity in Response to Slow and Fast Perturbations following Fatiguing Exercise

Authors: Nosratollah Hedayatpour, Hamid Reza Taheri, Mehrdad Fathi

Abstract:

Introduction: The quadriceps femoris muscle is frequently involved in various movements (e.g., jumping, landing) during sport and/or daily activities. During ballistic movement, when individuals are faced with unexpected knee perturbation, fast-twitch muscle fibers contribute to force production to stabilize the knee joint. Fast-twitch muscle fibers are more susceptible to fatigue and therefore may reduce the ability of the quadriceps muscle to stabilize the knee joint during fast perturbation. Aim: The aim of this study was to investigate the effect of fatigue on the postural response of the knee extensor muscles to fast and slow perturbations. Methods: Fatigue was induced in the quadriceps muscle using a KinCom Isokinetic Dynamometer (Chattanooga, TN). Bipolar surface electromyography (EMG) signals were simultaneously recorded from the quadriceps components (vastus medialis, rectus femoris, and vastus lateralis) during pre- and post-fatigue postural perturbations performed at two different velocities of 120 ms and 250 ms. Results: One-way ANOVA showed that maximal voluntary knee extension force, time to task failure, and the associated EMG activities were significantly reduced after fatiguing knee exercise (P< 0.05). Two-way ANOVA also showed that the ARV of the EMG in the backward direction was significantly larger than in the forward direction (P< 0.05), and during fast perturbation it was significantly higher than during slow perturbation (P< 0.05). Moreover, the ARV of the EMG was significantly reduced during post-fatigue perturbation, with the largest reduction identified for fast perturbation compared with slow perturbation (P< 0.05). Conclusion: A larger reduction in the activity of the quadriceps muscle was observed during post-fatigue fast perturbation to stabilize the knee joint, most likely due to preferential recruitment of fast-twitch muscle fibers, which are more susceptible to fatigue. This may partly explain why knee injuries are common after fast ballistic movements.

Keywords: electromyography, fast-slow perturbations, fatigue, quadriceps femoris muscle

Procedia PDF Downloads 505
16197 Homeless Population Modeling and Trend Prediction Through Identifying Key Factors and Machine Learning

Authors: Shayla He

Abstract:

Background and Purpose: According to Chamie (2017), it is estimated that no less than 150 million people, or about 2 percent of the world's population, are homeless. The homeless population in the United States has grown rapidly in the past four decades. In New York City, the sheltered homeless population has increased from 12,830 in 1983 to 62,679 in 2020. Knowing the trend in the homeless population is crucial to helping states and cities make affordable housing plans and other community service plans ahead of time, to better prepare for the situation. This study utilized data from New York City, examined the key factors associated with homelessness, and developed systematic modeling to predict homeless populations in the future. Using the best model developed, named HP-RNN, an analysis of the homeless population change during the months of 2020 and 2021, which were impacted by the COVID-19 pandemic, was conducted. Moreover, HP-RNN was tested on data from Seattle. Methods: The methodology involves four phases in developing robust prediction methods. Phase 1 gathered and analyzed raw data on homeless populations and demographic conditions from five urban centers. Phase 2 identified the key factors that contribute to the rate of homelessness. In Phase 3, three models were built using Linear Regression, Random Forest, and a Recurrent Neural Network (RNN), respectively, to predict the future trend of society's homeless population. Each model was trained and tuned on the dataset from New York City, with its accuracy measured by Mean Squared Error (MSE). In Phase 4, the final phase, the best model from Phase 3 was evaluated using data from Seattle that was not part of the model training and tuning process in Phase 3. Results: Compared to the Linear Regression based model used by HUD et al. (2019), HP-RNN significantly improved the prediction metrics, raising the Coefficient of Determination (R2) from -11.73 to 0.88 and reducing the MSE by 99%. HP-RNN was then validated on the data from Seattle, WA, which showed a peak %error of 14.5% between the actual and the predicted counts. Finally, the modeling results were used to predict the trend during the COVID-19 pandemic. They show a good correlation between the actual and the predicted homeless population, with a peak %error of less than 8.6%. Conclusions and Implications: This is the first work to apply an RNN to model the time series of homeless-related data. The model shows a close correlation between the actual and the predicted homeless population. There are two major implications of this result. First, the model can be used to predict the homeless population for the next several years, and the prediction can help states and cities plan ahead on affordable housing allocation and other community services to better prepare for the future. Moreover, this prediction can serve as a reference for policy makers and legislators as they seek to make changes that may impact the factors closely associated with the future homeless population trend.
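A minimal sketch of an RNN forecaster for a univariate monthly homeless-count series is shown below, assuming Keras is available; the window length, layer sizes, optimizer, and the synthetic series are illustrative assumptions, not the actual HP-RNN configuration or the New York City data.

```python
# Minimal sketch of an RNN one-step-ahead forecaster for a monthly count series.
# Architecture and data are placeholders, not the paper's HP-RNN.
import numpy as np
import tensorflow as tf

def make_windows(series, window=12):
    """Turn a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    return X[..., None], series[window:]

series = np.sin(np.linspace(0, 20, 240)) * 5000 + 50000   # stand-in for sheltered counts
X, y = make_windows(series)

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(32, input_shape=(X.shape[1], 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)

next_month = model.predict(series[-12:][None, :, None], verbose=0)
print(f"one-step-ahead forecast: {float(next_month[0, 0]):.0f}")
```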

Keywords: homeless, prediction, model, RNN

Procedia PDF Downloads 108
16196 Development and Performance Evaluation of a Gladiolus Planter in Field for Planting Corms

Authors: T. P. Singh, Vijay Gautam

Abstract:

Gladiolus is an important cash crop and is grown mainly for its elegant spikes. Traditionally, the gladiolus corms are planted manually, which is a very tedious, time-consuming and labor-intensive operation. So far, there is no planter available for planting gladiolus corms. With a view to mechanizing the planting operation of this horticultural crop, a prototype of a 4-row gladiolus planter was developed and its performance was evaluated under in-situ conditions. A cup-chain type metering device was used to singulate the gladiolus corms while planting. Three levels of corm spacing, viz. 15, 20 and 25 cm, and four levels of forward speed, viz. 1.0, 1.5, 2.0 and 2.5 km/h, were taken as evaluation parameters for the planter. The performance indicators, namely corm spacing in each row, coefficient of uniformity, missing index, multiple index, quality of feed index, number of corms per meter length, mechanical damage to the corms, etc., were determined during the field test. The data were statistically analyzed using a Completely Randomized Design (CRD) to test the significance of the parameters. The results indicated that the planter was able to drop the corms at the required nominal spacing with minor variations. The highest deviation from the mean corm spacing was observed as 3.53 cm, with a maximum coefficient of variation of 13.88%. The highest missing and quality of feed indexes were observed as 6.33% and 97.45%, respectively, with no multiples. The performance of the planter was better at lower forward speed and wider corm spacing. The field capacity of the planter was found to be 0.103 ha/h with an observed field efficiency of 76.57%.
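Indices such as the missing index, multiple index, and quality of feed index are usually computed from measured spacings relative to the nominal spacing. The sketch below uses the commonly adopted 0.5×/1.5× nominal-spacing thresholds, which are an assumption here (the paper does not state its thresholds), and example spacing data that are made up.

```python
# Sketch of spacing-based planter performance indices (thresholds follow the
# usual 0.5x / 1.5x convention, assumed here; spacings are example values).
import numpy as np

def spacing_indices(spacings_cm, nominal_cm):
    s = np.asarray(spacings_cm, dtype=float)
    miss = np.mean(s > 1.5 * nominal_cm)          # missing index
    multiple = np.mean(s < 0.5 * nominal_cm)      # multiple index
    quality = 1.0 - miss - multiple               # quality of feed index
    cv = s.std(ddof=1) / s.mean() * 100           # coefficient of variation, %
    return {"miss_%": 100 * miss, "multiple_%": 100 * multiple,
            "quality_%": 100 * quality, "CV_%": cv}

measured = [19.2, 21.0, 20.4, 33.5, 18.7, 20.9, 19.8, 21.6]   # one example row, cm
print(spacing_indices(measured, nominal_cm=20))
```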

Keywords: coefficient of uniformity, corm spacing, gladiolus planter, mechanization

Procedia PDF Downloads 225
16195 Investigations of Bergy Bits and Ship Interactions in Extreme Waves Using Smoothed Particle Hydrodynamics

Authors: Mohammed Islam, Jungyong Wang, Dong Cheol Seo

Abstract:

The Smoothed Particle Hydrodynamics (SPH) method is a novel, meshless, Lagrangian numerical technique that has shown promise in accurately predicting the hydrodynamics of water and structure interactions in violent flow conditions. The main goal of this study is to build confidence in the versatility of the SPH-based tool, so that it can be used as a complement to physical model testing capabilities and support research needs for the performance evaluation of ships and offshore platforms exposed to extreme and harsh environments. In the current endeavor, an open-source SPH-based tool was used and validated for modeling and prediction of the hydrodynamic interactions of a 6-DOF ship and bergy bits. The study involved the modeling of a modern generic drillship and simplified bergy bits in floating and towing scenarios and in regular and irregular wave conditions. The predictions were validated using model-scale measurements on a moored ship towed at multiple oblique angles approaching a floating bergy bit in waves. Overall, this study provides a thorough comparison between the model-scale measurements and the prediction outcomes from the SPH tool in terms of performance and accuracy. The SPH-predicted ship motions and forces were primarily within ±5% of the measurements. The velocity and pressure distributions and the wave characteristics over the free surface depict realistic interactions of the wave, the ship, and the bergy bit. This work identifies and presents several challenges in preparing the input file, particularly in defining the mass properties of complex geometry, the computational requirements, and the post-processing of the outcomes.

Keywords: SPH, ship and bergy bit, hydrodynamic interactions, model validation, physical model testing

Procedia PDF Downloads 122
16194 Structure Function and Violation of Scale Invariance in NCSM: Theory and Numerical Analysis

Authors: M. R. Bekli, N. Mebarki, I. Chadou

Abstract:

In this study, we focus on the structure functions and the violation of scale invariance in the context of the non-commutative standard model (NCSM). We find that this violation appears in the first order of perturbation theory, and a non-commutative version of the DGLAP evolution equation is deduced. Numerical analysis and comparison with experimental data impose a new bound on the non-commutative parameter.

Keywords: NCSM, structure function, DGLAP equation, standard model

Procedia PDF Downloads 602
16193 Comparing Forecasting Performances of the Bass Diffusion Model and Time Series Methods for Sales of Electric Vehicles

Authors: Andreas Gohs, Reinhold Kosfeld

Abstract:

This study should be of interest to practitioners who want to predict precisely the sales numbers of vehicles equipped with an innovative propulsion technology, as well as to researchers interested in applied (regional) time series analysis. The study is based on the numbers of new registrations of pure electric and hybrid cars. Methods of time series analysis such as ARIMA are compared with the Bass diffusion model concerning their forecasting performance for new registrations in Germany at the national and federal state levels. In particular, it is investigated whether the additional information content of regional data increases the forecasting accuracy at the national level when predictions for the federal states are aggregated. Estimates of the Bass diffusion model parameters for Germany and its sixteen federal states are reported. While the focus of this research is on the German market, estimation results are also provided for selected European and other countries. Concerning Bass parameters and forecasting performance, we obtain very different results for Germany's federal states and the member states of the European Union. This corresponds to differences across the EU member states in the adoption process of this innovative technology. Concerning the German market, adoption is more advanced in southern Germany and lags behind in eastern Germany, except for Berlin.
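For reference, cumulative adoption in the Bass model follows F(t) = (1 − e^{−(p+q)t}) / (1 + (q/p) e^{−(p+q)t}), with cumulative sales m·F(t). The sketch below fits p (innovation), q (imitation), and m (market potential) to a cumulative registration series with nonlinear least squares; the registration numbers are synthetic placeholders, not the German data used in the study.

```python
# Sketch of fitting the Bass diffusion model to cumulative EV registrations.
# Data below are synthetic; p, q, m starting values and bounds are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, p, q, m):
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

t = np.arange(1, 11)                                    # years since introduction
cum_registrations = np.array([2, 6, 15, 35, 70, 130, 220, 330, 450, 560]) * 1e3

(p, q, m), _ = curve_fit(bass_cumulative, t, cum_registrations,
                         p0=[0.01, 0.4, 1e6], bounds=(1e-6, [1.0, 1.0, 1e7]))
print(f"innovation p={p:.4f}, imitation q={q:.3f}, market potential m={m:,.0f}")
print(f"forecast for year 12: {bass_cumulative(12, p, q, m):,.0f}")
```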

Keywords: bass diffusion model, electric vehicles, forecasting performance, market diffusion

Procedia PDF Downloads 149
16192 Management of Local Towns (Tambon) According to Philosophy of Sufficiency Economy

Authors: Wichian Sriprachan, Chutikarn Sriviboon

Abstract:

The objectives of this research were to study the management of local towns and to develop a better model of town management according to the Philosophy of Sufficiency Economy. This study utilized qualitative research, field research, and documentary research at the same time. A total of 10 local towns or Tambons of Supanburi province, Thailand, were selected for in-depth interviews. The findings revealed that the model of local town management according to the Philosophy of Sufficiency Economy was at a level of “good”, and that the management model has five basic guidelines: 1) the ability to manage budget information and keep it up-to-date, 2) the ability to make decisions according to democratic rules, 3) the ability to use a check-and-balance system, 4) the ability to control, follow up, and evaluate, and 5) the ability to allow the general public to participate. In addition, the findings also revealed that human resource management according to the Philosophy of Sufficiency Economy includes obeying laws, using proper knowledge, and having integrity in five areas: planning, recruiting, selecting, training, and maintaining human resources.

Keywords: management, local town (Tambon), principles of sufficiency economy, marketing management

Procedia PDF Downloads 331
16191 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards

Authors: Golnush Masghati-Amoli, Paul Chin

Abstract:

In recent years, with the rapid increase in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is due to the fact that Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model is developed at Dun and Bradstreet that is focused on blending Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards with sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results of the analysis show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulties of explaining the models for regulatory purposes.
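As a hedged illustration of the WoE idea mentioned above, the sketch below computes the classical per-bin WoE from good/bad counts and, alongside it, a WoE-like value derived from an ML model's predicted probabilities (bin log-odds minus overall log-odds). The data, the single feature, the stand-in "ML score", and the binning are all synthetic assumptions; this is not Dun and Bradstreet's implementation.

```python
# Sketch: classical Weight of Evidence per bin vs. a WoE-like value matched to an
# ML score distribution. All data are synthetic placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"utilization": rng.uniform(0, 1, 5000)})
# Synthetic target: default probability rises non-linearly with utilization.
df["bad"] = rng.binomial(1, 0.02 + 0.25 * df["utilization"] ** 2)
# Hypothetical ML score standing in for a gradient-boosting PD estimate.
df["ml_score"] = 0.02 + 0.25 * df["utilization"] ** 2 + rng.normal(0, 0.01, len(df))

df["bin"] = pd.qcut(df["utilization"], 5)
grouped = df.groupby("bin", observed=True)

# Classical WoE from observed goods/bads in each bin.
bads = grouped["bad"].sum()
goods = grouped["bad"].count() - bads
woe_classic = np.log((goods / goods.sum()) / (bads / bads.sum()))

# WoE-like value from the ML score: bin log-odds of "good" minus overall log-odds.
p = grouped["ml_score"].mean().clip(1e-4, 1 - 1e-4)
overall = df["bad"].mean()
woe_from_ml = np.log((1 - p) / p) - np.log((1 - overall) / overall)

print(pd.DataFrame({"WoE_classic": woe_classic, "WoE_from_ML": woe_from_ml}).round(3))
```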

Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering

Procedia PDF Downloads 120
16190 Business-to-Business Deals Based on a Co-Utile Collaboration Mechanism: Designing Trust Company of the Future

Authors: Riccardo Bonazzi, Michaël Poli, Abeba Nigussie Turi

Abstract:

This paper presents applied research on a new module for the financial administration and management industry, the Personalizable and Automated Checklists Integrator, Overseeing Legal Investigations (PACIOLI). It aims at designing the business model of the trust company of the future. By identifying the key stakeholders, we draw a general business process design for the industry. The business model focuses on disintermediating the traditional form of business through the new technological solutions of a software company based in Switzerland, and hence on creating a new interactive platform. The key stakeholders of this interactive platform are identified as IT experts, legal experts, and the New Edge Trust Company (NATC). The mechanism we design and propose is of great importance for improving the efficiency of the financial business administration and management industry, and it also helps to foster the provision of high value-added services in the sector.

Keywords: new edge trust company, business model design, automated checklists, financial technology

Procedia PDF Downloads 353
16189 Demonstration of Land Use Changes Simulation Using Urban Climate Model

Authors: Barbara Vojvodikova, Katerina Jupova, Iva Ticha

Abstract:

Cities in their historical evolution have always adapted their internal structure to the needs of society (for example, protective city walls lost their defensive function during the classicism era, became unnecessary, were demolished, and gave space to new features such as roads, museums, or parks). Today it is necessary to modify the internal structure of the city in order to minimize the impact of climate change on the environment of the population. This article discusses the results of the Urban Climate model owned by VITO, which was applied as part of a project under the European Union's Horizon grant agreement No 730004, Pan-European Urban Climate Services Climate-Fit city. The model was applied to changes in land use and land cover in cities related to urban heat islands (UHI). The task of the application was to evaluate possible land use change scenarios in connection with city requirements and ideas. Two pilot areas in the Czech Republic were selected: one is Ostrava and the other Hodonín. The paper provides a demonstration of the application of the model for various possible future development scenarios. It contains an assessment of the suitability or unsuitability of future development scenarios depending on the temperature increase. Cities that are preparing to reconstruct public space are interested in eliminating proposals that would lead to an increase in temperature stress as early as the assignment phase. If they have an evaluation showing that a type of design is unsuitable, they can restrict it in the proposal phases. Therefore, especially in the application of the model at the local level, with 1 m spatial resolution, it was necessary to show which types of proposals would create a significant heat island if implemented. Such proposals are considered unsuitable. The model shows that a building itself can create a shaded place and thus contribute to the reduction of the UHI. If the protection of existing greenery is approached sensitively, new construction may not pose a significant problem. More massive interventions leading to the reduction of existing greenery create new heat island spaces.

Keywords: climate model, heat islands, Hodonin, land use changes, Ostrava

Procedia PDF Downloads 123
16188 Development of a 3D Model of Real Estate Properties in Fort Bonifacio, Taguig City, Philippines Using Geographic Information Systems

Authors: Lyka Selene Magnayi, Marcos Vinas, Roseanne Ramos

Abstract:

As the real estate industry continues to grow in the Philippines, Geographic Information Systems (GIS) provide advantages in generating spatial databases for the efficient delivery of information and services. The real estate sector not only provides qualitative data about real estate properties but also utilizes various spatial aspects of these properties for different applications such as hazard mapping and assessment. In this study, a three-dimensional (3D) model and a spatial database of real estate properties in Fort Bonifacio, Taguig City are developed using GIS and SketchUp. Spatial datasets include political boundaries, buildings, the road network, a digital terrain model (DTM) derived from an Interferometric Synthetic Aperture Radar (IFSAR) image, Google Earth satellite imagery, and hazard maps. Multiple model layers were created based on property listings by a partner real estate company, including existing and future property buildings. Actual building dimensions, building facades, and building floorplans are incorporated in these 3D models for geovisualization. Hazard model layers are determined through spatial overlays, and different hazard scenarios are also presented in the models. Animated maps and walkthrough videos were created for company presentation and evaluation. Model evaluation is conducted through client surveys requiring scores in terms of the appropriateness, information content, and design of the 3D models. Survey results show very satisfactory ratings, with the highest average evaluation score equivalent to 9.21 out of 10. The output maps and videos obtained passing rates based on the criteria and standards set by the intended users of the partner real estate company. The methodologies presented in this study were found useful and have remarkable advantages in the real estate industry. This work may be extended to automated mapping and the creation of online spatial databases for better storage and access of real property listings, and to an interactive platform using web-based GIS.

Keywords: geovisualization, geographic information systems, GIS, real estate, spatial database, three-dimensional model

Procedia PDF Downloads 152
16187 Quasistationary States and Mean Field Model

Authors: Sergio Curilef, Boris Atenas

Abstract:

Systems with long-range interactions are very common in nature. They are observed from the atomic scale to the astronomical scale and exhibit anomalies such as inequivalence of ensembles, negative heat capacity, ergodicity breaking, nonequilibrium phase transitions, quasistationary states, and anomalous diffusion. These anomalies are exacerbated when special initial conditions are imposed; in particular, we use the so-called water-bag initial conditions, which stand for a uniform distribution. Several theoretical and practical implications are discussed here. A potential energy inspired by dipole-dipole interactions is proposed to build the dipole-type Hamiltonian mean-field model. As expected, the dynamics, obtained through molecular dynamics techniques, is novel and representative of the general behavior of systems with long-range interactions. Two plateaus emerge sequentially before the system arrives at equilibrium, corresponding to two different quasistationary states. The first plateau is a type of quasistationary state whose lifetime depends on a power law of N, and the second plateau seems to be a true quasistationary state as reported in the literature. The general behavior of the model according to its dynamics and thermodynamics is described. Using numerical simulation, we characterize the mean kinetic energy, the caloric curve, and the diffusion law through the mean square displacement. The present challenge is to characterize the distributions in phase space. Certainly, the equilibrium state is well characterized by the Gaussian distribution, but quasistationary states in general depart from any Gaussian function.

Keywords: dipole-type interactions, dynamics and thermodynamics, mean field model, quasistationary states

Procedia PDF Downloads 201
16186 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context

Authors: M. G. N. A. S. Fernando

Abstract:

This study uses a research framework to improve the quality of ICT education and the Teaching, Learning and Assessment/Evaluation (TLA/TLE) process. It utilizes existing resources while improving the methodologies, along with the pedagogical techniques and e-learning approaches used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I focused on investigating the factors that affect the quality of ICT education. Based on the key factors of Phase I, Phase II focused on the design of an Experimental Application Model with 6 activity levels. Each level in the Activity Model covers one or more levels of the Revised Bloom's Taxonomy. Towards further enhancement of the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level of the activity model as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of performance was established by testing 6 hypotheses and through other methodologies. The analysis shows that student performance with problem-solving activities increased by 19.5% due to the different treatment levels used. Compared to the existing process, it was also shown that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.

Keywords: activity models, Bloom’s taxonomy, ICT education, pedagogies

Procedia PDF Downloads 149
16185 Evaluation of Different Cowpea Genotypes Using Grain Yield and Canning Quality Traits

Authors: Magdeline Pakeng Mohlala, R. L. Molatudi, M. A. Mofokeng

Abstract:

Cowpea (Vigna unguiculata (L.) Walp) is an important annual leguminous crop in semi-arid and tropical regions. Most cowpea grain production in South Africa is used for domestic consumption and as seed for planting, and little or none is used in industrial processing; thus, there is a need to expand the utilization of cowpea through industrial processing. Agronomic traits contribute to the understanding of the association between yield and its component traits, facilitating effective selection for yield improvement. The aim of this study was to evaluate cowpea genotypes using grain yield and canning quality traits. The field experiment was conducted at two locations in Limpopo Province, namely the Syferkuil Agricultural Experimental farm and Ga-Molepo village, during the 2017/2018 growing season, and canning took place at ARC-Grain Crops, Potchefstroom. The experiment comprised 100 cowpea genotypes laid out in a Randomized Complete Block Design (RCBD). The grain yield, yield components, and canning quality traits were analysed using Genstat software. About 62 genotypes were suitable for canning; 38 were not, owing to their seed coat texture and a water uptake of less than 80%, which resulted in too-soft (mushy) seeds. Grain yield for RV115, 99k-494-6, ITOOK1263, RV111, RV353 and 53 other genotypes recorded a high positive association with number of branches, pods per plant, number of seeds per pod, unshelled weight and shelled weight, more strongly at Syferkuil than at Ga-Molepo; these genotypes are therefore recommended for canning quality.

Keywords: agronomic traits, canning quality, genotypes, yield

Procedia PDF Downloads 135
16184 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics

Authors: Tapas Acharya, Monalisa Mitra

Abstract:

Overlay modeling is the most widely used conventional analysis for spatial decision support systems. Overlay modeling requires a set of themes with different weightings computed in various manners, which gives a resultant input for further integrated analysis. In spite of being the most popular and widely used technique, it gives inconsistent and erroneous results for similar inputs when they are processed with different GIS overlay techniques. This study is an attempt to compare and analyse the differences in the outputs of different overlay methods on a GIS platform, using the same set of themes for the Precambrian metamorphics, to assess groundwater prospects in Precambrian metamorphic rocks. The objective of the study is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven input thematic layers, namely slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density, and lithology, were used in the spatial overlay models of the fuzzy overlay, weighted overlay, and weighted sum overlay methods to delineate suitable groundwater prospective zones. Spatial concurrence analysis with high-yielding wells of the study area and statistical comparative studies among the outputs of the various overlay models using RStudio reveal that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospecting zones in the Precambrian metamorphic rocks.
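As a hedged sketch of the weighted overlay principle, each thematic layer is reclassified to a common suitability scale and combined cell by cell with weights summing to one. The toy rasters, the 1-9 scale, the weights, and the threshold below are illustrative assumptions, not the study's calibrated values.

```python
# Sketch of a weighted overlay: weighted sum of reclassified themes, cell by cell.
# Rasters, weights, and threshold are illustrative assumptions.
import numpy as np

shape = (4, 4)                                     # tiny toy raster
rng = np.random.default_rng(1)
themes = {                                         # reclassified rasters, suitability 1-9
    "slope": rng.integers(1, 10, shape),
    "lineament_density": rng.integers(1, 10, shape),
    "soil_thickness": rng.integers(1, 10, shape),
    "stream_density": rng.integers(1, 10, shape),
}
weights = {"slope": 0.20, "lineament_density": 0.35,
           "soil_thickness": 0.25, "stream_density": 0.20}   # must sum to 1

suitability = sum(weights[name] * raster for name, raster in themes.items())
prospect_zone = suitability >= 6.0                  # assumed threshold for "good" zones
print(np.round(suitability, 2))
print("cells flagged as groundwater-prospective:", int(prospect_zone.sum()))
```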

Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay

Procedia PDF Downloads 113
16183 Corpus-Based Model of Key Concepts Selection for the Master English Language Course "Government Relations"

Authors: Elena Pozdnyakova

Abstract:

“Government Relations” is a field of knowledge presently taught at the majority of universities around the globe. English, as the default language, can become the language of teaching, since the issues discussed are both global and national in character. However, for this field of knowledge, key concepts and their word representations in English often do not coincide with those in other languages. International master's degree students abroad, as well as students taught the course in English at their national universities, face difficulties connected with correctly conceptualizing GR terminology in the British and American academic traditions. The study was carried out during the elaboration of the GR English language course (pilot research: 2013-2015) at Moscow State Institute of Foreign Relations (University), Russian Federation. Within this period, English language instructors designed and elaborated the three-semester GR course. Methodologically, the course design was based on an elaboration model, with a special focus on the conceptual elaboration sequence and the theoretical elaboration sequence. The course designers faced difficulties in concept selection and in the theoretical elaboration sequence. To improve the results and eliminate the problems with concept selection, a new, corpus-based approach was worked out. The computer-based tool WordSmith 6.0 was used with the aim of building a model of key concept selection. The corpus of GR English texts consisted of 1 million words (the study corpus). The approach was based on measuring effect size, i.e., the percent difference of the frequency of a word in the study corpus when compared to that in the reference corpus. The results obtained showed a significant improvement in the process of concept selection. The corpus-based model also facilitated the theoretical elaboration of teaching materials.
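The effect-size measure described is the percent difference between a word's normalised frequency in the study corpus and in the reference corpus. A minimal sketch of that calculation follows; the candidate words, counts, and corpus sizes are made-up examples, not the study's data.

```python
# Sketch of the percent-difference effect size between normalised frequencies
# in a study corpus and a reference corpus. Counts below are illustrative.
def percent_diff(freq_study, size_study, freq_ref, size_ref):
    norm_study = freq_study / size_study * 1_000_000   # per million words
    norm_ref = freq_ref / size_ref * 1_000_000
    return (norm_study - norm_ref) / norm_ref * 100

study_size, ref_size = 1_000_000, 100_000_000
candidates = {"lobbying": (420, 1_500), "stakeholder": (310, 9_000), "the": (52_000, 6_000_000)}

for word, (f_study, f_ref) in candidates.items():
    print(f"{word:12s} %DIFF = {percent_diff(f_study, study_size, f_ref, ref_size):8.1f}")
```

Words with a large positive percent difference would be retained as candidate key concepts for the course.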

Keywords: corpus-based study, English as the default language, key concepts, measuring effect size, model of key concept selection

Procedia PDF Downloads 291
16182 Soil Loss Assessment at Steep Slope: A Case Study at the Guthrie Corridor Expressway, Selangor, Malaysia

Authors: Rabiul Islam

Abstract:

The study was conducted to assess soil erosion at the plot scale. The Universal Soil Loss Equation (USLE) erosion model and Geographic Information System (GIS) techniques were used for 8 plots at the Guthrie Corridor Expressway, Selangor, Malaysia. The USLE model estimates average soil loss by integrating several factors, namely the rainfall erosivity factor (R), the soil erodibility factor (K), the slope length and steepness factor (LS), the vegetation cover factor (C), and the conservation practice factor (P). The results show that four plots, i.e. NLDNM, NDNM, PLDM, and NDM, have very low rates of soil loss, with average losses of 0.059, 0.106, 0.386 and 0.372 ton/ha/year, respectively. The NBNM, PLDNM and NLDM plots had relatively higher rates of soil loss, with averages of 0.678, 0.757 and 0.493 ton/ha/year. The NBM plot had the highest rate of soil loss, ranging from 0.842 ton/ha/year up to a maximum of 16.466 ton/ha/year. The NBM plot was located on bare land; hence the magnitude of its C factor (C = 0.15) was the highest.
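The USLE estimate is the product A = R · K · LS · C · P. A minimal sketch of that calculation follows; the factor values are placeholders chosen only to contrast a vegetated plot with a bare plot (high C), not the plot-specific values derived in the study.

```python
# Sketch of the USLE estimate A = R * K * LS * C * P (ton/ha/year).
# Factor values are illustrative placeholders, not the study's values.
def usle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A in ton/ha/year."""
    return R * K * LS * C * P

plots = {
    # name: (R, K, LS, C, P) -- hypothetical factor combinations
    "vegetated_plot": (900.0, 0.05, 1.2, 0.003, 1.0),
    "bare_plot":      (900.0, 0.05, 1.2, 0.150, 1.0),   # high C for bare land
}
for name, factors in plots.items():
    print(f"{name}: A = {usle_soil_loss(*factors):.3f} ton/ha/year")
```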

Keywords: USLE model, GIS, Guthrie Corridor Expressway (GCE), Malaysia

Procedia PDF Downloads 514
16181 Selecting Graduates for the Interns’ Award by Using Multisource Feedback Process: Does It Work?

Authors: Kathryn Strachan, Sameer Otoom, Amal AL-Gallaf, Ahmed Al Ansari

Abstract:

Introduction: Introducing a reliable method to select graduates for an award in higher education can be challenging but is not impossible. Multisource feedback (MSF) is a popular assessment tool that relies on evaluations by different groups of people, including physicians and non-physicians. It is useful for assessing several domains, including professionalism, communication, and collaboration, and may be useful for selecting the best interns to receive a university award. Methods: 16 graduates responded to an invitation to participate in the student award, which was conducted by the Royal College of Surgeons in Ireland – Medical University of Bahrain (RCSI Bahrain) using the MSF process. Five individuals from the following categories rated each participant: physicians, nurses, and fellow students. RCSI Bahrain graduates were assessed in the following domains: professionalism, communication, and collaboration. Means and standard deviations were calculated, and the award was given to the graduate who scored the highest among his/her colleagues. Cronbach's coefficient was used to determine the questionnaire's internal consistency and reliability. Factor analysis was conducted to examine construct validity. Results: 16 graduates participated in the RCSI-Bahrain interns' award based on the MSF process, giving a 16.5% response rate. The instrument was found to be suitable for factor analysis and showed a three-factor solution representing 79.3% of the total variance. Reliability analysis of internal consistency indicated that the full scale of the instrument had high internal consistency (Cronbach's α = 0.98). Conclusion: This study found the MSF process to be reliable and valid for selecting the best graduates for the interns' award. However, the low response rate may suggest that the process is not feasible for allowing the majority of students to participate in the selection process. Further research studies may be required to support the feasibility of the MSF process in selecting graduates for the university award.
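The internal-consistency statistic reported is Cronbach's alpha, α = k/(k−1) · (1 − Σ item variances / variance of total score). The sketch below computes it on a synthetic raters-by-items matrix standing in for the MSF questionnaire data; the matrix size and rating scale are assumptions.

```python
# Sketch of Cronbach's alpha for internal consistency on a synthetic rating matrix.
import numpy as np

def cronbach_alpha(scores):
    """scores: respondents x items matrix of ratings."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(4, 0.6, size=(80, 1))                 # each rater's overall impression
ratings = np.clip(np.rint(latent + rng.normal(0, 0.4, (80, 12))), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```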

Keywords: MSF, RCSI, validity, Bahrain

Procedia PDF Downloads 329
16178 Investigating the Behaviour of Composite Floors (Steel Beams and Concrete Slabs) under Man's Rhythmical Movement

Authors: M. Ali Lotfollahi Yaghin, M. Reza Bagerzadeh Karimi, Ali Rahmani, V. Sadeghi Balkanlou

Abstract:

Structural engineers have long been trying to develop solutions that use the full potential of their constituent materials. Therefore, there is no doubt that progress in structural solutions is directly related to an increase in materials science knowledge. These efforts, in conjunction with up-to-date modern construction techniques, have led to extensive use of composite floors in large-span structures. On the other hand, the competitive trends of the world market have long been forcing structural engineers to develop minimum-weight and minimum-labour-cost solutions. A direct consequence of this new design trend is a considerable increase in problems related to unwanted floor vibrations. For this reason, structural floor systems become vulnerable to excessive vibrations produced by impacts such as human rhythmic activities. The main objective of this paper is to present an analysis methodology for the evaluation of human comfort on composite floors. This procedure takes into account a more realistic loading model developed to incorporate the dynamic effects induced by human walking. The investigated structural models were based on various composite floors, with main spans varying from 5 to 10 m. Based on an extensive parametric study, the dynamic response of the composite floors, in terms of peak accelerations, was obtained and compared to the limiting values proposed by several authors and design standards. This strategy was adopted to provide a more realistic evaluation of this type of structure when subjected to vibration due to human walking.

Keywords: vibration, resonance, composite floors, people’s rhythmic movement, dynamic analysis, Abaqus software

Procedia PDF Downloads 289
16179 Evaluation of Botanical Plant Powders against Zabrotes subfasciatus (Boheman) (Coleoptera: Bruchidae) in Stored Local Common Bean Varieties

Authors: Fikadu Kifle Hailegeorgis

Abstract:

Common bean is one of the most important sources of protein in Ethiopia and other developing countries. However, the Mexican bean weevil, Zabrotes subfasciatus (Boheman), is a major cause of losses in stored common beans. Studies were conducted to evaluate the efficacy of botanical powders of Jatropha curcas (L.), neem (Azadirachta indica), and Parthenium hysterophorus (L.) on local common bean varieties against Z. subfasciatus at Melkassa Agriculture Research Center. Twenty local common bean varieties were evaluated twice against Z. subfasciatus in a completely randomized design with three replications, at the rate of 0.2 g/250 g of seed for each experiment. Malathion and an untreated control were used as standard checks. The results indicated that RAZ White and Round Yellow showed high resistance in the experiments, while Batu and Black were highly susceptible. Jatropha seed powder was the most effective against Z. subfasciatus. Parthenium seed powder and neem leaf powder also showed promising results. Common beans treated with botanicals had a significantly (p<0.05) higher germination percentage than the untreated seed. In general, the results obtained indicate that using resistant bean varieties (RAZ White and Round Yellow) and botanical (Jatropha) seed powder gives the best control of Z. subfasciatus.

Keywords: botanicals, malathion, resistant varieties, Z. subfasciatus

Procedia PDF Downloads 48
16178 Financial Fraud Prediction for Russian Non-Public Firms Using Relational Data

Authors: Natalia Feruleva

Abstract:

The goal of this paper is to develop a fraud risk assessment model based on both relational and financial data and to test the impact of the relationships between Russian non-public companies on the likelihood of financial fraud. Relationships here mean various linkages between companies, such as parent-subsidiary relationships and person-related relationships. These linkages may provide additional opportunities for committing fraud. Person-related relationships appear when firms share a director, or the director owns another firm. The number of companies belonging to the CEO, the number of companies managed by the CEO, and the number of subsidiaries were calculated to measure the relationships. Moreover, a dummy variable describing the existence of a parent company was also included in the model. Control variables such as financial leverage and return on assets were also included because they describe the motivating factors of fraud. To check the hypotheses about the influence of the chosen parameters on the likelihood of financial fraud, information about person-related relationships between companies, the existence of a parent company and subsidiaries, profitability, and the level of debt was collected. The resulting sample consists of 160 Russian non-public firms. The sample includes 80 fraudsters and 80 non-fraudsters operating in 2006-2017. The dependent variable is dichotomous; it takes the value 1 if the firm is engaged in financial crime, otherwise 0. Employing a probit model, it was revealed that the number of companies that belong to or are managed by the firm's CEO has a significant impact on the likelihood of financial fraud. The results obtained indicate that the more companies are affiliated with the CEO, the higher the likelihood that the company will be involved in financial crime. The forecast accuracy of the model is about 80%. Thus, the model based on both relational and financial data gives a high level of forecast accuracy.
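A minimal sketch of the probit specification described (relational counts plus financial controls predicting a binary fraud indicator) is shown below using statsmodels. The simulated firm records, variable names, and coefficients are placeholders, not the 160-firm sample or estimates reported in the paper.

```python
# Sketch of a probit model of fraud on relational and financial covariates.
# All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 160
df = pd.DataFrame({
    "ceo_firms": rng.poisson(2, n),          # companies owned/managed by the CEO
    "subsidiaries": rng.poisson(1, n),
    "has_parent": rng.integers(0, 2, n),
    "leverage": rng.uniform(0, 1.5, n),
    "roa": rng.normal(0.05, 0.1, n),
})
linpred = -1.2 + 0.45 * df["ceo_firms"] + 0.3 * df["leverage"] - 1.0 * df["roa"]
df["fraud"] = (linpred + rng.normal(0, 1, n) > 0).astype(int)

X = sm.add_constant(df[["ceo_firms", "subsidiaries", "has_parent", "leverage", "roa"]])
probit_fit = sm.Probit(df["fraud"], X).fit(disp=0)
print(probit_fit.summary())
accuracy = ((probit_fit.predict(X) > 0.5) == df["fraud"]).mean()
print(f"in-sample classification accuracy: {accuracy:.0%}")
```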

Keywords: financial fraud, fraud prediction, non-public companies, regression analysis, relational data

Procedia PDF Downloads 106
16177 Optimization of Ultrasound-Assisted Extraction of Oil from Spent Coffee Grounds Using a Central Composite Rotatable Design

Authors: Malek Miladi, Miguel Vegara, Maria Perez-Infantes, Khaled Mohamed Ramadan, Antonio Ruiz-Canales, Damaris Nunez-Gomez

Abstract:

Coffee is the second most consumed commodity worldwide, yet it also generates colossal waste. Proper management of coffee waste is proposed by converting it into products with higher added value, to achieve sustainability of the economic and ecological footprint and protect the environment. Based on this, studies looking at the recovery of coffee waste have become more relevant in recent decades. Spent coffee grounds (SCGs), resulting from brewing coffee, represent the major waste produced by the coffee industry. The facts that SCGs have no economic value, are abundant in nature and industry, do not compete with agriculture and, especially, have a high oil content (between 7-15% of total dry matter weight, depending on the coffee variety, Arabica or Robusta) encourage their use as a sustainable feedstock for bio-oil production. Bio-oil extraction is a crucial step towards biodiesel production by the transesterification process. However, conventional methods used for oil extraction are not recommended due to their high consumption of energy and time and their generation of toxic volatile organic solvents. Thus, finding a sustainable, economical, and efficient extraction technique is crucial to scale up the process and to ensure more environmentally friendly production. Under this perspective, the aim of this work was a statistical study to identify an efficient strategy for oil extraction with n-hexane using indirect sonication. Coffee waste consisting of mixed Arabica and Robusta was used in this work. The effects of temperature, sonication time, and solvent-to-solid ratio on the oil yield were statistically investigated as independent variables using a 2³ Central Composite Rotatable Design (CCRD). The results were analyzed using STATISTICA 7 StatSoft software. The CCRD showed the significance of all the variables tested (P < 0.05) on the process output. Validation of the model by analysis of variance (ANOVA) showed a good fit for the results obtained at a 95% confidence interval, and the plot of predicted versus experimental values confirmed the satisfactory correlation of the model results. Besides, the identification of the optimum experimental conditions was based on the study of the response surface graphs (2-D and 3-D) and the critical statistical values. Based on the CCRD results, 29 °C, 56.6 min, and a solvent-to-solid ratio of 16 were the best experimental conditions defined statistically for coffee waste oil extraction using n-hexane as solvent. Under these conditions, the oil yield was >9% in all cases. The results confirmed the efficiency of using an ultrasound bath for extracting oil as a more economical, greener, and more efficient approach compared to the Soxhlet method.
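A three-factor CCRD consists of the 2³ factorial points, six axial points at α = (2³)^(1/4) ≈ 1.682 (the rotatability condition), and centre replicates. The sketch below builds such a design matrix in coded units and decodes it to real units; the centre values and step sizes used for decoding are assumed placeholders, not the exact factor levels of the study.

```python
# Sketch of a three-factor central composite rotatable design (CCRD, 2^3 core):
# 8 factorial points, 6 axial points at alpha = 1.682, plus 6 centre replicates.
# Real-unit centre values and steps are assumptions for illustration.
import itertools
import numpy as np

alpha = 2 ** (3 / 4)                              # rotatability: alpha = (2^k)^(1/4), k = 3
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([v * alpha * np.eye(3)[i] for i in range(3) for v in (-1, 1)])
center = np.zeros((6, 3))
design_coded = np.vstack([factorial, axial, center])

# Assumed centre/step in real units: temperature [degC], time [min], solvent:solid ratio.
center_vals = np.array([30.0, 45.0, 12.0])
steps = np.array([5.0, 15.0, 4.0])
design_real = center_vals + design_coded * steps

print(f"{len(design_coded)} runs")                # 8 + 6 + 6 = 20 runs
print(np.round(design_real, 1))
```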

Keywords: coffee waste, optimization, oil yield, statistical planning

Procedia PDF Downloads 104
16176 Simulation to Detect Virtual Fractional Flow Reserve in Coronary Artery Idealized Models

Authors: Nabila Jaman, K. E. Hoque, S. Sawall, M. Ferdows

Abstract:

Coronary artery disease (CAD) is one of the most lethal cardiovascular diseases. Coronary artery stenosis and bifurcation angles closely interact in myocardial infarction. We use computer-aided design models coupled with computational hemodynamics (CHD) simulation to detect several types of coronary artery stenosis at different locations in an idealized model and to identify the virtual fractional flow reserve (vFFR). The vFFR provides information about the severity of stenosis in the computational models. Another goal is to imitate a patient-specific computed tomography coronary artery angiography model in constructing our idealized models with different left anterior descending (LAD) and left circumflex (LCx) bifurcation angles. Further, we analyze whether the bifurcation angles have an impact on the creation of narrowing in the coronary arteries. The numerical simulation provides CHD parameters such as wall shear stress (WSS), velocity magnitude, and pressure gradient (PGD) that give information on the stenosis condition in the computational domain.

Keywords: CAD, CHD, vFFR, bifurcation angles, coronary stenosis

Procedia PDF Downloads 146
16175 Glutathione S-Transferase (Gstt1) Gene Polymorphism and Lipid Profile in Type 2 Diabetes Mellitus Patients Attending Murtala Muhammad Specialist Hospital Kano, Nigeria

Authors: Rasheed F. G., Hassan H. A., Shehu F. A., Mukhtar M. M., Muhammad Y. Y., Ibrahim S. S., Shehu D., Abdulsalam K., N. Abdullahi

Abstract:

A randomized, descriptive cross-sectional study was conducted on the frequency of GSTT1 null alleles in patients diagnosed with type 2 diabetes mellitus (T2DM). A total of 40 patients with T2DM and 10 non-diabetic controls were included in the study. GSTT1 null-allele genotyping was carried out using multiplex PCR amplification to amplify the GSTT1 gene (460 bp) while using β-globulin (250 bp) as an internal control. The results showed that 55% of T2DM patients had a BMI within reference limits and 13% were overweight. Additionally, patients with T2DM were found to have significantly higher (p<0.05) serum levels of glucose, total cholesterol, triglyceride, and low-density lipoprotein. Furthermore, the null genotype of GSTT1 (deletion in GSTT1) was observed in 28% of diabetic patients. Subjects with the GSTT1 deletion had significantly higher (p<0.05) levels of serum glucose, low-density lipoprotein, and total cholesterol when compared with individuals without the deletion (diabetic and non-diabetic). These results suggest that deletion of the GSTT1 gene might serve as a predisposing factor in the development of T2DM and dyslipidaemia.

Keywords: diabetes, glutathione-S-transferase, lipid profile, PCR, polymorphism.

Procedia PDF Downloads 78
16174 Interoperability of 505th Search and Rescue Group and the 205th Tactical Helicopter Wing of the Philippine Air Force in Search and Rescue Operations: An Assessment

Authors: Ryan C. Igama

Abstract:

The complexity of disaster risk reduction management has paved the way for various innovations and approaches to mitigate the loss of lives and casualties during disaster-related situations. The efficiency of response operations during disasters relies on the timely and organized deployment of search, rescue, and retrieval teams. Indeed, the assistance provided by search, rescue, and retrieval teams during disaster operations is a critical service needed to further minimize the loss of lives and casualties. The Armed Forces of the Philippines is mandated to provide humanitarian assistance and disaster relief operations during calamities and disasters. Thus, this study, “Interoperability of the 505th Search and Rescue Group and the 205th Tactical Helicopter Wing of the Philippine Air Force in Search and Rescue Operations: An Assessment”, was intended to provide substantial information to further strengthen and promote the capabilities of search and rescue operations in the Philippines. Further, this study also aims to assess the interoperability of the 505th Search and Rescue Group and the 205th Tactical Helicopter Wing of the Philippine Air Force. The study covered the component units of the Philippine Air Force within the Armed Forces of the Philippines, specifically the 505th SRG and the 205th THW as the involved units, which also acted as the respondents of the study. A qualitative approach was utilized, in the form of focused group discussions, key informant interviews, and documentary analysis, as the primary means of obtaining the needed data for the study. Essentially, this study was geared towards the evaluation of the effectiveness of the interoperability of the two (2) involved PAF units during search and rescue operations. Further, it also delved into the identification of the impacts, gaps, and challenges confronted regarding interoperability in terms of training, equipment, and coordination mechanisms, vis-à-vis the needed measures for improvement. The results of the study regarding the interoperability of the two (2) PAF units during search and rescue operations showed that there was a duplication of functions or tasks in HADR activities, specifically during the conduct of air rescue operations in situations such as calamities. In addition, it was revealed that there was a lack of equipment and training for the personnel involved in search and rescue operations, which is a vital element during calamity response activities. Based on the findings of the study, it was recommended that a strategic planning workshop/activity be conducted regarding the duties and responsibilities of the personnel involved in search and rescue operations, to address the command and control and interoperability issues of these units. Additionally, intensive HADR-related training for the personnel involved in the search and rescue operations of the two (2) PAF units must also be conducted so they can become more proficient in their skills and sustainably increase their knowledge of search and rescue scenarios, including the capabilities of their respective units. Lastly, the updating of existing doctrines or policies must be undertaken to adapt to the evolving situations in search and rescue operations.

Keywords: interoperability, search and rescue capability, humanitarian assistance, disaster response

Procedia PDF Downloads 78