Search results for: mixed effects models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18258

15108 In vitro And in vivo Anticholinesterase Activity of the Volatile Oil of the Aerial Parts of Ocimum Basilicum L. and O. africanum Lour. Growing in Egypt

Authors: Mariane G. Tadros, Shahira M. Ezzat, Maha M. Salama, Mohamed A. Farag

Abstract:

In this study, the in vitro anticholinesterase activity of the volatile oils of both O. basilicum and O. africanum was investigated, and both samples showed significant activity. As a result, the major constituents of the two oils were isolated by successive column chromatography. Linalool, 1,8-cineole and eugenol were isolated from the volatile oil of O. basilicum, and camphor was isolated from the volatile oil of O. africanum. The anticholinesterase activity of the isolated compounds was also evaluated, with 1,8-cineole showing the highest inhibitory activity followed by camphor. To confirm these activities, learning and memory enhancing effects were tested in mice. Memory impairment was induced by scopolamine, a cholinergic muscarinic receptor antagonist. The anti-amnesic effects of both volatile oils and their terpenoids were investigated by the passive avoidance task in mice. We also examined their effects on brain acetylcholinesterase activity. Results showed that scopolamine-induced cognitive dysfunction was significantly attenuated in the passive avoidance task by administration of the volatile oils and their terpenoids, eugenol and camphor, which also inhibited brain acetylcholinesterase activity. These results suggest that O. basilicum and O. africanum volatile oils are good candidates for further studies on Alzheimer’s disease via their acetylcholinesterase inhibitory actions.

Keywords: Ocimum basilicum, Ocimum africanum, GC/MS analysis, anticholinesterase

Procedia PDF Downloads 439
15107 The Univalence Principle: Equivalent Mathematical Structures Are Indistinguishable

Authors: Michael Shulman, Paige North, Benedikt Ahrens, Dimitris Tsementzis

Abstract:

The Univalence Principle is the statement that equivalent mathematical structures are indistinguishable. We prove a general version of this principle that applies to all set-based, categorical, and higher-categorical structures defined in a non-algebraic and space-based style, as well as models of higher-order theories such as topological spaces. In particular, we formulate a general definition of indiscernibility for objects of any such structure, and a corresponding univalence condition that generalizes Rezk’s completeness condition for Segal spaces and ensures that all equivalences of structures are levelwise equivalences. Our work builds on Makkai’s First-Order Logic with Dependent Sorts, but is expressed in Voevodsky’s Univalent Foundations (UF), extending previous work on the Structure Identity Principle and univalent categories in UF. This enables indistinguishability to be expressed simply as identification, and yields a formal theory that is interpretable in classical homotopy theory, but also in other higher topos models. It follows that Univalent Foundations is a fully equivalence-invariant foundation for higher-categorical mathematics, as intended by Voevodsky.
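As a hedged illustration of the underlying idea (the basic type-level case only, not the paper's generalized structure-level statement), Voevodsky's univalence axiom can be written as:

```latex
% The canonical map turning an identification of types into an equivalence,
%   \mathsf{idtoeqv} : (A =_{\mathcal{U}} B) \to (A \simeq B),
% is itself an equivalence, so identity of types coincides with equivalence:
(A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B)
```

The paper's contribution generalizes this from bare types to set-based, categorical, and higher-categorical structures.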

Keywords: category theory, higher structures, inverse category, univalence

Procedia PDF Downloads 136
15106 Effects of Pore-Water Pressure on the Motion of Debris Flow

Authors: Meng-Yu Lin, Wan-Ju Lee

Abstract:

Pore-water pressure, which mediates effective stress and shear strength at grain contacts, has a great influence on the motion of debris flow. The factors that control the diffusion of excess pore-water pressure therefore play very important roles in debris-flow motion. This research investigates these effects by solving numerically for the distribution of pore-water pressure in an unsteady, surging motion of debris flow. The governing equations are the depth-averaged equations for the motion of debris-flow surges coupled with the one-dimensional diffusion equation for excess pore-water pressure. The pore-pressure diffusion equation is solved using a Fourier series, which may improve the accuracy of the solution. The motion of the debris-flow surge is modelled using a Lagrangian particle method. From the computational results, the effects of pore-pressure diffusivity and of the initial excess pore pressure on the formation of debris-flow surges are investigated. Computational results show that the presence of pore water can increase surge velocities and thereby change the profiles of depth distribution. Due to the linear distribution of the vertical component of pore-water velocity, pore pressure dissipates rapidly near the bottom and forms a parabolic distribution in the vertical direction. Increases in the diffusivity of pore-water pressure cause the pore pressure to decay more rapidly and thus decrease the mobility of the surge.
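The Fourier-series treatment of the excess-pore-pressure diffusion equation can be sketched as follows; the boundary and initial conditions used here (drained top and bottom, uniform initial excess pressure) are illustrative assumptions, not the paper's exact setup:

```python
import math

def excess_pore_pressure(z, t, p0=1.0, h=1.0, D=0.01, n_terms=1000):
    """Fourier-series solution of dp/dt = D * d2p/dz2 on [0, h] with
    p(0, t) = p(h, t) = 0 and uniform initial excess pressure p0.
    Illustrative boundary conditions; the paper's model may differ."""
    s = 0.0
    for n in range(1, 2 * n_terms, 2):  # only odd modes survive for a uniform start
        k = n * math.pi / h
        s += (4.0 * p0 / (n * math.pi)) * math.sin(k * z) * math.exp(-D * k * k * t)
    return s
```

Each mode decays like exp(-D(nπ/h)²t), so a larger diffusivity D damps the excess pressure faster, consistent with the reduced surge mobility reported above.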

Keywords: debris flow, diffusion, Lagrangian particle method, pore-pressure diffusivity, pore-water pressure

Procedia PDF Downloads 123
15105 Conformance to Spatial Planning between the Kampala Physical Development Plan of 2012 and the Existing Land Use in 2021

Authors: Brendah Nagula, Omolo Fredrick Okalebo, Ronald Ssengendo, Ivan Bamweyana

Abstract:

The Kampala Physical Development Plan (KPDP) was developed in 2012 and projected both long-term and short-term developments within the City. The purpose of the plan was not only to shape the city into a spatially planned area but also to control the urban sprawl that had expanded with pronounced instances of informal settlements. The plan was approved by the National Physical Planning Board and signed by the Minister in 2013. Although the KPDP has been implemented using different approaches such as detailed planning, development control, subdivision planning, construction inspections, greening and beautification, there is still limited knowledge on the level of conformance to this plan. Therefore, it is yet to be determined whether it has been effective in shaping the City into an ideal spatially planned area. To attain a clear picture of the level of conformance to the KPDP 2012, an evaluation between the planned and the existing land use in Kampala City was performed. Methods such as supervised classification and post-classification change detection were adopted for this evaluation. Scrutiny of the findings revealed that Central Division registered the lowest level of conformance to the planning standards specified in the KPDP 2012, followed by Nakawa, Rubaga, Kawempe, and Makindye. Furthermore, mixed-use development was identified as the land use with the highest level of non-conformity, at 25.11%, and institutional land use registered the highest level of conformance, at 84.45%. The results show that the aspect of location was not carefully considered while allocating uses in the KPDP, whereby areas located near the Central Business District have higher land rents and hence require uses that ensure profit maximization. Also, the prominence of mixed-use development denotes an increased demand for land for compact development that was not catered for in the plan. Therefore, in order to transform Kampala City into a spatially planned area, there is a need to carefully develop detailed plans, especially for all the Central Division planning precincts, indicating considerations for land-use densification.
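Post-classification change detection, mentioned above, boils down to cross-tabulating two classified maps; a minimal sketch (the class codes are hypothetical):

```python
import numpy as np

def change_matrix(class_t1, class_t2, n_classes):
    """Post-classification change detection: cross-tabulate two classified
    maps to count pixels moving from class i (time 1) to class j (time 2)."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(m, (class_t1.ravel(), class_t2.ravel()), 1)  # one count per pixel pair
    return m
```

Diagonal entries count pixels whose class did not change; dividing the diagonal sum by the total pixel count gives a simple conformance percentage for an area.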

Keywords: spatial plan, post-classification change detection, Kampala city, land use

Procedia PDF Downloads 76
15104 Integration of Climatic Factors in the Meta-Population Modelling of the Dynamic of Malaria Transmission, Case of Douala and Yaoundé, Two Cities of Cameroon

Authors: Justin-Herve Noubissi, Jean Claude Kamgang, Eric Ramat, Januarius Asongu, Christophe Cambier

Abstract:

The goal of our study is to analyse the impact of climatic factors on malaria transmission, taking into account migration between Douala and Yaoundé, two cities of Cameroon. We show how variations in climatic factors such as temperature and relative humidity affect the spread of malaria. We propose a meta-population model of the dynamics of malaria transmission that evolves in space and time and that takes into account temperature, relative humidity, and migration between Douala and Yaoundé. We also integrate variations in environmental factors as events, also called mathematical impulses, that can disrupt the model evolution at any time. Our modelling uses the Discrete Event System Specification (DEVS) formalism, and our implementation runs on the Virtual Laboratory Environment (VLE), which implements DEVS and provides abstract simulators for coupling models.

Keywords: compartmental models, DEVS, discrete events, meta-population model, VLE

Procedia PDF Downloads 544
15103 Student Debt Loans and Labor Market Outcomes: A Lesson in Unintended Consequences

Authors: Sun-Ki Choi

Abstract:

The U.S. student loan policy was initiated to improve equality of educational opportunity and to help low-income families provide higher education opportunities for their children. However, with the increase in the average student loan amount, college graduates with student loans experience problems and restrictions in their early-career choices. This study examines the early-career labor market choices of college graduates who obtained student loans to finance their higher education. National Survey of College Graduates (NSCG) data for 2017 and 2019 were used to estimate the effects of student loans on the employment status and current job wages of graduates. In the analysis, two groups of workers, those with student loans and those without, were compared. Using basic models and Mahalanobis distance matching, it was found that graduates who rely on student loans to finance their education are more likely to participate in the labor market than those who do not. Moreover, in entry-level jobs, graduates with student loans receive lower salaries than those without. College graduates make job-related decisions based on their current and future wages and fringe benefits. Graduates with student loans tend to demonstrate risk-averse behaviors due to their financial restrictions. Thus, student loan debt creates inequity in the early-career labor market for college graduates. The study's findings carry implications for policymakers and researchers concerned with student loan policy.
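Mahalanobis distance matching, as used in the analysis, can be sketched in a few lines; this is a generic nearest-neighbour illustration, not the authors' NSCG code, and the covariate layout is hypothetical:

```python
import numpy as np

def mahalanobis_match(treated, control):
    """Match each treated unit to its nearest control unit under the
    Mahalanobis distance of their covariates (rows: units, cols: covariates).
    A sketch of one-to-one nearest-neighbour matching with replacement."""
    pooled = np.vstack([treated, control])
    VI = np.linalg.inv(np.cov(pooled, rowvar=False))  # inverse pooled covariance
    matches = []
    for x in treated:
        d = control - x
        dist = np.einsum('ij,jk,ik->i', d, VI, d)  # squared Mahalanobis distances
        matches.append(int(np.argmin(dist)))
    return matches
```

The matched control indices can then be used to compare wages or employment status across the two groups.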

Keywords: student loan, wage differential, unintended consequences, Mahalanobis distance matching

Procedia PDF Downloads 103
15102 In vitro Method to Evaluate the Effect of Steam-Flaking on the Quality of Common Cereal Grains

Authors: Wanbao Chen, Qianqian Yao, Zhenming Zhou

Abstract:

Whole grains with intact pericarp are largely resistant to digestion by ruminants because entire kernels are not conducive to bacterial attachment. Processing, however, makes the starch more accessible to microbes and increases the rate and extent of starch degradation in the rumen. To estimate the feasibility of applying steam-flaking as a processing technique of grains for ruminants, cereal grains (maize, wheat, barley and sorghum) were processed by steam-flaking (steam temperature 105 °C, heating time 45 min), and chemical analysis, in vitro gas production, volatile fatty acid concentrations, and energetic values were used to evaluate the effects. In vitro cultivation was conducted for 48 h with rumen fluid collected from steers fed a total mixed ration consisting of 40% hay and 60% concentrates. The results showed that steam-flaking had a significant effect on the contents of neutral detergent fiber and acid detergent fiber (P < 0.01). The degree of starch gelatinization was also greatly improved in steam-flaked grains, as steam-flaking disintegrates the crystal structure of cereal starch, which may subsequently facilitate absorption of moisture and swelling. Theoretical maximum gas production after steam-flaking showed no great difference. However, compared with intact grains, total gas production at 48 h and the rate of gas production were significantly (P < 0.01) increased in all types of grain. Furthermore, there was no effect of steam-flaking on total volatile fatty acid concentration, but a decrease in the ratio between acetate and propionate was observed in the in vitro fermentation. The present study also found that steam-flaking increased (P < 0.05) the organic matter digestibility and energy concentration of the grains. The collective findings suggest that steam-flaking of grains could improve their rumen fermentation and energy utilization by ruminants. In conclusion, steam-flaking is a practical means of improving the quality of common cereal grains.
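In vitro gas production kinetics of this kind are commonly summarized with a single-pool exponential model (e.g., of the Ørskov-McDonald form); a hedged sketch of fitting it, using synthetic values rather than the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def gas_production(t, A, k):
    """Exponential gas-production model: GP(t) = A * (1 - e^(-k*t)),
    where A is the asymptotic (theoretical maximum) production and k the rate."""
    return A * (1.0 - np.exp(-k * t))

def fit_gas_curve(t_obs, gp_obs):
    """Least-squares fit of (A, k) to observed cumulative gas volumes."""
    (A, k), _ = curve_fit(gas_production, t_obs, gp_obs, p0=[max(gp_obs), 0.05])
    return A, k
```

Fitted A values correspond to the "theoretical maximum gas production" compared in the abstract, while k captures the faster fermentation of steam-flaked grain.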

Keywords: cereal grains, gas production, in vitro rumen fermentation, steam-flaking processing

Procedia PDF Downloads 242
15101 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia

Authors: Nguyen-Thanh Son

Abstract:

Flood is among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels, because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs and reduce possible impacts on the country’s economy and people’s livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time-series vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The flood-mapping results were verified against ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by close agreement between the flood-mapped area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including cloud contamination, mixed-pixel issues, and the resolution mismatch between the mapping results and the ground reference data, our methods produced satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could help policymakers evaluate their management strategies for mitigating the negative effects of floods on agriculture and people’s livelihoods in the country.
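The accuracy-assessment step reduces to simple arithmetic on the confusion matrix; a sketch (the matrix values in the test are hypothetical, not the study's):

```python
def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows: reference classes, columns: mapped classes)."""
    n = sum(sum(row) for row in cm)
    k_classes = len(cm)
    diag = sum(cm[i][i] for i in range(k_classes))
    po = diag / n  # observed agreement (overall accuracy)
    # chance agreement: product of marginal row and column proportions
    pe = sum(sum(cm[i]) * sum(r[i] for r in cm) for i in range(k_classes)) / n ** 2
    return po, (po - pe) / (1 - pe)
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy above.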

Keywords: MODIS, flood, mapping, Cambodia

Procedia PDF Downloads 110
15100 Modeling Fertility and Production of Hazelnut Cultivars through the Artificial Neural Network under Climate Change of Karaj

Authors: Marziyeh Khavari

Abstract:

In recent decades, climate change, global warming, and a growing population have posed challenges worldwide, such as increasing food consumption and shortages of resources. Assessing how climate change could disturb crops, especially hazelnut production, is crucial for sustainable agricultural production. For hazelnut cultivation in mid-warm conditions, such as in Iran, we present an investigation of climate parameters and of how strongly they affect the fertility and nut production of hazelnut trees. The climate of the northern zones of Iran was investigated (1960-2017), revealing an upward trend in temperature. Furthermore, a descriptive analysis performed on six cultivars over seven years shows how this small-scale survey can demonstrate the effects of climate change on hazelnut production and stability. Results showed that some climate parameters, such as solar radiation, soil temperature, relative humidity, and precipitation, have a stronger effect on nut production. Moreover, some cultivars, for instance Negret and Segorbe, produced more stable yields, while Mervill de Boliver recorded the most variation during the study. A further aim was to train and test a model to simulate nut production through a neural network and linear regression. The study developed the ANN model and estimated its generalization capability with criteria such as RMSE, SSE, and accuracy factors for the dependent and independent variables (environmental and yield traits). The models were trained and tested, and the accuracy of the model was adequate for predicting hazelnut production under fluctuations in weather parameters.
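The error criteria named above are straightforward to state; for reference, a minimal implementation:

```python
import math

def sse(y_true, y_pred):
    """Sum of squared errors between observed and predicted values."""
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred))

def rmse(y_true, y_pred):
    """Root mean square error: sqrt(SSE / n)."""
    return math.sqrt(sse(y_true, y_pred) / len(y_true))
```

Both are computed on held-out test data when assessing the generalization capability of a trained model.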

Keywords: climate change, neural network, hazelnut, global warming

Procedia PDF Downloads 116
15099 Reducing the Urban Heat Island Effect by Urban Design Strategies: Case Study of Aksaray Square in Istanbul

Authors: Busra Ekinci

Abstract:

The urban heat island has become one of the most important problems in urban areas in recent years, as a local-scale reflection of global warming. Many communities and governments are taking action to reduce heat island effects in urban areas, where half of the world's population lives today. At this point, urban design has turned out to be an important practice and research area for providing environmentally sensitive urban development. In this study, strategies for mitigating urban heat island effects through urban design are investigated in Aksaray Square and its surroundings in Istanbul. Aksaray is an important historical and commercial center of Istanbul, with an increasing density due to its role as a node of urban transportation. Istanbul Metropolitan Municipality prepared an urban design project to respond to the needs of the area's growing population for 2018. The purpose of the study is to emphasize the importance of urban design objectives and strategies developed to reduce heat island effects in urban areas. Accordingly, the urban heat island effect of the area was examined based on the albedo (reflectivity) parameter, the most influential parameter in the formation of the heat island effect in urban areas. Albedo values were calculated with the Albedo Viewer web application developed by the Energy and Environmental Engineering Department of Kyushu University in Japan. The albedo parameter was examined for both the present situation and the situation planned under the urban design project. The results show that the current area has urban heat island potential. With the Aksaray Square project, the heat island effect in the area can be reduced but not completely prevented. Therefore, urban design strategies were developed to reduce the island effect in addition to the urban design project for the area. This study shows that urban design objectives and strategies are quite effective in reducing heat island effects, which negatively affect the social environment and quality of life in urban areas.

Keywords: albedo, urban design, urban heat island, sustainable design

Procedia PDF Downloads 564
15098 The State Model of Corporate Governance

Authors: Asaiel Alohaly

Abstract:

A theoretical framework for corporate governance is needed to bridge the gap between the corporate governance of private companies and state-owned enterprises (SOEs). The two dominant models, the shareholder and stakeholder models, do not always address the specific requirements and challenges posed by ‘hybrid’ companies; namely, previously national bodies that have been privatised but where the government retains significant control or holds a majority shareholding. Thus, an exploratory theoretical study is needed to identify how ‘hybrid’ companies should be defined and why the state model should be acknowledged, since it is the less conspicuous model in comparison with the shareholder and stakeholder models. This research focuses on the state model of corporate governance in order to understand the complex ownership, control patterns, goals, and corporate governance of these hybrid companies. The significance of this research lies in the fact that publications on the state model are limited. The outcomes of this research are as follows. It became evident that the state model exists in the ecosystem, yet corporate governance theories have not extensively covered it, although much has been said about it by the OECD and the World Bank. In response to this gap between theory and industry practice, this research argues for the state model, which proceeds from an understanding of the institutionally embedded character of hybrid companies in which the government either holds a majority of the total shares or is a controlling shareholder.

Keywords: corporate governance, control, shareholders, state model

Procedia PDF Downloads 129
15097 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities

Authors: Retius Chifurira

Abstract:

The logistic regression model is the most widely used regression model for predicting meteorological drought probabilities. When the dependent variable is extreme, the logistic model fails to adequately capture drought probabilities. In order to predict drought probabilities adequately, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. Maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit measures, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative candidate for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predict drought probabilities for a drought-prone country such as Zimbabwe.
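A minimal sketch of a binary-response GLM with a GEV-based inverse link, fitted by maximum likelihood; the shape parameter xi is held fixed here for simplicity, whereas the paper estimates all parameters, and the data in the test are synthetic:

```python
import numpy as np
from scipy.optimize import minimize

def gev_cdf(eta, xi):
    """GEV CDF used as the inverse link: P = exp(-(1 + xi*eta)^(-1/xi))."""
    z = np.maximum(1.0 + xi * eta, 1e-10)  # enforce the support constraint
    return np.exp(-z ** (-1.0 / xi))

def fit_gev_regression(X, y, xi=-0.3):
    """Maximum-likelihood fit of beta in P(y=1|x) = F_GEV(x @ beta; xi).
    X must include an intercept column. A sketch, not the authors' code."""
    def nll(beta):
        p = np.clip(gev_cdf(X @ beta, xi), 1e-10, 1.0 - 1e-10)
        return -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
    res = minimize(nll, np.zeros(X.shape[1]), method='Nelder-Mead')
    return res.x
```

Replacing the logistic CDF with the asymmetric GEV CDF is what lets the model track probabilities of rare (extreme) drought events.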

Keywords: generalized extreme value distribution, generalized linear model, mean annual rainfall, meteorological drought probabilities

Procedia PDF Downloads 186
15096 Anti-inflammatory and Antinociceptive Effects of a Hydroalcoholic Tanacetum balsamita L. Extract

Authors: S. Nasri, G. H. Amin, A. Azimi

Abstract:

The use of herbs to treat disease has accompanied human history. This research studies the anti-inflammatory and antinociceptive effects of a hydroalcoholic extract of the aerial parts of Tanacetum balsamita balsamita. In the experimental studies, 144 male mice were used. In the inflammation test, animals were divided into six groups: control, positive control (receiving dexamethasone at a dose of 15 mg/kg), and four experimental groups receiving the Tanacetum balsamita balsamita hydroalcoholic extract at doses of 25, 50, 100 and 200 mg/kg. Xylene was used to induce inflammation. Formalin was used to study the nociceptive effects. Animals were divided into six groups: a control group, a positive control group (receiving morphine), and four experimental groups receiving the Tanacetum balsamita balsamita (Tb.) hydroalcoholic extract at doses of 25, 50, 100 and 200 mg/kg. I.p. injection of drugs or normal saline was performed 30 minutes before the test. The data were analyzed using one-way analysis of variance and the Tukey post-test. The hydroalcoholic extract of the aerial parts of Tanacetum balsamita balsamita significantly decreased inflammation at a dose of 200 mg/kg (P<0.001) and significantly alleviated nociception in both the first and second phases at doses of 200 mg/kg (P<0.001) and 100 mg/kg (P<0.05). The Tanacetum balsamita balsamita extract has anti-inflammatory and antinociceptive effects, which seem to be related to its flavonoids, especially quercetin.

Keywords: inflammation, nociception, hydroalcoholic extract, aerial parts of Tanacetum balsamita balsamita L.

Procedia PDF Downloads 189
15095 Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Secondary Distant Metastases Growth

Authors: Ella Tyuryumina, Alexey Neznanov

Abstract:

This study is an attempt to obtain reliable data on the natural history of breast cancer growth. We analyze the opportunities for using classical mathematical models (exponential and logistic tumor growth models, Gompertz and von Bertalanffy tumor growth models) to describe the growth of the primary tumor and the secondary distant metastases of human breast cancer. The research aim is to improve the accuracy of predicting breast cancer progression using an original mathematical model referred to as CoMPaS, and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and the secondary distant metastases; 2) developing an adequate and precise CoMPaS that reflects the relations between the primary tumor and the secondary distant metastases; 3) analyzing the scope of application of CoMPaS; 4) implementing the model as a software tool. The foundation of CoMPaS is the exponential tumor growth model, described by deterministic nonlinear and linear equations. CoMPaS corresponds to the TNM classification. It allows calculation of the different growth periods of the primary tumor and the secondary distant metastases: 1) the ‘non-visible period’ of the primary tumor; 2) the ‘non-visible period’ of the secondary distant metastases; 3) the ‘visible period’ of the secondary distant metastases. CoMPaS is validated on clinical data of 10-year and 15-year survival depending on the tumor stage and the diameter of the primary tumor. The new predictive tool: 1) is a solid foundation for future studies of breast cancer growth models; 2) does not require any expensive diagnostic tests; 3) is the first predictor that makes forecasts using only current patient data, whereas the others are based on additional statistical data. 
The CoMPaS model and predictive software: a) fit clinical trials data; b) detect the different growth periods of the primary tumor and the secondary distant metastases; c) forecast the period in which the secondary distant metastases appear; d) have higher average prediction accuracy than the other tools; e) can improve forecasts on breast cancer survival and facilitate optimization of diagnostic tests. CoMPaS calculates the number of doublings for the ‘non-visible’ and ‘visible’ growth periods of the secondary distant metastases, and the tumor volume doubling time (in days) for those periods. CoMPaS makes it possible, for the first time, to predict the whole natural history of the growth of the primary tumor and the secondary distant metastases at each stage (pT1, pT2, pT3, pT4), relying only on the primary tumor sizes. In summary: a) CoMPaS correctly describes primary tumor growth for stages IA, IIA, IIB, IIIB (T1-4N0M0) without metastases in lymph nodes (N0); b) it facilitates understanding of the period of appearance and inception of the secondary distant metastases.
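Under the exponential model with roughly spherical tumors (volume proportional to diameter cubed), doubling counts and doubling times follow directly from two diameter measurements; a sketch with hypothetical numbers, not CoMPaS itself:

```python
import math

def doublings_between(d0_mm, d1_mm):
    """Number of volume doublings as a spherical tumor grows from diameter
    d0 to d1 under the exponential model (volume scales with diameter cubed)."""
    return 3.0 * math.log2(d1_mm / d0_mm)

def doubling_time_days(d0_mm, d1_mm, elapsed_days):
    """Tumor volume doubling time implied by two diameter measurements."""
    return elapsed_days / doublings_between(d0_mm, d1_mm)
```

For example, a diameter that doubles corresponds to three volume doublings, which is the kind of quantity CoMPaS reports for the 'non-visible' and 'visible' growth periods.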

Keywords: breast cancer, exponential growth model, mathematical model, metastases in lymph nodes, primary tumor, survival

Procedia PDF Downloads 331
15094 The Role of Optimization and Machine Learning in e-Commerce Logistics in 2030

Authors: Vincenzo Capalbo, Gianpaolo Ghiani, Emanuele Manni

Abstract:

Global e-commerce sales have reached unprecedented levels in the past few years. As this trend is only predicted to go up as we continue into the ’20s, new challenges will be faced by companies when planning and controlling e-commerce logistics. In this paper, we survey the related literature on Optimization and Machine Learning as well as on combined methodologies. We also identify the distinctive features of next-generation planning algorithms - namely scalability, model-and-run features and learning capabilities - that will be fundamental to cope with the scale and complexity of logistics in the next decade.

Keywords: e-commerce, hardware acceleration, logistics, machine learning, mixed integer programming, optimization

Procedia PDF Downloads 226
15093 Modeling of Long Wave Generation and Propagation via Seabed Deformation

Authors: Chih-Hua Chang

Abstract:

This study uses a three-dimensional (3D) fully nonlinear model to simulate the wave generation problem caused by movement of the seabed. The numerical model is first simplified to two dimensions and then compared with existing two-dimensional (2D) experimental data and the 2D numerical results of other shallow-water wave models. Results show that this model differs from earlier shallow-water wave models, with the phase being closer to the experimental results of wave propagation. The results of this study are also compared with the 3D experimental results of other researchers, with satisfactory agreement in both the waveform and the flow field. The study then applies the model to simulate waves caused by circular terrain (radius r0) rising or falling (moving distance bm). The influence of the wave-making parameters r0 and bm is discussed. The study finds that small-scale (e.g., r0 = 2, normalized by the static water depth) rising or sinking terrain produces significant wave groups in the far field. For large-scale moving terrain (e.g., r0 = 10), uplift and deformation can generate leading solitary-like waves in the far field.

Keywords: seismic wave, wave generation, far-field waves, seabed deformation

Procedia PDF Downloads 72
15092 A Hybrid-Evolutionary Optimizer for Modeling the Process of Obtaining Bricks

Authors: Marius Gavrilescu, Sabina-Adriana Floria, Florin Leon, Silvia Curteanu, Costel Anton

Abstract:

Natural sciences provide a wide range of experimental data whose related problems require study and modeling beyond the capabilities of conventional methodologies. Such problems have solution spaces whose complexity and high dimensionality require correspondingly complex regression methods for proper characterization. In this context, we propose an optimization method consisting of a hybrid dual-optimizer setup: a global optimizer based on a modified variant of the popular Imperialist Competitive Algorithm (ICA), and a local optimizer based on a gradient descent approach. The ICA is modified so that intermediate solution populations are more quickly and efficiently pruned of low-fitness individuals by appropriately altering the assimilation, revolution and competition phases, which, combined with an initialization strategy based on low-discrepancy sampling, allows a more effective exploration of the solution space. Subsequently, gradient-based optimization is used locally to seek the optimal solution in the neighborhoods of the solutions found through the modified ICA. We use this combined approach to find the optimal configuration and weights of a fully-connected neural network, resulting in regression models used to characterize the process of obtaining bricks from silicon-based materials. Installations in the raw ceramics (brick) industry are characterized by significant energy consumption and large quantities of emissions. Thus, the purpose of our approach is to determine by simulation the working conditions, including the manufacturing mix recipe with the addition of different materials, that minimize the emissions of CO and CH4. Our approach determines regression models that perform significantly better than those found using the traditional ICA for this problem, resulting in better convergence and a substantially lower error.
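The two-stage structure (global exploration, then gradient-based local refinement) can be sketched generically; here plain uniform sampling stands in for the modified ICA, so this illustrates the hybrid architecture only, not the paper's algorithm:

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_optimize(f, bounds, n_global=200, n_refine=5, seed=0):
    """Two-stage optimizer: a cheap global stage (uniform sampling here,
    standing in for a population method such as ICA) followed by
    gradient-based local refinement of the best candidates."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(n_global, len(bounds)))   # global exploration
    elite = pop[np.argsort([f(x) for x in pop])[:n_refine]]   # keep best candidates
    results = [minimize(f, x0, bounds=bounds) for x0 in elite]  # local descent
    return min(results, key=lambda r: r.fun).x
```

The same skeleton applies when f evaluates the regression error of a candidate network configuration rather than an analytic test function.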

Keywords: optimization, biologically inspired algorithm, regression models, bricks, emissions

Procedia PDF Downloads 70
15091 Batch Kinetic, Isotherm and Thermodynamic Studies of Copper (II) Removal from Wastewater Using HDL as Adsorbent

Authors: Nadjet Taoualit, Zoubida Chemat, Djamel-Eddine Hadj-Boussaad

Abstract:

This study addresses the removal of copper, Cu (II), contained in wastewater by adsorption onto a synthesized material: the layered double hydroxide (HDL). The HDL was prepared and synthesized by the co-precipitation method at constant pH, which requires only a simple titration assembly with inexpensive materials readily available in the laboratory, allows better control of the composition of the reaction medium, and gives well-crystallized products. A characterization of the adsorbent proved essential, so a range of physico-chemical analyses was performed, including FTIR spectroscopy and X-ray diffraction. The adsorption of copper ions was investigated in dispersed medium (batch). A systematic study of various parameters (amount of support, contact time, initial copper concentration, temperature, pH…) was performed. Adsorption kinetic data were tested using pseudo-first order, pseudo-second order, Bangham's equation and intra-particle diffusion models. The equilibrium data were analyzed using Langmuir, Freundlich, Tempkin and other isotherm models at different doses of HDL. The thermodynamic parameters were evaluated at different temperatures. The results established the good potential of HDL as a sorbent for the removal of copper from wastewater.
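For illustration, the pseudo-second-order kinetic model named above can be fitted by non-linear least squares. The data here are synthetic and the parameter values (qe, k2) are assumptions for the sketch, not measurements from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k2):
    # Pseudo-second-order kinetics in its non-linear form:
    # qt = k2 * qe^2 * t / (1 + k2 * qe * t)
    return (k2 * qe ** 2 * t) / (1 + k2 * qe * t)

def langmuir(Ce, qmax, KL):
    # Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)
    return qmax * KL * Ce / (1 + KL * Ce)

# Illustrative (synthetic) uptake data qt [mg/g] vs. time t [min].
t = np.array([5, 10, 20, 40, 60, 90, 120.0])
qt = pseudo_second_order(t, qe=25.0, k2=0.01) \
     + np.random.default_rng(1).normal(0, 0.2, t.size)

(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, qt, p0=[20, 0.005])
```

The same `curve_fit` pattern applies to the Langmuir, Freundlich and Tempkin isotherms once equilibrium data (qe vs. Ce) are available.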

Keywords: adsorption, copper, HDL, isotherm

Procedia PDF Downloads 262
15090 An Ensemble Deep Learning Architecture for Imbalanced Classification of Thoracic Surgery Patients

Authors: Saba Ebrahimi, Saeed Ahmadian, Hedie Ashrafi

Abstract:

Selecting appropriate patients for surgery is one of the main issues in thoracic surgery (TS). Both short-term and long-term risks and benefits of surgery must be considered in the patient selection criteria. The existing datasets of TS patients have limitations because of missing attribute values and the imbalanced distribution of survival classes. In this study, a novel ensemble architecture of deep learning networks is proposed, based on stacking different linear and non-linear layers to deal with imbalanced datasets. The categorical and numerical features are split across different layers with the ability to shrink away unnecessary features. Then, after extracting insight from the raw features, a novel biased-kernel layer is applied to reinforce the gradient of the minority class, causing the network to train better than with current methods. Finally, the performance and advantages of our proposed model over existing models are examined for predicting patient survival after thoracic surgery, using real-life clinical data for lung cancer patients.
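The effect of biasing the training signal toward the minority class can be illustrated with a simple class-weighted binary cross-entropy. This is a generic analogue of the minority-gradient idea, with an arbitrarily chosen weight; it is not the authors' biased-kernel layer:

```python
import numpy as np

def weighted_bce(y_true, y_pred, minority_weight=5.0, eps=1e-12):
    """Binary cross-entropy that up-weights the minority (positive) class,
    so errors on rare outcomes contribute more to the gradient."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    w = np.where(y_true == 1, minority_weight, 1.0)
    return float(np.mean(-w * (y_true * np.log(y_pred)
                               + (1 - y_true) * np.log(1 - y_pred))))

# One positive (survived) case among five: the weighted loss penalizes
# the poorly predicted minority case far more than the unweighted loss.
y_true = np.array([0, 0, 0, 0, 1])
loss_plain = weighted_bce(y_true, np.full(5, 0.2), minority_weight=1.0)
loss_biased = weighted_bce(y_true, np.full(5, 0.2), minority_weight=5.0)
```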

Keywords: deep learning, ensemble models, imbalanced classification, lung cancer, TS patient selection

Procedia PDF Downloads 126
15089 Effects of the Macro-Scale Investments/Projects to Planning System in Izmir

Authors: Neslihan Karatas, Sibel Ecemis Kilic

Abstract:

This paper examines the macro-scale plans and projects/investments that have been prepared for İzmir since the Republican Period. Macro projects proposed by central government, local government, industry and urban actors such as the chamber of commerce will be discussed, and these projects and their reflections in the city's macro-scale planning decisions will be evaluated on the basis of existing development. The effects of macro plans, the related private and public investments, and the development of unplanned/specific projects on the current city form will be discussed. The factors and plans that determine urban form, and the problems caused by unanticipated/uncontrolled developments, will be evaluated. Proposals will be developed for a more efficient planning process.

Keywords: Izmir, macro projects, macro investments, planning

Procedia PDF Downloads 576
15088 Consumption Habits of Low-Fat Plant Sterol-Enriched Yoghurt Enriched with Phytosterols

Authors: M. J. Reis Lima, J. Oliveira, A. C. Sousa Pereira, M. C. Castilho, E. Teixeira-Lemos

Abstract:

The increasing interest in plant sterol-enriched foods is due to the fact that they reduce blood cholesterol concentrations without adverse side effects. In this context, foods enriched with phytosterols may be helpful in protecting the population against atherosclerosis and cardiovascular diseases. The aim of the present work was to evaluate, in a population of Viseu, Portugal, the consumption habits of low-fat, plant sterol-enriched yoghurt. For this study, 577 inquiries were made, with the sample randomly selected among people shopping in various supermarkets. The preliminary results showed that the biggest consumers of these products were women aged 45 to 65 years old. Most of the people who claimed to buy these products consumed them once a day. Also, most of the consumers under antidyslipidemic therapeutics noticed positive effects on hypercholesterolemia.

Keywords: consumption habits, fermented milk, functional foods, low fat, phytosterols

Procedia PDF Downloads 442
15087 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter

Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal

Abstract:

Regulatory bodies have proposed limits on particulate matter (PM) concentration in air; however, these limits do not explicitly incorporate the toxic effects of the constituents of PM. This study aimed to provide a structured approach to incorporating the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework consists of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents, information on the size and shape of PM, and the fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at every step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment. The identified data gaps were: (1) the concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of the toxicity of PM with its components.
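Step (4), risk estimation via a hazard quotient, reduces to an average daily dose divided by a reference dose. The exposure factors and concentration below are illustrative defaults, not values from the study:

```python
def average_daily_dose(conc_ug_m3, inhalation_m3_day=20.0, body_weight_kg=70.0):
    """Inhalation dose in mg/kg/day from an air concentration in ug/m3.
    The inhalation rate and body weight are generic adult defaults
    assumed for illustration only."""
    return conc_ug_m3 * 1e-3 * inhalation_m3_day / body_weight_kg

def hazard_quotient(conc_ug_m3, reference_dose_mg_kg_day):
    """HQ = dose / reference dose; HQ > 1 flags potential non-cancer risk."""
    return average_daily_dose(conc_ug_m3) / reference_dose_mg_kg_day

# Hypothetical metal constituent of PM2.5 at 0.05 ug/m3 against an
# assumed reference dose of 3e-4 mg/kg/day.
hq = hazard_quotient(conc_ug_m3=0.05, reference_dose_mg_kg_day=3e-4)
```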

Keywords: air, component-specific toxicity, human health risks, particulate matter

Procedia PDF Downloads 296
15086 Adaptive Conjoint Analysis of Professionals’ Job Preferences

Authors: N. Scheidegger, A. Mueller

Abstract:

Job preferences are a well-developed research field. Many studies analyze preferences using simple ratings with samples of university graduates. The current study analyzes preferences with a mixed-method approach combining a qualitative preliminary study and adaptive conjoint analysis. Preconditions for accepting job offers are clarified for professionals in the industrial sector. It could be shown, for example, that wages above the average are critical and that career opportunities must be seen more broadly than merely as formal personnel development programs. The results suggest that, to be effective with their recruitment efforts, employers must take into account the key desirable job attributes of their target group.

Keywords: conjoint analysis, employer attractiveness, job preferences, personnel marketing

Procedia PDF Downloads 187
15085 Supersymmetry versus Compositeness: 2-Higgs Doublet Models Tell the Story

Authors: S. De Curtis, L. Delle Rose, S. Moretti, K. Yagyu

Abstract:

Supersymmetry and compositeness are the two prevalent paradigms providing both a solution to the hierarchy problem and motivation for a light Higgs boson state. An open door towards the solution is found in the context of 2-Higgs Doublet Models (2HDMs), which are necessary in supersymmetry and arise naturally within compositeness in order to enable Electro-Weak Symmetry Breaking. In composite scenarios, the two isospin doublets arise as pseudo Nambu-Goldstone bosons from the breaking of SO(6). By calculating the Higgs potential at one-loop level through the Coleman-Weinberg mechanism, from the explicit breaking of the global symmetry induced by the partial compositeness of fermions and gauge bosons, we derive the phenomenological properties of the Higgs states and highlight the main signatures of this Composite 2-Higgs Doublet Model at the Large Hadron Collider. These include modifications to the SM-like Higgs couplings as well as production and decay channels of heavier Higgs bosons. We contrast the properties of this composite scenario with the well-known ones established in supersymmetry, the MSSM being the best-known example. We show how 2HDM spectra of masses and couplings accessible at the Large Hadron Collider may allow one to distinguish between the two paradigms.

Keywords: beyond the standard model, composite Higgs, supersymmetry, Two-Higgs Doublet Model

Procedia PDF Downloads 116
15084 Association of Temperature Factors with Seropositive Results against Selected Pathogens in Dairy Cow Herds from Central and Northern Greece

Authors: Marina Sofia, Alexios Giannakopoulos, Antonia Touloudi, Dimitris C Chatzopoulos, Zoi Athanasakopoulou, Vassiliki Spyrou, Charalambos Billinis

Abstract:

Fertility of dairy cattle can be affected by heat stress when the ambient temperature rises above 30°C and the relative humidity ranges from 35% to 50%. The present study was conducted on dairy cattle farms during the summer months in Greece and aimed to identify the serological profile against pathogens that could affect fertility, and to associate positive serological results at herd level with temperature factors. A total of 323 serum samples were collected from clinically healthy dairy cows of 8 herds located in Central and Northern Greece. ELISA tests were performed to detect antibodies against selected pathogens that affect fertility, namely Chlamydophila abortus, Coxiella burnetii, Neospora caninum, Toxoplasma gondii and Infectious Bovine Rhinotracheitis Virus (IBRV). Eleven climatic variables were derived from WorldClim version 1.4, and ArcGIS v.10.1 software was used for analysis of the spatial information. Five different MaxEnt models, one for each pathogen, were applied to associate the temperature variables with the locations of herds seropositive for Chl. abortus, C. burnetii, N. caninum, T. gondii and IBRV. The logistic outputs were used for the interpretation of the results. ROC analyses were performed to evaluate the goodness of fit of the models' predictions. Jackknife tests were used to identify the variables with a substantial contribution to each model. The seropositivity rates of the pathogens varied among the 8 herds (0.85-4.76% for Chl. abortus, 4.76-62.71% for N. caninum, 3.8-43.47% for C. burnetii, 4.76-39.28% for T. gondii and 47.83-78.57% for IBRV). The variables of annual temperature range, mean diurnal range and maximum temperature of the warmest month contributed to all five models. The regularized training gains, the training AUCs and the unregularized training gains were estimated.
The mean diurnal range gave the highest gain when used in isolation, and decreased the gain the most when omitted, in the two models for seropositive Chl. abortus and IBRV herds. The annual temperature range increased the gain when used alone, and decreased the gain the most when omitted, in the models for seropositive C. burnetii, N. caninum and T. gondii herds. In conclusion, antibodies against Chl. abortus, C. burnetii, N. caninum, T. gondii and IBRV were detected in most herds, suggesting circulation of pathogens that could cause infertility. The results of the spatial analyses demonstrated that the annual temperature range, mean diurnal range and maximum temperature of the warmest month could positively affect the presence of these pathogens. Acknowledgment: This research has been co‐financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH–CREATE–INNOVATE (project code: T1EDK-01078).

Keywords: dairy cows, seropositivity, spatial analysis, temperature factors

Procedia PDF Downloads 184
15083 Development of Earthquake and Typhoon Loss Models for Japan, Specifically Designed for Underwriting and Enterprise Risk Management Cycles

Authors: Nozar Kishi, Babak Kamrani, Filmon Habte

Abstract:

Natural hazards such as earthquakes and tropical storms are very frequent and highly destructive in Japan. Japan experiences, every year on average, more than 10 tropical cyclones that come within damaging reach, and earthquakes of moment magnitude 6 or greater. We have developed stochastic catastrophe models to address the risk associated with the entire suite of damaging events in Japan, for use by insurance, reinsurance, NGOs and governmental institutions. KCC's (Karen Clark and Company) catastrophe models are procedures composed of four modular segments: (1) stochastic event sets that represent the statistics of past events, (2) hazard attenuation functions that model the local intensity, (3) vulnerability functions that address the repair need for local buildings exposed to the hazard, and (4) a financial module addressing policy conditions, which estimates the losses incurred as a result. The events module comprises events (faults or tracks) of different intensities with corresponding probabilities, based on the same statistics as observed in the historical catalog. The hazard module delivers the hazard intensity (ground motion or wind speed) at the location of each building. The vulnerability module provides a library of damage functions relating the hazard intensity to repair need as a percentage of the replacement value. The financial module reports the expected loss, given the payoff policies and regulations. We have divided Japan into regions with similar typhoon climatology, and into earthquake micro-zones, within each of which the characteristics of events are similar enough for stochastic modeling. For each region, a set of stochastic events is then developed that results in events with intensities corresponding to annual occurrence probabilities of interest to financial communities, such as 0.01, 0.004, etc.
The intensities corresponding to these probabilities (called Characteristic Events, CEs) are selected through a stratified sampling approach based on the primary uncertainty. Region-specific hazard intensity attenuation functions followed by vulnerability models lead to the estimation of repair costs. An extensive economic exposure model addresses all local construction and occupancy types, such as post-and-lintel Shinkabe and Okabe wood construction, as well as concrete confined in steel (SRC, Steel-Reinforced Concrete) high-rise construction.
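Converting a stochastic event set into annual exceedance probabilities of the kind mentioned (0.01, 0.004, etc.) can be sketched as follows, assuming independent Poisson event occurrences. This is a common cat-modelling convention used here for illustration, not KCC's actual procedure:

```python
import numpy as np

def exceedance_curve(event_losses, annual_rates):
    """Annual exceedance probability (AEP) for each event's loss level,
    assuming events occur as independent Poisson processes."""
    order = np.argsort(event_losses)[::-1]          # losses, largest first
    losses = np.asarray(event_losses, float)[order]
    # Total rate of any event with loss >= L, then convert rate to probability.
    cum_rate = np.cumsum(np.asarray(annual_rates, float)[order])
    aep = 1.0 - np.exp(-cum_rate)
    return losses, aep

# Three hypothetical events with per-event losses and annual rates.
losses, aep = exceedance_curve([100.0, 500.0, 50.0], [0.05, 0.004, 0.2])
```

Reading the curve at a target probability (e.g. 0.004) then identifies the loss level, and hence the event intensity, associated with that return period.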

Keywords: typhoon, earthquake, Japan, catastrophe modelling, stochastic modeling, stratified sampling, loss model, ERM

Procedia PDF Downloads 250
15082 Effect of Storey Number on Vierendeel Action in Progressive Collapse of RC Frames

Authors: Qian Huiya, Feng Lin

Abstract:

The progressive collapse of reinforced concrete (RC) structures can cause huge casualties and property losses. Therefore, it is necessary to evaluate accurately the ability of structures to resist progressive collapse. This paper numerically investigated the effect of storey number on the mechanism and quantitative contribution of the Vierendeel action (VA) in progressive collapse under a corner column removal scenario. First, finite element (FE) models of multi-storey RC frame structures were developed using LS-DYNA. Then, the accuracy of the modeling technique was validated against test results obtained by the authors. Last, the validated FE models were applied to investigate the structural behavior of RC frames with different storey numbers, from one to six storeys. Results showed that the multi-storey substructures formed additional plastic hinges at the beam ends near the corner column in the second to top storeys, and at the lower end of the corner column in the first storey. The average ultimate resistance of each storey of the multi-storey substructures was increased by 14.0% to 18.5% compared with that of the single-storey substructure, which experiences no VA. The contribution of VA to the ultimate resistance decreased as the storey number increased.

Keywords: progressive collapse, reinforced concrete structure, storey number, Vierendeel action

Procedia PDF Downloads 49
15081 Forecasting Thermal Energy Demand in District Heating and Cooling Systems Using Long Short-Term Memory Neural Networks

Authors: Kostas Kouvaris, Anastasia Eleftheriou, Georgios A. Sarantitis, Apostolos Chondronasios

Abstract:

To achieve the objective of almost zero carbon energy solutions by 2050, the EU needs to accelerate the development of integrated, highly efficient and environmentally friendly solutions. In this direction, district heating and cooling (DHC) emerges as a viable and more efficient alternative to conventional, decentralized heating and cooling systems, enabling a combination of more efficient renewable and competitive energy supplies. In this paper, we develop a forecasting tool for near real-time local weather and thermal energy demand predictions for an entire DHC network. In this fashion, we are able to extend the functionality and to improve the energy efficiency of the DHC network by predicting and adjusting the heat load that is distributed from the heat generation plant to the connected buildings by the heat pipe network. Two case studies are considered: one for Vransko, Slovenia and one for Montpellier, France. The data consists of i) local weather data, such as humidity, temperature, and precipitation, ii) weather forecast data, such as the outdoor temperature and iii) DHC operational parameters, such as the mass flow rate, supply and return temperature. The external temperature is found to be the most important energy-related variable for space conditioning, and thus it is used as an external parameter for the energy demand models. For the development of the forecasting tool, we use state-of-the-art deep neural networks and more specifically, recurrent networks with long short-term memory cells, which are able to capture complex non-linear relations among temporal variables. Firstly, we develop models to forecast outdoor temperatures for the next 24 hours using local weather data for each case study. Subsequently, we develop models to forecast thermal demand for the same period, taking into consideration past energy demand values as well as the predicted temperature values from the weather forecasting models.
The contributions to the scientific and industrial community are three-fold, and the empirical results are highly encouraging. First, we are able to predict future thermal demand levels for the two locations under consideration with minimal errors. Second, we examine the impact of the outdoor temperature on the predictive ability of the models and how the accuracy of the energy demand forecasts decreases with the forecast horizon. Third, we extend the relevant literature with a new dataset of thermal demand and examine the performance and applicability of machine learning techniques to solve real-world problems. Overall, the solution proposed in this paper is in accordance with EU targets, providing an automated smart energy management system, decreasing human errors and reducing excessive energy production.
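A minimal sketch of the data preparation implied above, turning an hourly demand series into 24-hour-lookback / 24-hour-horizon training pairs for a recurrent model, could look like this. The series is synthetic and the window lengths are assumptions matching the 24-hour forecast period described:

```python
import numpy as np

def make_windows(series, lookback=24, horizon=24):
    """Turn an hourly series into (input, target) pairs:
    `lookback` past hours are used to predict the next `horizon` hours."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i : i + lookback])
        y.append(series[i + lookback : i + lookback + horizon])
    return np.array(X), np.array(y)

# Synthetic stand-in for an hourly thermal-load series.
demand = np.sin(np.linspace(0, 20, 500))
X, y = make_windows(demand)
```

Each row of `X` (optionally stacked with exogenous inputs such as the forecast outdoor temperature) would feed the LSTM, with the matching row of `y` as the training target.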

Keywords: machine learning, LSTMs, district heating and cooling system, thermal demand

Procedia PDF Downloads 128
15080 Land Degradation Vulnerability Modeling: A Study on Selected Micro Watersheds of West Khasi Hills Meghalaya, India

Authors: Amritee Bora, B. S. Mipun

Abstract:

Land degradation is often used to describe the environmental phenomena that reduce land's original productivity both qualitatively and quantitatively. The study of land degradation vulnerability primarily deals with “Environmentally Sensitive Areas” (ESA) and the amount of topsoil loss due to erosion. In many studies, the assessment of the existing status of land degradation is used to represent the vulnerability. Moreover, in most studies the primary emphasis of land degradation vulnerability is on sensitivity to soil erosion only. However, the concept of land degradation vulnerability can have different objectives depending upon the perspective of the study. It shows the extent to which changes in land use and land cover can imprint their effect on the land. In other words, it represents the susceptibility of a piece of land to a permanent or long-run degradation of its productive quality. It is also important to mention that land degradation vulnerability is not a single-factor outcome. It is a probability assessment for evaluating the status of land degradation and needs to consider both biophysical and human-induced parameters. To avoid the complexity of previous models in this regard, the present study emphasizes generating a simplified model to assess land degradation vulnerability in terms of current human population pressure, land use practices, and existing biophysical conditions. It is a “mixed-method” model termed the land degradation vulnerability index (LDVi). It was originally inspired by the MEDALUS model (Mediterranean Desertification and Land Use), 1999, and Farazadeh's 2007 revised version of it, and follows the guidelines of the Space Application Center, Ahmedabad / Indian Space Research Organization for land degradation vulnerability.
The model integrates the climatic index (Ci), vegetation index (Vi), erosion index (Ei), land utilization index (Li), population pressure index (Pi), and cover management index (CMi), giving equal weightage to each parameter. The final result shows that the very-high-vulnerability zone primarily indicates three (3) prominent circumstances: land under continuous population pressure, a high concentration of human settlement, and a high amount of topsoil loss due to surface runoff within the study sites. As all the parameters of the model are amalgamated with equal weightage, and further with the help of regression analysis, the LDVi model also provides a strong grasp of each parameter and of how strongly each is able to trigger the land degradation process.
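An equal-weight composite of the six sub-indices, in the MEDALUS style of a geometric mean, can be sketched as follows. The index values and the [1, 2] scaling convention are illustrative assumptions; the abstract does not state whether an arithmetic or geometric mean is used:

```python
import numpy as np

def ldvi(ci, vi, ei, li, pi, cmi):
    """Equal-weight composite of the six sub-indices, each assumed
    pre-scaled to [1, 2] as in MEDALUS-style indices. The geometric
    mean follows the MEDALUS convention; an arithmetic mean is an
    equally simple alternative under equal weightage."""
    indices = np.array([ci, vi, ei, li, pi, cmi])
    return float(indices.prod() ** (1 / len(indices)))

# Hypothetical sub-index values for one micro-watershed cell.
score = ldvi(1.4, 1.6, 1.8, 1.3, 1.7, 1.5)
```

Applied cell by cell over a raster, such a score surface is what would then be classed into the low-to-very-high vulnerability zones discussed above.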

Keywords: population pressure, land utilization, soil erosion, land degradation vulnerability

Procedia PDF Downloads 151
15079 The Effects of Dimethyl Adipate (DMA) on Coated Diesel Engine

Authors: Hanbey Hazar

Abstract:

An experimental study was conducted to evaluate the effects of using blends of diesel fuel with dimethyl adipate (DMA) in proportions of 2%, 6%, and 12% on a coated engine. In this study, the cylinder, piston, and exhaust and inlet valves, which are combustion chamber components, were coated with ceramic material. The cylinder and the exhaust and inlet valves of the diesel engine used in the tests were coated to a thickness of 50 µm with ekabor-2 commercial powder, a ceramic material, using the boriding method. The piston was coated to a thickness of 300 µm with boron-based powder using the plasma coating method. Due to the thermal barrier coating, the diesel engine's hazardous emission values decreased.

Keywords: diesel engine, dimethyl adipate (DMA), exhaust emissions, coating

Procedia PDF Downloads 262