Search results for: random effects panel ols model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26697

26007 Ground Motion Modelling in Bangladesh Using Stochastic Method

Authors: Mizan Ahmed, Srikanth Venkatesan

Abstract:

Geological and tectonic framework indicates that Bangladesh is one of the most seismically active regions in the world. The Bengal Basin lies at the junction of three major interacting plates: the Indian, Eurasian, and Burma Plates. Besides, there are many active faults within the region, e.g. the large Dauki fault in the north. The country has experienced a number of destructive earthquakes due to the movement of these active faults. Current seismic provisions of Bangladesh are mostly based on earthquake data prior to 1990. Given the record of earthquakes post 1990, there is a need to revisit the design provisions of the code. This paper compares the base shear demand of three major cities in Bangladesh: Dhaka (the capital city), Sylhet, and Chittagong for earthquake scenarios of magnitude 7.0, 7.5, 8.0 and 8.5 (Mw) using a stochastic model. In particular, the stochastic model allows the flexibility to input region-specific parameters such as the shear wave velocity profile (developed from the Global Crustal Model CRUST2.0) and to include the effects of attenuation as individual components. Effects of soil amplification were analysed using the Extended Component Attenuation Model (ECAM). Results show that the estimated base shear demand is higher than the code provisions, suggesting that additional seismic design considerations are needed in the study regions.
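As an illustration of the point-source stochastic method referred to above, the following sketch builds a Boore-type Fourier amplitude spectrum from an omega-squared source, 1/R geometric spreading, anelastic attenuation, and a near-surface kappa filter. All parameter values (seismic moment, corner frequency, distance, Q, shear wave velocity, kappa) are illustrative assumptions, not the region-specific values calibrated in the paper.

```python
import numpy as np

def fas(f, m0=1e19, fc=0.2, r=50.0, q0=600.0, beta=3.5, kappa=0.04):
    """Fourier amplitude spectrum of an omega-squared point source with
    1/R geometric spreading, anelastic attenuation Q, and a kappa filter.
    All parameter values here are illustrative placeholders."""
    source = m0 * f**2 / (1.0 + (f / fc) ** 2)          # omega-squared source
    path = np.exp(-np.pi * f * r / (q0 * beta)) / r     # attenuation + spreading
    site = np.exp(-np.pi * kappa * f)                   # near-surface kappa filter
    return source * path * site

freqs = np.logspace(-1, 1, 5)   # 0.1 to 10 Hz
spec = fas(freqs)               # spectrum peaks at intermediate frequencies
```

Region-specific inputs (shear wave velocity profile, attenuation components) would replace the placeholder path and site terms in a full implementation.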

Keywords: attenuation, earthquake, ground motion, stochastic, seismic hazard

Procedia PDF Downloads 243
26006 Remittances, Unemployment and Demographic Changes between Tunisia and Europe

Authors: Hajer Habib, Ghazi Boulila

Abstract:

The objective of this paper is twofold: to contribute to the theoretical literature with a simple model of the effect of remittance transfers on the labor market of the countries of origin, and to test this relationship empirically in the case of Tunisia. The methodology consists of estimating a panel of the nine main destinations of the Tunisian diaspora in Europe between 1994 and 2014 in order to better assess the net effect of these migratory financial flows on unemployment through population growth. The empirical results show that the main factors explaining the decision to emigrate are economic factors related mainly to the income differential, demographic factors related to the differing age structures of the origin and host populations, and cultural factors linked basically to mastery of the language. Indeed, the stock of migrants is one of the main determinants of remittance transfers to Tunisia, but other important variables exist, such as economic conditions in the host countries. This shows that Tunisian migrants react more to economic conditions in European countries than in Tunisia: the economic situation of the host countries dominates the number of emigrants as an explanatory factor for the amount of transfers from Tunisian emigrants to their country of origin. Similarly, it is clear that there is an indirect effect of transfers on unemployment in Tunisia, which suggests that the demographic transition conditions the effects of remittance transfers on the level of unemployment.

Keywords: demographic changes, international migration, labor market, remittances

Procedia PDF Downloads 145
26005 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation

Authors: Sopheak Sorn, Kwok Yip Szeto

Abstract:

Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), maximizing reliability becomes a network design problem that is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction with each cell containing exactly one point of the pattern and compute the reliability of the Voronoi diagram's dual, i.e. the Delaunay graph. We further investigate the communicability of the Delaunay network. We find that the homogeneity of a Delaunay network's degree distribution correlates positively with its reliability and negatively with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network for the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
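The Delaunay graph of a random two-dimensional point pattern, as used above, can be built directly with SciPy; the sketch below extracts the node and link counts (N and L) from the triangulation. This is a minimal illustration of constructing the network, not the authors' reliability or edge-flip code.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
points = rng.random((50, 2))        # random 2D point pattern in the unit square
tri = Delaunay(points)

# Collect the unique edges (links) of the Delaunay graph from the triangles.
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))

n_nodes, n_links = len(points), len(edges)
```

An edge flip would replace the shared edge of two adjacent triangles with the opposite diagonal, keeping n_nodes and n_links unchanged, as the abstract describes.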

Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi

Procedia PDF Downloads 414
26004 Diagnostic Clinical Skills in Cardiology: Improving Learning and Performance with Hybrid Simulation, Scripted Histories, Wearable Technology, and Quantitative Grading – The Assimilate Excellence Study

Authors: Daly M. J, Condron C, Mulhall C, Eppich W, O'Neill J.

Abstract:

Introduction: In contemporary clinical cardiology, comprehensive and holistic bedside evaluation including accurate cardiac auscultation is in decline despite having positive effects on patients and their outcomes. Methods: Scripted histories and scoring checklists for three clinical scenarios in cardiology were co-created and refined through iterative consensus by a panel of clinical experts; these were then paired with recordings of auscultatory findings from three actual patients with known valvular heart disease. A wearable vest with embedded pressure-sensitive panel speakers was developed to transmit these recordings when examined at the standard auscultation points. RCSI medical students volunteered for a series of three formative long case examinations in cardiology (LC1 – LC3) using this hybrid simulation. Participants were randomised into two groups: Group 1 received individual teaching from an expert trainer between LC1 and LC2; Group 2 received the same intervention between LC2 and LC3. Each participant’s long case examination performance was recorded and blindly scored by two peer participants and two RCSI examiners. Results: Sixty-eight participants were included in the study (age 27.6 ± 0.1 years; 74% female) and randomised into two groups; there were no significant differences in baseline characteristics between groups. Overall, the median total faculty examiner score was 39.8% (35.8 – 44.6%) in LC1 and increased to 63.3% (56.9 – 66.4%) in LC3, with those in Group 1 showing a greater improvement in LC2 total score than that observed in Group 2 (p < .001). Using the novel checklist, intraclass correlation coefficients (ICC) were excellent between examiners in all cases: ICC .994 – .997 (p < .001); correlation between peers and examiners improved in LC2 following peer grading of LC1 performances: ICC .857 – .867 (p < .001). 
Conclusion: Hybrid simulation and quantitative grading improve learning, standardisation of assessment, and direct comparisons of both performance and acumen in clinical cardiology.

Keywords: cardiology, clinical skills, long case examination, hybrid simulation, checklist

Procedia PDF Downloads 104
26003 Nonlinear Analysis of Steel Fiber Reinforced Concrete Frames Considering Shear Behaviour of Members under Varying Axial Load

Authors: Habib Akbarzadeh Bengar, Mohammad Asadi Kiadehi, Ali Rameeh

Abstract:

The results of past earthquakes have shown that an insufficient amount of stirrups and the brittle behavior of concrete lead to shear and flexural failures in reinforced concrete (RC) members. In this paper, an analytical model is proposed to predict the nonlinear behavior of RC and SFRC elements and frames. In this model, important parameters such as shear effects, varying axial load, and longitudinal bar buckling are considered. The results of the analytical model were verified against experimental tests, showing that the proposed model can accurately predict the nonlinear behavior of RC and SFRC members and frames. In addition, the results showed that the use of steel fibers increased the bearing capacity and ductility of the RC frame. Due to this enhancement in shear strength and ductility, an insufficient amount of stirrups, which would otherwise result in shear failure, can be offset by the use of steel fibers. Parametric studies have also been performed to investigate the effects of fiber percentages on the bearing capacity and ductility of frames.

Keywords: nonlinear analysis, SFRC frame, shear failure, varying axial load

Procedia PDF Downloads 212
26002 Research on Intercity Travel Mode Choice Behavior Considering Traveler’s Heterogeneity and Psychological Latent Variables

Authors: Yue Huang, Hongcheng Gan

Abstract:

The new urbanization pattern has led to rapid growth in demand for short-distance intercity travel, and the emergence of new travel modes has also increased the variety of intercity travel options. In previous studies on intercity travel mode choice behavior, the impact of the functional amenities of a travel mode and of travelers’ long-term personality characteristics has rarely been considered, and empirical results have typically been calibrated using revealed preference (RP) or stated preference (SP) data alone. This study designed a questionnaire combining RP and SP experiments from the perspective of a trip chain linking inner-city and intercity mobility, with consideration for the actual conditions of the Huainan-Hefei traffic corridor. On the basis of the RP/SP fusion data, a hybrid choice model considering both random taste heterogeneity and psychological characteristics was established to investigate travelers’ mode choice behavior for traditional train, high-speed rail, intercity bus, private car, and intercity online car-hailing. The findings show that intercity time and cost exert the greatest influence on mode choice, with significant heterogeneity across the population. Although inner-city cost does not demonstrate a significant influence, inner-city time plays an important role. Service attributes of a travel mode, such as catering and hygiene services as well as free wireless network supply, play only a minor role in mode selection. Finally, our study demonstrates that safety-seeking tendency, hedonism, and introversion all have differential and significant effects on intercity travel mode choice.

Keywords: intercity travel mode choice, stated preference survey, hybrid choice model, RP/SP fusion data, psychological latent variable, heterogeneity

Procedia PDF Downloads 104
26001 Reliability Modeling on Drivers’ Decision during Yellow Phase

Authors: Sabyasachi Biswas, Indrajit Ghosh

Abstract:

The random and heterogeneous behavior of vehicles in India poses a great challenge for researchers. Stop-and-go modeling at signalized intersections under heterogeneous traffic conditions has remained one of the most sought-after fields. Vehicles are often caught in the dilemma zone and are unable to decide quickly whether to stop or cross the intersection. This hampers traffic movement and may lead to accidents. The purpose of this work is to develop a stop/go prediction model that depicts the drivers’ decision during the yellow time at signalized intersections. To accomplish this, certain traffic parameters were taken into account to develop a surrogate model. This research investigated the stop-and-go behavior of drivers by collecting data from four signalized intersections located in two major Indian cities. A model was developed to predict drivers’ decision-making during the yellow phase of the traffic signal. The parameters used for modeling included distance to stop line, time to stop line, speed, and length of the vehicle. A Kriging-based surrogate model has been developed to investigate drivers’ decision-making behavior in the amber phase. It is observed that the proposed approach yields a highly accurate result (97.4 percent) with the Gaussian function. The accuracy for the crossing probability was 95.45, 90.9 and 86.36 percent, respectively, as predicted by the Kriging models with Gaussian, exponential and linear functions.
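Kriging is essentially Gaussian-process modelling, so a minimal analogue of such a stop/go surrogate can be sketched with scikit-learn's Gaussian process classifier. The data below are synthetic (a toy rule labels a vehicle as crossing when the stop line is reachable within an assumed 4 s amber interval), not the field data from the four intersections, and the kernel choice is illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
n = 200
dist = rng.uniform(5, 80, n)     # distance to stop line (m), synthetic
speed = rng.uniform(5, 20, n)    # approach speed (m/s), synthetic
# Toy labelling rule: the driver crosses if the stop line is reachable
# within an assumed 4 s amber interval.
crosses = (dist / speed < 4.0).astype(int)

X = np.column_stack([dist, speed])
model = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=10.0))
model.fit(X, crosses)
accuracy = model.score(X, crosses)   # in-sample accuracy of the surrogate
```

Swapping the RBF (Gaussian) kernel for an exponential-type kernel would mirror the paper's comparison of correlation functions.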

Keywords: decision-making, dilemma zone, surrogate model, Kriging

Procedia PDF Downloads 305
26000 Buckling Resistance of GFRP Sandwich Infill Panels with Different Cores under Increased Temperatures

Authors: WooYoung Jung, V. Sim

Abstract:

This paper presents a numerical analysis of the buckling resistance of polymer matrix composite (PMC) infill panel systems under the influence of temperature on the foam core. The failure mode under in-plane compression is investigated by means of numerical analysis on the ABAQUS platform. The parameters considered in this study are the contact length, the type of foam used for the core, and the variation of its Young's modulus under thermal influence. The variation of temperature is considered in static cases and applied only to the core. It is shown that the effect of temperature on the mechanical properties of the panel system is significant: increases in temperature reduce the system strength, owing to the polymeric nature of the core material. Additionally, the contact length also affects the performance of the infill panel, and its significance depends on the type of polymer used for the core. Hence, by comparing different types of core material, this variation can be reduced.
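The qualitative trend reported above, strength decreasing as the core modulus degrades with temperature, can be illustrated with a simple Euler buckling estimate. The linear modulus-softening rate and all section properties below are assumptions for illustration only, not values from the ABAQUS study.

```python
import math

def euler_buckling_load(E, I, L, K=1.0):
    """Euler critical load P_cr = pi^2 * E * I / (K*L)^2 for a pin-ended strut."""
    return math.pi**2 * E * I / (K * L) ** 2

def degraded_modulus(E0, T, T_ref=20.0, alpha=0.004):
    """Illustrative linear softening of a polymer foam modulus with temperature."""
    return E0 * max(0.0, 1.0 - alpha * (T - T_ref))

E0 = 70e6   # Pa, assumed foam-core modulus at room temperature
I = 2e-6    # m^4, assumed second moment of area
L = 1.5     # m, assumed strut length

p20 = euler_buckling_load(degraded_modulus(E0, 20.0), I, L)   # room temperature
p80 = euler_buckling_load(degraded_modulus(E0, 80.0), I, L)   # heated core
```

With the assumed 0.4 %/°C softening, the heated case retains 76 % of the room-temperature critical load, mirroring the reported strength decrement with temperature.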

Keywords: buckling, contact length, foam core, temperature dependent

Procedia PDF Downloads 285
25999 A Product-Specific/Unobservable Approach to Segmentation for a Value Expressive Credit Card Service

Authors: Manfred F. Maute, Olga Naumenko, Raymond T. Kong

Abstract:

Using data from a nationally representative financial panel of Canadian households, this study develops a psychographic segmentation of the customers of a value-expressive credit card service and tests for effects on relational response differences. The variety of segments elicited by agglomerative and k-means clustering and the familiar profiles of individual clusters suggest that the face validity of the psychographic segmentation was quite high. Segmentation had a significant effect on customer satisfaction and relationship depth. However, when socio-demographic characteristics like household size and income were accounted for in the psychographic segmentation, the effect on relational response differences was magnified threefold. Implications for the segmentation of financial services markets are considered.
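A minimal sketch of the k-means step in such a psychographic segmentation, using scikit-learn on synthetic rating data; the two trait axes and cluster centres below are invented for illustration and are not the study's actual instrument.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Synthetic psychographic ratings (e.g., status seeking, price sensitivity);
# the variable meanings and centres are illustrative assumptions.
segment_a = rng.normal([4.0, 2.0], 0.5, size=(100, 2))
segment_b = rng.normal([2.0, 4.5], 0.5, size=(100, 2))
X = np.vstack([segment_a, segment_b])

# Standardise before clustering so both traits carry equal weight.
km = KMeans(n_clusters=2, n_init=10, random_state=0)
km.fit(StandardScaler().fit_transform(X))
labels = km.labels_   # segment membership for each household
```

Cluster profiles (and, in the study, their relational response differences) would then be compared across the recovered segments.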

Keywords: customer satisfaction, financial services, psychographics, response differences, segmentation

Procedia PDF Downloads 326
25998 Effects of Manufacture and Assembly Errors on the Output Error of Globoidal Cam Mechanisms

Authors: Shuting Ji, Yueming Zhang, Jing Zhao

Abstract:

The output error of the globoidal cam mechanism can be considered a relevant indicator of mechanism performance, because it determines the kinematic and dynamic behavior of the mechanical transmission. Based on differential geometry and rigid body transformations, the mathematical model of the surface geometry of the globoidal cam is established. We then present the analytical expression of the output error (including the transmission error and the displacement error along the output axis) considering different manufacture and assembly errors. The effects of the center distance error, the perpendicularity error between the input and output axes, and the rotational angle error of the globoidal cam on the output error are systematically analyzed. A globoidal cam mechanism widely used in the automatic tool changers of CNC machines is applied for illustration. Our results show that the perpendicularity error and the rotational angle error have little effect on the transmission error but a great effect on the displacement error along the output axis. This study plays an important role in the design, manufacture and assembly of globoidal cam mechanisms.

Keywords: globoidal cam mechanism, manufacture error, transmission error, automatic tool changer

Procedia PDF Downloads 566
25997 Conflict and Hunger Revisit: Evidences from Global Surveys, 1989-2020

Authors: Manasse Elusma, Thung-Hong Lin, Chun-yin Lee

Abstract:

The relationship between hunger and war or conflict remains debated. Do wars or conflicts cause hunger and food scarcity, or is the reverse true? Even as the world becomes more peaceful and wealthier, some countries still suffer from hunger and food shortage, so eradicating hunger calls for a more comprehensive understanding of the relationship between conflict and hunger. Several studies have been carried out to assess the importance of conflict or war for food security. Most of these studies, however, perform only descriptive analysis and largely use food security indicators instead of the global hunger index. Few studies have employed cross-country panel data to explicitly analyze the association between conflict and chronic hunger, including hidden hunger. Herein, this study addresses this knowledge gap. We combine global datasets to build a new panel dataset covering 143 countries from 1989 to 2020. This study examines the effect of conflict on hunger with fixed effects models, and the results show that increased conflict frequency worsens hunger. Peacebuilding efforts and war prevention initiatives are required to eradicate global hunger.
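The fixed effects estimation mentioned above can be sketched as a within (demeaning) transformation: country means are subtracted from both the outcome and the regressor before running OLS, absorbing time-invariant country effects. The panel below is synthetic with a known slope of 1.5, not the 143-country dataset.

```python
import numpy as np

rng = np.random.default_rng(3)
n_countries, n_years = 30, 10
country = np.repeat(np.arange(n_countries), n_years)   # country index per row

conflict = rng.poisson(2.0, n_countries * n_years).astype(float)
country_effect = rng.normal(0, 5, n_countries)[country]   # unobserved heterogeneity
# Synthetic ground truth: hunger rises by 1.5 per conflict event.
hunger = 10.0 + 1.5 * conflict + country_effect + rng.normal(0, 1, len(conflict))

def demean(x, groups):
    """Within transformation: subtract each group's mean from its observations."""
    sums = np.zeros(groups.max() + 1)
    np.add.at(sums, groups, x)
    counts = np.bincount(groups)
    return x - (sums / counts)[groups]

y_w = demean(hunger, country)
x_w = demean(conflict, country)
beta = (x_w @ y_w) / (x_w @ x_w)   # OLS slope on demeaned data, near 1.5
```

The demeaned regression recovers the conflict coefficient even though the country effects are correlated with nothing here by construction; in real panels that correlation is exactly what fixed effects guard against.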

Keywords: armed conflict, food scarcity, hidden hunger, hunger, malnutrition

Procedia PDF Downloads 164
25996 The Role of Human Capital in the Evolution of Inequality and Economic Growth in Latin-America

Authors: Luis Felipe Brito-Gaona, Emma M. Iglesias

Abstract:

There is a growing literature that studies the main determinants and drivers of inequality and economic growth in several countries, using panel data and different estimation methods (fixed effects, the Generalized Method of Moments (GMM), and Two-Stage Least Squares (TSLS)). A recent study of the evolution of these variables over the period 1980-2009 in the 18 countries of Latin America found that one of the main variables explaining their evolution was Foreign Direct Investment (FDI). We extend this study to the year 2015 in the same 18 countries and find that FDI no longer plays a significant role, while schooling levels have a significant negative effect on inequality and a significant positive effect on economic growth. We also find that the point estimates associated with human capital are the largest among the variables included in the analysis, which means that an increase in human capital (measured by schooling levels of secondary education) is the main determinant that can help to reduce inequality and increase economic growth in Latin America. Therefore, we advise that economic policies in Latin America should be directed towards increasing the level of education. We estimate by fixed effects, GMM and TSLS to check the robustness of our results, and our conclusion is the same regardless of the estimation method. We also find that the international recession of 2008 significantly reduced economic growth in the Latin American countries.

Keywords: economic growth, human capital, inequality, Latin-America

Procedia PDF Downloads 219
25995 Fish Scales as a Nonlethal Screening Tools for Assessing the Effects of Surface Water Contaminants in Cyprinus Carpio

Authors: Shahid Mahboob, Hafiz Muhammad Ashraf, Salma Sultana, Tayyaba Sultana, Khalid Al-Ghanim, Fahid Al-Misned, Zubair Ahmedd

Abstract:

There is an increasing need for an effective tool to estimate the risks derived from the large number of pollutants released into the environment by human activities. Typical screening procedures are highly invasive or lethal to the fish. Recent studies show that fish scales respond biochemically to a range of contaminants, including toxic metals, organic compounds, and endocrine disruptors. The present study evaluated the effects of surface water contaminants on Cyprinus carpio in the Ravi River by comparing DNA extracted non-lethally from their scales to DNA extracted from the scales of fish collected from a controlled fish farm. A single, random sampling was conducted. Fish were broadly categorised into three weight categories (W1, W2 and W3). The experimental samples in the W1, W2 and W3 categories had a lower average DNA concentration (µg/µl) than the control samples. All control samples had a single DNA band, whereas the experimental samples had 1 to 2 bands in W1 fish, two bands in W2 fish, and fragmentation in the form of three bands in W3 fish. These bands reflect the effects of pollution on fish in the Ravi River. On the basis of the findings of this study, we propose that fish scales can be successfully employed as a new non-lethal tool for evaluating the effects of surface water contaminants.

Keywords: fish scales, Cyprinus carpio, heavy metals, non-invasive, DNA fragmentation

Procedia PDF Downloads 405
25994 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study

Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski

Abstract:

Transient ischaemic attacks (TIAs) are warning signs of future strokes: TIA patients are at increased risk of stroke and cardiovascular events after a first episode. The majority of studies on TIA have focused on the occurrence of these ancillary events, while long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of TIA using anonymised electronic health records (EHRs). We conducted a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or before 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017 were matched to three controls on age, sex and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model which included both scale and shape effects and a random frailty effect of GP practice. 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI (2.91 - 3.18)). The HRs for cases aged 61-70 years, 71-76 years and 77+ years were 1.98 (1.55 - 2.30), 1.79 (1.20 - 2.07) and 1.52 (1.15 - 1.97), respectively. Aspirin provided long-term survival benefits to cases: cases aged 39-60 years on aspirin had HRs of 0.93 (0.84 - 1.00), 0.90 (0.82 - 0.98) and 0.88 (0.80 - 0.96) at 5, 10 and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets. Similar beneficial effects of aspirin were observed in the other age groups. There were no significant survival benefits with other antiplatelet options, and no survival benefits of antiplatelet drugs were observed in controls.
Our study highlights the excess long-term risk of death in TIA patients and cautions that TIA should not be treated as a benign condition. The study further recommends aspirin as the better option for secondary prevention in TIA patients, compared to the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
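In a proportional-hazards form, the reported hazard ratios translate directly into a multiplicative shift of a baseline hazard. The sketch below uses a Weibull baseline with the paper's HR of 3.04 for the youngest age group; the shape and scale values are illustrative assumptions, and note that the paper's model also lets the shape vary with covariates, which would make the HR time-dependent (this simplified sketch holds it constant).

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Baseline Weibull hazard h0(t) = (k / lam) * (t / lam)**(k - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def hazard(t, shape, scale, beta, x):
    """Proportional-hazards form: h(t | x) = h0(t) * exp(beta * x)."""
    return weibull_hazard(t, shape, scale) * np.exp(beta * x)

t = np.linspace(1, 15, 50)       # years of follow-up
beta = np.log(3.04)              # log-HR reported for ages 39-60
h_case = hazard(t, 1.2, 20.0, beta, 1)     # shape/scale are illustrative
h_control = hazard(t, 1.2, 20.0, beta, 0)
hr = h_case / h_control          # constant 3.04 under proportional hazards
```

Allowing the shape parameter to depend on case status, as the Weibull-Cox model above does, would make `hr` a function of `t` rather than a constant.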

Keywords: dual antiplatelet therapy (DAPT), General Practice, Multiple Imputation, The Health Improvement Network(THIN), hazard ratio (HR), Weibull-Cox model

Procedia PDF Downloads 140
25993 Rheology and Structural Arrest of Dense Dairy Suspensions: A Soft Matter Approach

Authors: Marjan Javanmard

Abstract:

The rheological properties of dairy products critically depend on the underlying organisation of proteins at multiple length scales. When heated and acidified, milk proteins form a particle gel that is a viscoelastic, solvent-rich, ‘soft’ material. In this work, recent developments in the rheology of soft particle suspensions were used to interpret and potentially define the properties of dairy gel structures. It is discovered that at volume fractions below random close packing (RCP), the Maron-Pierce-Quemada (MPQ) model accurately predicts the viscosity of the dairy gel suspensions without fitting parameters; the MPQ model has been shown previously to provide reasonable predictions of the viscosity of hard sphere suspensions from the volume fraction, solvent viscosity and RCP. This surprising finding demonstrates that up to RCP, the dairy gel system behaves as a hard sphere suspension and that the structural aggregates behave as discrete particulates, akin to what is observed for microgel suspensions. At effective phase volumes well above RCP, the system is a soft solid. In this region, it is discovered that the storage modulus of the sheared AMG scales with the storage modulus of the set gel. The storage modulus in this regime is reasonably well described as a function of effective phase volume by the Evans and Lips model. The findings of this work have the potential to aid in the rational design and control of dairy food structure-properties.
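The Maron-Pierce-Quemada prediction described above takes a simple closed form: the relative viscosity is (1 - phi/phi_RCP)^-2, needing only the volume fraction and the RCP value. A short sketch (phi_RCP = 0.64, the monodisperse hard-sphere value, is an assumption; the dairy system's RCP may differ):

```python
import numpy as np

def mpq_relative_viscosity(phi, phi_rcp=0.64):
    """Maron-Pierce-Quemada: eta / eta_solvent = (1 - phi/phi_rcp)**-2,
    valid only below random close packing (phi < phi_rcp)."""
    phi = np.asarray(phi, dtype=float)
    return (1.0 - phi / phi_rcp) ** -2

phis = np.array([0.1, 0.3, 0.5, 0.6])
eta_r = mpq_relative_viscosity(phis)   # diverges as phi approaches phi_rcp
```

The divergence as phi approaches phi_RCP marks the crossover to the soft-solid regime, where the Evans and Lips description takes over in the abstract above.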

Keywords: dairy suspensions, rheology-structure, Maron-Pierce-Quemada Model, Evans and Lips Model

Procedia PDF Downloads 216
25992 Analysis of Performance of 3T1D Dynamic Random-Access Memory Cell

Authors: Nawang Chhunid, Gagnesh Kumar

Abstract:

On-chip memories consume a significant portion of the overall die space and power in modern microprocessors. On-chip caches depend on Static Random-Access Memory (SRAM) cells, with technology scaling occurring as per Moore’s law. Unfortunately, this scaling affects stability, performance, and leakage power, which will become major problems for future SRAMs in aggressive nanoscale technologies due to increasing device mismatch and variations. The 3T1D Dynamic Random-Access Memory (DRAM) cell is a non-destructive-read DRAM cell with three transistors and a gated diode. In the 3T1D DRAM cell, the gated diode (D1) acts both as a storage device and as an amplifier, which leads to fast read access. Due to its high tolerance to process variation, high density, and low memory cost compared to the 6T SRAM cell, it is widely used in advanced microprocessors for on-chip data and program memory. In the present paper, it is shown that the 3T1D DRAM cell can perform better in terms of fast read access than the 6T, 4T, and 3T SRAM cells.

Keywords: DRAM Cell, Read Access Time, Retention Time, Average Power dissipation

Procedia PDF Downloads 306
25991 Cyclic Loading Tests of Reinforced Concrete Frame Structures Strengthened by Externally-Anchored Precast Wall-Panel

Authors: Seung-Ho Choi, Jae Yuel Oh, Chi Sung Lim, Ho Seong Jung, Kang Su Kim

Abstract:

In recent years, various strengthening methods for buildings have been developed, but most of them require quite a long construction period during which building users must endure uncomfortable working environments, including considerable noise, or even evacuation of the building. In this study, the externally anchored precast wall-panel (EPCW) method for strengthening non-seismic reinforced concrete (RC) structures is proposed, which is an occupant-friendly technique because the strengthening walls are manufactured in a factory and can be fastened to the members very quickly on site. In order to investigate the structural performance of specimens strengthened by the EPCW method, a total of four specimens were fabricated and tested under axial and reversed cyclic lateral loads. The test results showed that the lateral resistances of the specimens strengthened by the EPCW method were greatly enhanced in both the positive and negative directions, compared to the RC specimen with non-seismic details.

Keywords: precast wall, seismic strengthening, reinforced concrete, externally-anchored

Procedia PDF Downloads 292
25990 A Machine Learning Approach for Efficient Resource Management in Construction Projects

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. 
Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
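A minimal sketch of the Random Forest step on synthetic project data: the named cost drivers (scope changes, material delay, crew size) echo the abstract's examples, but the data-generating rule and coefficients are invented for illustration and are not the case-study dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 500
scope_changes = rng.poisson(3, n).astype(float)    # synthetic cost drivers
material_delay = rng.exponential(5, n)             # days of material delay
crew_size = rng.integers(5, 50, n).astype(float)
noise = rng.normal(0, 2, n)

# Toy ground truth: overrun driven mainly by scope changes and delays,
# with only a weak crew-size effect.
overrun_pct = 2.0 * scope_changes + 0.8 * material_delay + 0.05 * crew_size + noise

X = np.column_stack([scope_changes, material_delay, crew_size])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, overrun_pct)
importances = rf.feature_importances_   # ranks the synthetic cost drivers
```

The feature importances recover the intended ranking, mirroring how the study used them to flag scope changes and delivery delays as key risk factors.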

Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management

Procedia PDF Downloads 29
25989 Investigate the Movement of Salt-Wedge at Co Chien Estuary, Vietnam in the Context of Climate Change and Reduce Upstream Flow Using 3D Model

Authors: Hieu Duy Nguyen, Chitsan Lin Jr., Dung Duc Tran

Abstract:

Nowadays, drought and salinity intrusion have become severe problems in the Lower Mekong region due to climate change, especially in coastal provinces. Freshwater resources are diminished by sea-level rise and by the decline in water flow from upstream in the dry season. The combination of these issues can have many effects on the environment and on human health in affected areas, such as digestive disorders or a weakened immune system. The tidal cycle and upstream flows are the two main factors affecting the saline intrusion process and the formation of a salt-wedge in the estuary. Under suitable conditions, the salt-wedge can travel further upstream below the water surface and affect groundwater. In order to plan properly for the mitigation of the above adverse effects, we need to understand the characteristics of this process. In this study, a 3D model is used to investigate the movement of the salt-wedge under different conditions of tide and flow discharge. The salinity in the vertical profile was also measured in the dry seasons of 2017 and 2018 for model calibration. The data confirm the presence of a salt-wedge in the study area. The obtained results will help strategic planners to use and preserve water resources more effectively and serve as a basis for new research directions on saline intrusion and human health.

Keywords: salt-wedge, salinity intrusion, human health, 3D model

Procedia PDF Downloads 105
25988 A Nexus between Financial Development and Its Determinants: A Panel Data Analysis from a Global Perspective

Authors: Bilal Ashraf, Qianxiao Zhang

Abstract:

This study empirically investigates the linkage between financial development (FD) and its important determinants, namely information and communication technology (ICT), natural resource rents (NRR), economic growth (EG), current account balance (CAB), and gross savings (S), in 107 economies. Second-generation unit root tests are employed to handle slope heterogeneity and cross-sectional dependence in the panel data. The Kao, Pedroni, and Westerlund tests confirm long-run relationships among the variables under study, while the cross-sectionally augmented autoregressive distributed lag (CS-ARDL) estimates show that NRR, CAB, and S affect financial development negatively, whereas ICT and EG stimulate it. A robustness check using FGLS supports the appropriateness and applicability of the CS-ARDL results. Finally, panel causality analysis confirms bidirectional causal linkages among the research variables. Based on the study's outcomes, we offer policy suggestions for strengthening the process of financial development globally.
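The "CS" in CS-ARDL refers to controlling for unobserved common factors by augmenting the regression with cross-sectional averages of the variables. As a rough illustration of that augmentation step only (a static sketch on entirely synthetic data, not the paper's full dynamic CS-ARDL specification):

```python
import random

random.seed(0)

def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# synthetic panel: y depends on x and an unobserved common factor f_t
N, T, slope = 30, 40, 0.5
f = [random.gauss(0, 1) for _ in range(T)]
lam = [random.gauss(1, 0.1) for _ in range(N)]          # factor loadings
x = [[lam[i] * f[t] + random.gauss(0, 1) for t in range(T)] for i in range(N)]
y = [[slope * x[i][t] + lam[i] * f[t] + random.gauss(0, 0.2) for t in range(T)]
     for i in range(N)]

# cross-sectional averages proxy for the unobserved factor
xbar = [sum(x[i][t] for i in range(N)) / N for t in range(T)]
ybar = [sum(y[i][t] for i in range(N)) / N for t in range(T)]

Y = [y[i][t] for i in range(N) for t in range(T)]
b_pool = ols([[1.0, x[i][t]] for i in range(N) for t in range(T)], Y)[1]
b_cce = ols([[1.0, x[i][t], xbar[t], ybar[t]] for i in range(N) for t in range(T)], Y)[1]
print(b_pool, b_cce)   # pooled OLS is biased by the omitted factor; the augmented fit is near 0.5
```

The pooled regression absorbs the common shock into the slope, while the augmented fit recovers the true coefficient; the CS-ARDL estimator applies this idea within a dynamic error-correction framework.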

Keywords: determinants of financial developments, CS-ARDL, financial development, global sample, causality analysis

Procedia PDF Downloads 51
25987 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy

Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie

Abstract:

In recent years, there has been rapid growth in the use of technologies that help discover diseases. DNA microarrays, for example, allow us for the first time to obtain a "global" view of the cell; they have great potential to provide accurate medical diagnosis and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that predict the occurrence of leukemia. In this study, we compared classification accuracy and response time among eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree performs best among the tree classifiers, producing better results with the lowest model-building time. Among the rule classifiers, the nearest-neighbor-like algorithm (NNge) is the best, combining high accuracy with the lowest model-building time.
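The evaluation protocol described (comparing classifiers on accuracy and model-building time) can be illustrated with stand-in learners; the decision stump and 1-nearest-neighbour rule below are simplified proxies for the study's tree and rule classifiers, run on synthetic two-class data rather than real microarray profiles:

```python
import random
import time

random.seed(1)

# synthetic two-class "expression" data: class 1 has elevated feature values
def make_sample(label):
    base = 5.0 if label else 0.0
    return [random.gauss(base, 1.0) for _ in range(10)], label

train = [make_sample(i % 2) for i in range(100)]
test_set = [make_sample(i % 2) for i in range(40)]

def build_stump(data):
    """Pick the single (feature, threshold) split with the best training accuracy."""
    best = None
    for feat in range(10):
        thr = sum(x[feat] for x, _ in data) / len(data)
        acc = sum((x[feat] > thr) == bool(y) for x, y in data) / len(data)
        if best is None or acc > best[2]:
            best = (feat, thr, acc)
    return best[:2]

def predict_stump(model, x):
    feat, thr = model
    return int(x[feat] > thr)

def predict_1nn(data, x):
    # lazy learner: no build phase, all work happens at prediction time
    nearest = min(data, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))
    return nearest[1]

t0 = time.perf_counter()
stump = build_stump(train)
build_time = time.perf_counter() - t0   # eager learner: pays its cost up front

acc_stump = sum(predict_stump(stump, x) == y for x, y in test_set) / len(test_set)
acc_1nn = sum(predict_1nn(train, x) == y for x, y in test_set) / len(test_set)
print(f"stump acc={acc_stump:.2f} (built in {build_time:.4f}s), 1-NN acc={acc_1nn:.2f}")
```

Measuring both test accuracy and build time per classifier, as the study does, exposes the eager/lazy trade-off: tree-style learners spend time building a model, while nearest-neighbour rules defer the cost to prediction.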

Keywords: data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data

Procedia PDF Downloads 317
25986 Macroeconomic Implications of Artificial Intelligence on Unemployment in Europe

Authors: Ahmad Haidar

Abstract:

Modern economic systems are characterized by growing complexity, and addressing their challenges requires innovative approaches. This study examines the implications of artificial intelligence (AI) on unemployment in Europe from a macroeconomic perspective, employing data modeling techniques to understand the relationship between AI integration and labor market dynamics. To understand the AI-unemployment nexus comprehensively, this research considers factors such as sector-specific AI adoption, skill requirements, workforce demographics, and geographical disparities. The study utilizes a panel data model, incorporating data from European countries over the last two decades, to explore the potential short-term and long-term effects of AI implementation on unemployment rates. In addition to investigating the direct impact of AI on unemployment, the study also delves into the potential indirect effects and spillover consequences. It considers how AI-driven productivity improvements and cost reductions might influence economic growth and, in turn, labor market outcomes. Furthermore, it assesses the potential for AI-induced changes in industrial structures to affect job displacement and creation. The research also highlights the importance of policy responses in mitigating potential negative consequences of AI adoption on unemployment. It emphasizes the need for targeted interventions such as skill development programs, labor market regulations, and social safety nets to enable a smooth transition for workers affected by AI-related job displacement. Additionally, the study explores the potential role of AI in informing and transforming policy-making to ensure more effective and agile responses to labor market challenges. 
In conclusion, this study provides a comprehensive analysis of the macroeconomic implications of AI on unemployment in Europe, highlighting the importance of understanding the nuanced relationships between AI adoption, economic growth, and labor market outcomes. By shedding light on these relationships, the study contributes valuable insights for policymakers, educators, and researchers, enabling them to make informed decisions in navigating the complex landscape of AI-driven economic transformation.
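The abstract does not spell out the panel specification used for the AI-unemployment relationship; a minimal fixed-effects (within) estimator of the kind commonly used for such country-year panels might look like the following, with synthetic data and an illustrative negative slope:

```python
import random

random.seed(2)

# synthetic panel: unemployment = country fixed effect + slope * AI-adoption index + noise
countries, years, true_slope = 20, 20, -0.3
alpha = [random.gauss(8, 2) for _ in range(countries)]      # country-specific levels
x = [[random.gauss(i * 0.1, 1) for _ in range(years)] for i in range(countries)]
y = [[alpha[i] + true_slope * x[i][t] + random.gauss(0, 0.5) for t in range(years)]
     for i in range(countries)]

# within (fixed-effects) estimator: demean each country's series, then pool
num = den = 0.0
for i in range(countries):
    xb = sum(x[i]) / years
    yb = sum(y[i]) / years
    for t in range(years):
        num += (x[i][t] - xb) * (y[i][t] - yb)
        den += (x[i][t] - xb) ** 2
slope_fe = num / den
print(slope_fe)   # close to the true slope of -0.3
```

Demeaning sweeps out time-invariant country heterogeneity (labour-market institutions, demographics), so the slope is identified from within-country variation in AI adoption over time.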

Keywords: artificial intelligence, unemployment, macroeconomic analysis, european labor market

Procedia PDF Downloads 69
25985 Stock Market Developments, Income Inequality, Wealth Inequality

Authors: Quang Dong Dang

Abstract:

This paper examines the possible effects of stock market development, by channel, on income and wealth inequality. We use a Bayesian multilevel model with explanatory variables for the market's channels, such as accessibility, efficiency, and market health, in six selected countries: the US, UK, Japan, Vietnam, Thailand, and Malaysia. We found that, in general, improvements in the stock market alleviate income inequality; however, stock market expansion in higher-income countries is likely to trigger income inequality. We also found that while enhancing the quality of the stock market's channels worsens the distribution of wealth, open accessibility helps reduce wealth inequality within the scope of the study. In addition, the inverted U-shaped hypothesis does not appear to hold in the six selected countries over the period from 2006 to 2020.

Keywords: Bayesian multilevel model, income inequality, inverted u-shaped hypothesis, stock market development, wealth inequality

Procedia PDF Downloads 104
25984 Modeling Spillover Effects of Pakistan-India Bilateral Trade upon Sustainability of Economic Growth in Pakistan

Authors: Taimoor Hussain Alvi, Syed Toqueer Akhter

Abstract:

The focus of this research is to identify the spillover effects of Pak-India bilateral trade on Pakistan's growth rate. Cross-country growth spillovers have been linked with openness and access to markets. In this research, we examine the short-run and long-run effects of Pak-India bilateral trade openness on economic growth in Pakistan. Trade openness is measured as the sum of bilateral exports and imports between the two countries. Increased emphasis is laid on the condition and environment of financial markets in light of globalization and trade liberalization. This paper uses a univariate autoregressive distributed lag (ARDL) model to analyze the effects of bilateral trade variables on Pakistan's growth pattern in the short run and long run. Key findings of the study empirically support the notion that increased bilateral trade will benefit Pakistan in the short run because of cost advantages and knowledge spillovers, in the form of increased technical and managerial ability from multinational firms. However, contrary to much of the literature, increased bilateral trade will affect Pakistan's growth rate negatively in the long run because of the industrial size differential and the increased integration of the Indian economy with the world.

Keywords: bilateral trade openness, spillover, comparative advantage, univariate

Procedia PDF Downloads 476
25983 Numerical Investigation of Wastewater Rheological Characteristics on Flow Field Inside a Sewage Network

Authors: Seyed-Mohammad-Kazem Emami, Behrang Saki, Majid Mohammadian

Abstract:

The wastewater flow field inside a sewage network, including a pipe and manhole, was investigated using a Computational Fluid Dynamics (CFD) model. The numerical model incorporates a rheological model to calculate the viscosity of the wastewater fluid and is implemented in the open-source toolbox OpenFOAM. The rheological properties of prepared wastewater suspensions were first measured using a Brookfield LVDVII Pro+ viscometer with an enhanced UL adapter, and a suitable rheological viscosity model was then fitted to the measured properties. The results show the significant effect of the wastewater's rheological characteristics on the flow domain of the sewer system. Results are compared with the commonly used Newtonian model to evaluate the differences in velocity profile, pressure, and shear stress.
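The abstract does not name which viscosity model was fitted; the power-law (Ostwald-de Waele) model is one common non-Newtonian choice for such suspensions, sketched here with illustrative parameters:

```python
# Power-law (Ostwald-de Waele) apparent viscosity: mu = K * gamma_dot**(n - 1),
# where gamma_dot is the shear rate. A flow index n < 1 gives the shear-thinning
# behaviour typical of wastewater suspensions; K and n below are illustrative,
# not the values measured in the study.
K, n = 0.25, 0.6   # consistency index (Pa.s^n) and flow behaviour index

def apparent_viscosity(gamma_dot, K=K, n=n):
    """Apparent viscosity (Pa.s) at a given shear rate (1/s)."""
    return K * gamma_dot ** (n - 1)

for rate in (1.0, 10.0, 100.0):
    print(rate, apparent_viscosity(rate))   # viscosity falls as shear rate rises
```

In a Newtonian model the viscosity would be a single constant, which is why the two formulations predict different velocity, pressure, and shear-stress fields in the sewer.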

Keywords: non-Newtonian flows, wastewater, numerical simulation, rheology, sewage network

Procedia PDF Downloads 123
25982 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table

Authors: David A. Swanson, Lucky M. Tedrow

Abstract:

Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to that research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which also equals the mean age at death in a life table) and the variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of the variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to estimate the variance in age at death for six countries, three with high e0 values and three with lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of the variance in age at death in a period life table can be calculated using this approach. The approach can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
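Taylor's Law posits variance ≈ a·mean^b, which is linear in logarithms. A minimal sketch of fitting that relationship and then predicting variance in age at death from e0 follows; the data and coefficients are synthetic, since the paper's fitted values are not given in the abstract:

```python
import math
import random

random.seed(3)

# synthetic (mean, variance) pairs obeying Taylor's Law: var = a * mean**b
a_true, b_true = 2.0, 1.5
means = [random.uniform(60, 86) for _ in range(50)]       # e0-like range of values
variances = [a_true * m ** b_true * math.exp(random.gauss(0, 0.05)) for m in means]

# fit log(var) = log(a) + b*log(mean) by simple least squares
lx = [math.log(m) for m in means]
ly = [math.log(v) for v in variances]
mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
b_hat = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / sum((u - mx) ** 2 for u in lx)
a_hat = math.exp(my - b_hat * mx)

def predict_variance(e0):
    """Estimate life-table variance in age at death from life expectancy e0."""
    return a_hat * e0 ** b_hat

print(b_hat, predict_variance(75.0))
```

Because the fitted line needs only e0 as input, the same prediction works when e0 is estimated directly rather than derived from a full life table, which is the practical advantage the paper emphasizes.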

Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population

Procedia PDF Downloads 325
25981 Comparative Analysis of Predictive Models for Customer Churn Prediction in the Telecommunication Industry

Authors: Deepika Christopher, Garima Anand

Abstract:

To determine the best model for churn prediction in the telecom industry, this paper compares 11 machine learning algorithms: Logistic Regression, Support Vector Machine, Random Forest, Decision Tree, XGBoost, LightGBM, CatBoost, AdaBoost, Extra Trees, Deep Neural Network, and a Hybrid Model (MLPClassifier). It also aims to pinpoint the top three factors that lead to customer churn and conducts customer segmentation to identify vulnerable groups. According to the data, the Logistic Regression model performs best, with an F1 score of 0.6215, 81.76% accuracy, 68.95% precision, and 56.57% recall. The top three attributes that cause churn are found to be tenure, Internet Service Fiber optic, and Internet Service DSL; the top three best-performing models are Logistic Regression, Deep Neural Network, and AdaBoost. The k-means algorithm is applied to establish and analyze four different customer clusters. This study effectively identifies customers at risk of churn and may be used to develop and execute strategies that lower customer attrition.
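The reported metrics are internally consistent: F1 is the harmonic mean of precision and recall, which can be checked directly from the figures quoted above:

```python
# F1 = 2 * P * R / (P + R); checking the reported logistic-regression figures
# (68.95% precision, 56.57% recall) against the reported F1 of 0.6215
precision, recall = 0.6895, 0.5657
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))   # 0.6215, matching the reported F1 score
```

The gap between the 81.76% accuracy and the 0.6215 F1 is typical of imbalanced churn data, where accuracy is flattered by the majority (non-churn) class.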

Keywords: attrition, retention, predictive modeling, customer segmentation, telecommunications

Procedia PDF Downloads 53
25980 The Effect of Expressive Therapies on Children and Youth Impacted by Refugee Trauma: A Meta-Analysis

Authors: Brian Kristopher Cambra

Abstract:

Millions of displaced families are seeking refuge in countries that are not their own due to war, violence, persecution, political unrest, and natural disasters. This global crisis is forcing researchers and practitioners to consider how refugees are coping with the trauma associated with their migration process. Effective therapeutic approaches are needed in a global effort to address the traumatic impact of forced migration. This meta-analytical study investigates the effectiveness of expressive therapeutic modalities, including play, art, music, sandplay, theatre, and writing therapies, in helping children and adolescents cope with refugee trauma. Seventeen pre-post and between-group comparison studies were analyzed using a random-effects model. The combined effect size for pre-post comparisons was medium (g = 0.58), whereas the combined effect size for between-group comparisons was small (g = 0.32). Overall, art therapy was found to be most effective in treating stress symptoms. Heterogeneity tests, however, suggest the effect sizes cannot be interpreted as meaningful due to substantial variance. Nevertheless, the findings of this meta-analysis indicate that expressive therapies may be among the beneficial modalities to integrate with other trauma-informed approaches.
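The random-effects pooling step can be sketched with the DerSimonian-Laird method-of-moments estimator, a standard choice for such meta-analyses. Note the abstract does not state which between-study variance estimator was used, and the study-level inputs below are illustrative, not the 17 primary studies:

```python
# DerSimonian-Laird random-effects pooling of standardized mean differences.
# Each entry is an illustrative (Hedges' g, within-study variance) pair.
studies = [(0.9, 0.04), (0.5, 0.02), (0.7, 0.05), (0.2, 0.03), (0.6, 0.06)]

# fixed-effect (inverse-variance) pooled estimate, needed for Cochran's Q
w_fixed = [1.0 / v for _, v in studies]
g_fixed = sum(w * g for (g, _), w in zip(studies, w_fixed)) / sum(w_fixed)

# Cochran's Q and the method-of-moments between-study variance tau^2
q = sum(w * (g - g_fixed) ** 2 for (g, _), w in zip(studies, w_fixed))
c = sum(w_fixed) - sum(w * w for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)

# random-effects weights add tau^2 to each study's sampling variance
w_re = [1.0 / (v + tau2) for _, v in studies]
g_re = sum(w * g for (g, _), w in zip(studies, w_re)) / sum(w_re)
print(round(g_re, 3), round(tau2, 3))
```

A large tau² relative to the within-study variances is exactly the substantial heterogeneity the abstract warns about: it widens the pooled estimate's uncertainty and cautions against over-interpreting the combined effect size.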

Keywords: expressive therapies, forced migration, meta-analysis, refugees, trauma

Procedia PDF Downloads 140
25979 Model of Cosserat Continuum Dispersion in a Half-Space with a Scatterer

Authors: Francisco Velez, Juan David Gomez

Abstract:

Dispersion effects on scattering from a semicircular canyon in a micropolar continuum are analyzed using a computational finite element scheme. The presence of microrotational waves and dispersive SV waves affects the propagation of elastic waves. A contrast with the classical model is presented, and the dependence on the micropolar parameters is studied.

Keywords: scattering, semicircular canyon, wave dispersion, micropolar medium, FEM modeling

Procedia PDF Downloads 538
25978 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially in the subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events of magnitude 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The database was chosen because of the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states.
Accuracy of the models in predicting intensity measures, generalization capability for future data, and usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, and Random Forest in particular outperforms the other algorithms. However, the conventional method is a better tool when limited data are available.
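The conventional linear regression-based baseline that the machine learning methods are compared against typically takes a functional form such as ln(PGA) = c0 + c1·M + c2·ln(R). A least-squares sketch on synthetic records follows; the functional form, coefficients, and data are illustrative (real models add site and anelastic attenuation terms), not the study's actual baseline:

```python
import math
import random

random.seed(4)

# synthetic records from ln(PGA) = c0 + c1*M + c2*ln(R) + noise
c_true = (-4.0, 1.2, -1.6)
records = []
for _ in range(500):
    M = random.uniform(3.0, 5.8)          # magnitude range of the study's database
    R = random.uniform(4.0, 500.0)        # hypocentral distance range (km)
    ln_pga = c_true[0] + c_true[1] * M + c_true[2] * math.log(R) + random.gauss(0, 0.4)
    records.append((M, R, ln_pga))

# least squares via 3x3 normal equations with Gaussian elimination
X = [[1.0, M, math.log(R)] for M, R, _ in records]
y = [ln_pga for _, _, ln_pga in records]
A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
for col in range(3):
    piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(col + 1, 3):
        fac = A[r][col] / A[col][col]
        for c in range(col, 3):
            A[r][c] -= fac * A[col][c]
        b[r] -= fac * b[col]
coef = [0.0, 0.0, 0.0]
for r in (2, 1, 0):
    coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
print(coef)
```

Such a model encodes magnitude scaling (c1 > 0) and geometric attenuation (c2 < 0) by construction; the paper's point is that tree- and network-based learners recover these behaviors from the data without the form being pre-specified.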

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 119