Search results for: multi regression analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30974


30374 Reliability Based Analysis of Multi-Lane Reinforced Concrete Slab Bridges

Authors: Ali Mahmoud, Shadi Najjar, Mounir Mabsout, Kassim Tarhini

Abstract:

Empirical expressions for estimating the wheel load distribution and live-load bending moment are typically specified in highway bridge codes such as the AASHTO procedures. The purpose of this paper is to analyze the reliability levels that are inherent in reinforced concrete slab bridges designed with the simplified empirical live-load equations in the AASHTO LRFD procedures. To achieve this objective, bridges with multiple lanes (three and four lanes) and different spans are modeled using finite-element analysis (FEA) and subjected to HS20 truck loading, tandem loading, and standard lane loading per the AASHTO LRFD procedures. The FEA results are compared with the AASHTO LRFD moments in order to quantify the biases that might result from the simplifying assumptions adopted in AASHTO. A reliability analysis is conducted to quantify the reliability index for bridges designed using the AASHTO procedures. To reach a consistent level of safety for three- and four-lane bridges, extending a previous study restricted to one- and two-lane bridges, the live load factor in the AASHTO LRFD design equation is assessed and, where needed, revised for these lane configurations. The results will provide structural engineers with more consistent provisions to design concrete slab bridges and to evaluate the load-carrying capacity of existing bridges.
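The reliability index referred to above is, in its simplest first-order form, the margin between resistance and load effect expressed in standard-deviation units. A minimal sketch, with hypothetical moment statistics rather than values from the paper:

```python
import math

def reliability_index(mu_r, sigma_r, mu_q, sigma_q):
    """First-order reliability index for the limit state g = R - Q with
    independent normal resistance R and load effect Q."""
    return (mu_r - mu_q) / math.sqrt(sigma_r ** 2 + sigma_q ** 2)

# Hypothetical slab-bridge moment capacity vs. live-load moment (kN*m):
beta = reliability_index(mu_r=950.0, sigma_r=95.0, mu_q=600.0, sigma_q=120.0)
print(round(beta, 2))
```

Calibrating a live load factor then amounts to scaling the load side until beta reaches a target value across all lane configurations.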

Keywords: reliability analysis of concrete bridges, finite element modeling, reinforced concrete bridge design, load carrying capacity

Procedia PDF Downloads 321
30373 The Effect of Multi-Stakeholder Extension Services towards Crop Choice and Farmer's Income: The Case of the ARC High Value Crop Programme

Authors: Joseph Sello Kau, Elias Mashayamombe, Brian Washington Madinkana, Cynthia Ngwane

Abstract:

This paper presents the results of statistical analyses (stepwise linear regression and multiple regression) carried out on a number of crops in order to evaluate how the decision on crop choice affects the level of farm income generated by farmers participating in the High Value Crop production programme (referred to as the HVC). The goal of the HVC is to encourage farmers to cultivate fruit crops. The farmers received planting material from different extension agencies, together with other complementary packages such as fertilizer, garden tools, and water tanks. During the surveys, it was discovered that a significant number of farmers were cultivating traditional crops even when their plot sizes were small; traditional crops compete for resources with high value crops. The results of the analyses show that farmers cultivating fruit crops, maize, and potatoes generated higher income than those cultivating spinach and cabbage. High farm income is associated with plot size, access to social grants, and gender. The choice of crop is influenced by the availability of planting material and the market potential for the crop. Extension agencies providing the planting material stand a good chance of having farmers follow their directives. To encourage farmers to cultivate more of the HVCs, it is recommended that the ARC intensify the provision of fruit trees.
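The stepwise regression used above can be sketched as a greedy forward-selection loop over an OLS fit. The data below are synthetic, and the variable names (plot size, social grant, gender, fruit crop) merely mirror the factors the abstract mentions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical farm-level data: income depends mainly on plot size,
# the fruit-crop indicator, and social-grant access.
n = 200
X = np.column_stack([
    rng.uniform(0.1, 2.0, n),   # plot size (ha)
    rng.integers(0, 2, n),      # social grant (0/1)
    rng.integers(0, 2, n),      # gender (0/1)
    rng.integers(0, 2, n),      # fruit crop grown (0/1)
])
income = 5000 * X[:, 0] + 800 * X[:, 1] + 1500 * X[:, 3] + rng.normal(0, 300, n)

def ols_sse(X, y):
    """Fit OLS with an intercept, return the residual sum of squares."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return resid @ resid

# Forward stepwise selection: greedily add the predictor that most reduces SSE.
selected, remaining = [], list(range(X.shape[1]))
while remaining:
    sses = {j: ols_sse(X[:, selected + [j]], income) for j in remaining}
    best = min(sses, key=sses.get)
    if selected and sses[best] > 0.99 * ols_sse(X[:, selected], income):
        break  # stop when the improvement falls below 1%
    selected.append(best)
    remaining.remove(best)

print("selected predictors:", selected)
```

With these synthetic effect sizes, plot size is selected first because it explains the most variance on its own.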

Keywords: farm income, nature of extension services, type of crops cultivated, fruit crops, cabbage, maize, potato and spinach

Procedia PDF Downloads 301
30372 Modelling Agricultural Commodity Price Volatility with Markov-Switching Regression, Single Regime GARCH and Markov-Switching GARCH Models: Empirical Evidence from South Africa

Authors: Yegnanew A. Shiferaw

Abstract:

Background: commodity price volatility originating from excessive commodity price fluctuation has been a global problem, especially after the recent financial crises. Volatility is a measure of risk or uncertainty in financial analysis; it plays a vital role in risk management, portfolio management, and equity pricing. Objectives: the core objective of this paper is to examine the relationship between the prices of agricultural commodities and the oil price, gas price, coal price, and exchange rate (USD/Rand). In addition, the paper tries to fit an appropriate model that best describes the log-return price volatility and to estimate Value-at-Risk and expected shortfall. Data and methods: the data consist of the daily returns of agricultural commodity prices, namely white maize, yellow maize, wheat, sunflower, soya, corn, and sorghum, from 2 January 2007 to 31 October 2016. The paper applies the three-state Markov-switching (MS) regression, the standard single-regime GARCH, and the two-regime Markov-switching GARCH (MS-GARCH) models. Results: to choose the best-fitting model, the log-likelihood function, Akaike information criterion (AIC), Bayesian information criterion (BIC), and deviance information criterion (DIC) are employed under three distributions for the innovations. The results indicate that: (i) the prices of agricultural commodities were found to be significantly associated with the price of coal, the price of natural gas, the price of oil, and the exchange rate; (ii) for all agricultural commodities except sunflower, k=3 had higher log-likelihood values and lower AIC and BIC values, so the three-state MS regression model outperformed the two-state MS regression model; and (iii) MS-GARCH(1,1) with generalized error distribution (ged) innovations performs best for white maize and yellow maize, MS-GARCH(1,1) with student-t distribution (std) innovations performs better for sorghum, MS-gjrGARCH(1,1) with ged innovations performs better for wheat, sunflower, and soya, and MS-GARCH(1,1) with std innovations performs better for corn. In conclusion, this paper provides a practical guide for modelling agricultural commodity prices with MS regression and MS-GARCH processes, and can serve as a reference for similar modelling problems.
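Model selection by AIC and BIC, as used above, amounts to penalizing the maximized log-likelihood by the parameter count. A small sketch with illustrative numbers (not taken from the paper):

```python
import numpy as np

def aic(loglik, k):
    """Akaike information criterion: penalizes parameters linearly."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: heavier penalty for large samples."""
    return k * np.log(n) - 2 * loglik

# Hypothetical fitted log-likelihoods for two- vs. three-state MS regression
# on n daily returns (illustrative values only).
n = 2500
candidates = {
    "MS-2 regime": {"loglik": 6150.0, "k": 8},
    "MS-3 regime": {"loglik": 6210.0, "k": 15},
}
scores = {name: (aic(m["loglik"], m["k"]), bic(m["loglik"], m["k"], n))
          for name, m in candidates.items()}
best_aic = min(scores, key=lambda name: scores[name][0])
print(best_aic, scores)
```

Here the extra regime buys enough log-likelihood to overcome both penalties, which mirrors the paper's finding for k=3.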

Keywords: commodity prices, MS-GARCH model, MS regression model, South Africa, volatility

Procedia PDF Downloads 188
30371 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 50
30370 Multi-Criteria Nautical Ports Capacity and Services Planning

Authors: N. Perko, N. Kavran, M. Bukljas, I. Berbic

Abstract:

This paper presents the results of research on a proposed methodology for nautical port capacity planning that introduces a multi-criteria approach with defined criteria and impacts for the Adriatic Sea. The purpose was to analyse the determinants (the characteristics of the infrastructure and services of allocated nautical port capacity) that are crucial for the successful operation of nautical ports, especially during the COVID-19 pandemic. Defining priorities for short-term and long-term planning is essential not only for the development of nautical tourism but also for the development of the maritime system; unfortunately, this is not always done. Evaluation of resource use should follow from a detailed analysis of all aspects of the resources, bearing in mind that nautical tourism should use resources in a sustainable manner and generate effects in the tourism and maritime sectors. Consequently, the multiplier effect of nautical tourism, which should be defined and quantified in detail, should make it one of the major competitive products on the Croatian Adriatic and in the Mediterranean. Research on nautical tourism is necessary to quantify these effects and to develop the required planning system. In the future, the greatest threat to the long-term sustainable development of nautical tourism is its further uncontrolled, unlimited, and undirected development, especially under the pressure of demand for new moorings in the Mediterranean that markedly exceeds supply. The results of this research are applicable by nautical port management and by decision-makers in maritime transport system development. The paper presents the research and the resulting methodology for nautical port capacity planning, i.e. multi-criteria decision-making for port capacity planning. The proposed multi-criteria capacity planning methodology includes four criteria (spatial-transport, cost-infrastructure, ecological, and organizational criteria with additional services). The importance of the criteria and sub-criteria is evaluated and used as the basis for a sensitivity analysis of that importance. Based on the analysis of the identified and quantified importance of the criteria and sub-criteria, together with the sensitivity analysis of changes in that importance, scientific and applicable results are presented. These results are practically applicable by nautical port management in planning capacity increases, in further development, and in adapting existing nautical ports. The research is replicable in other seas, and the results are especially useful within the challenging maritime development framework of the COVID-19 pandemic.
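A weighted-sum scoring with weight-perturbation sensitivity analysis is one simple way to realize the multi-criteria evaluation described above. The alternatives, scores, and weights below are illustrative only, not taken from the paper:

```python
import numpy as np

# Hypothetical scores (0-10) for three port-expansion alternatives against
# the four criteria groups named in the paper; weights are illustrative.
criteria = ["spatial-transport", "cost-infrastructure",
            "ecological", "organizational-services"]
weights = np.array([0.35, 0.30, 0.20, 0.15])
scores = np.array([
    [7, 5, 8, 6],   # alternative A
    [6, 8, 5, 7],   # alternative B
    [8, 6, 6, 4],   # alternative C
])

def rank(w):
    """Weighted-sum totals and the index of the top-ranked alternative."""
    totals = scores @ w
    return totals, int(np.argmax(totals))

totals, best = rank(weights)

# Simple sensitivity check: perturb each weight by +/-20% (renormalized)
# and see whether the top-ranked alternative changes.
stable = True
for i in range(len(weights)):
    for f in (0.8, 1.2):
        w = weights.copy()
        w[i] *= f
        w /= w.sum()
        if rank(w)[1] != best:
            stable = False
print(totals, best, stable)
```

Reporting whether the winner survives the perturbations is the essence of the sensitivity analysis on criteria importance.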

Keywords: Adriatic Sea, capacity, infrastructures, maritime system, methodology, nautical ports, nautical tourism, service

Procedia PDF Downloads 168
30369 Effect of Leadership Style on Organizational Performance

Authors: Khadija Mushtaq, Mian Saqib Mehmood

Abstract:

This paper attempts to determine the impact of leadership style and learning orientation on organizational performance in Pakistan. A sample of 158 middle managers was selected from sports-goods and surgical-instrument factories in Sialkot. The empirical estimation is based on a multiple linear regression analysis of the relationship between leadership style, learning orientation, and organizational performance. Leadership style is measured through transformational and transactional leadership. Transformational leadership has an insignificant impact on organizational performance, whereas transactional leadership has a positive and significant relation with organizational performance. Learning orientation also has a positive and significant relation with organizational performance. Linear regression is used to estimate the relations between the dependent and independent variables. The study suggests that top managers should prefer continuous improvement processes over radical change.

Keywords: transformational leadership, transactional leadership, learning orientation, organizational performance, Pakistan

Procedia PDF Downloads 386
30368 3D Printed Multi-Modal Phantom Using Computed Tomography and 3D X-Ray Images

Authors: Sung-Suk Oh, Bong-Keun Kang, Sang-Wook Park, Hui-Jin Joo, Jong-Ryul Choi, Seong-Jun Lee, Jeong-Woo Sohn

Abstract:

The imaging phantom is utilized for the verification, evaluation, and tuning of medical imaging devices and systems. Although it can be costly, 3D printing is an ideal technique for rapid, customized, multi-modal phantom fabrication. In this article, we propose a multi-modal phantom made using 3D printing. First, DICOM images were acquired by CT (Computed Tomography) and 3D X-ray systems (PET/CT and Angio X-ray systems from Siemens) and analyzed. A 3D model was then produced from the DICOM images. Finally, the 3D printed phantom was scanned by PET/CT and MRI systems and evaluated.

Keywords: imaging phantom, MRI (Magnetic Resonance Imaging), PET / CT (Positron Emission Tomography / Computed Tomography), 3D printing

Procedia PDF Downloads 567
30367 Interactive Solutions for the Multi-Objective Capacitated Transportation Problem with Mixed Constraints under Fuzziness

Authors: Aquil Ahmed, Srikant Gupta, Irfan Ali

Abstract:

In this paper, we study a multi-objective capacitated transportation problem (MOCTP) with mixed constraints. The paper comprises the modelling and optimisation of an MOCTP in a fuzzy environment in which some goals are fractional and some are linear. In real-life applications of fuzzy goal programming (FGP) with multiple objectives, it is difficult for the decision maker(s) to determine the goal value of each objective precisely, as the goal values are imprecise or uncertain. We also develop a linearization of the fractional goals for solving the MOCTP. Imprecision in the parameters is handled through fuzzy set theory by treating the parameters as trapezoidal fuzzy numbers, and the α-cut approach is used to obtain crisp parameter values. Numerical examples illustrate the solution method.
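The α-cut of a trapezoidal fuzzy number, used above to obtain crisp parameter values, is a closed-form interval. A minimal sketch (the supply-capacity numbers are hypothetical):

```python
def alpha_cut(a, b, c, d, alpha):
    """α-cut of a trapezoidal fuzzy number (a, b, c, d), 0 <= alpha <= 1.

    Membership rises linearly on [a, b], equals 1 on [b, c], and falls
    linearly on [c, d]; the α-cut is the crisp interval where the
    membership degree is at least alpha.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return (a + alpha * (b - a), d - alpha * (d - c))

# Hypothetical fuzzy supply capacity for one source in an MOCTP.
lo, hi = alpha_cut(10, 12, 15, 18, alpha=0.5)
print(lo, hi)  # 11.0 16.5
```

At alpha = 1 the cut collapses to the core [b, c], which is the most optimistic crisp reading of the fuzzy parameter.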

Keywords: capacitated transportation problem, multi objective linear programming, multi-objective fractional programming, fuzzy goal programming, fuzzy sets, trapezoidal fuzzy number

Procedia PDF Downloads 420
30366 Multi-Granularity Feature Extraction and Optimization for Pathological Speech Intelligibility Evaluation

Authors: Chunying Fang, Haifeng Li, Lin Ma, Mancai Zhang

Abstract:

Speech intelligibility assessment is an important measure to evaluate the functional outcomes of surgical and non-surgical treatment, speech therapy, and rehabilitation, and the assessment of pathological speech plays an important role in assisting experts. Pathological speech is usually non-stationary and mutational. In this paper, we describe a multi-granularity combined feature scheme that is optimized by a hierarchical visual method. First, pathological features at different granularity levels are extracted: BAFS (basic acoustic feature set), local spectral characteristics MSCC (Mel s-transform cepstrum coefficients), and nonlinear dynamic characteristics based on chaotic analysis. Then, radar charts and the F-score are proposed to optimize the features by hierarchical visual fusion, reducing the feature set from 526 to 96 dimensions. The experimental results show that the new features, classified with a support vector machine (SVM), give the best performance, with a recognition rate of 84.4% on the NKI-CCRT corpus. The proposed method is thus shown to be effective and reliable for pathological speech intelligibility evaluation.
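The F-score used above for feature optimization can be computed per feature as between-class separation over within-class scatter. The Chen-Lin form is assumed here, and the data are synthetic:

```python
import numpy as np

def f_score(x, y):
    """F-score of one feature for a binary label y in {0, 1} (Chen-Lin form):
    between-class separation over within-class scatter; larger = more useful."""
    xp, xn = x[y == 1], x[y == 0]
    num = (xp.mean() - x.mean()) ** 2 + (xn.mean() - x.mean()) ** 2
    den = xp.var(ddof=1) + xn.var(ddof=1)
    return num / den

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 100)
informative = rng.normal(0, 1, 200) + 2.0 * y   # mean shifts between classes
noise = rng.normal(0, 1, 200)                    # identical in both classes

print(f_score(informative, y), f_score(noise, y))
```

Ranking all 526 features by this score and keeping the top block is one way to arrive at a reduced set like the 96-dimensional one reported.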

Keywords: pathological speech, multi-granularity feature, MSCC (Mel s-transform cepstrum coefficients), F-score, radar chart

Procedia PDF Downloads 265
30365 Age Estimation from Upper Anterior Teeth by Pulp/Tooth Ratio Using Peri-Apical X-Rays among Egyptians

Authors: Fatma Mohamed Magdy Badr El Dine, Amr Mohamed Abd Allah

Abstract:

Introduction: Age estimation of individuals is one of the crucial steps in forensic practice. Traditional methods rely on the length of the diaphysis of the long bones of the limbs, epiphyseal-diaphyseal union, fusion of the primary ossification centers, and dental eruption. However, there is a growing need for precise and reliable methods to estimate age, especially in cases where dismembered corpses, burnt bodies, or putrefied or fragmented parts are recovered. Teeth are the hardest and most indestructible structures in the human body. In recent years, assessment of the pulp/tooth area ratio, as an indirect quantification of secondary dentine deposition, has received considerable attention; however, little work has been done in Egypt on the applicability of the pulp/tooth ratio for age estimation. Aim of the Work: The present work was designed to assess Cameriere's method for age estimation from the pulp/tooth ratio of maxillary canines, central incisors, and lateral incisors in a sample from the Egyptian population, and to formulate regression equations to be used as population-based standards for age determination. Material and Methods: The study was conducted on 270 peri-apical X-rays of maxillary canines, central incisors, and lateral incisors (collected from 131 males and 139 females aged between 19 and 52 years). The pulp and tooth areas were measured using the Adobe Photoshop software, and the pulp/tooth area ratio was computed. Linear regression equations were determined separately for canines, central incisors, and lateral incisors. Results: A significant correlation was recorded between the pulp/tooth area ratio and chronological age. The linear regression analysis yielded coefficients of determination of R² = 0.824 for canines, 0.588 for central incisors, and 0.737 for lateral incisors, and three regression equations were derived. Conclusion: The pulp/tooth ratio is a useful technique for estimating age among Egyptians. Additionally, the regression equation derived from canines gave better results than those from the incisors.
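The Cameriere-style age model above is an ordinary least-squares line between age and the pulp/tooth area ratio. A sketch on hypothetical (illustrative) canine data:

```python
import numpy as np

# Hypothetical pulp/tooth area ratios and ages for canines; the ratio shrinks
# with age as secondary dentine is deposited (values are illustrative).
ratio = np.array([0.28, 0.25, 0.22, 0.19, 0.16, 0.13, 0.11])
age   = np.array([20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])

# Least-squares line age = b0 + b1 * ratio, as in the Cameriere approach.
b1, b0 = np.polyfit(ratio, age, 1)
pred = b0 + b1 * ratio
r2 = 1 - np.sum((age - pred) ** 2) / np.sum((age - age.mean()) ** 2)
print(round(b0, 1), round(b1, 1), round(r2, 3))
```

The negative slope encodes the deposition of secondary dentine; a population-specific equation is just this fit repeated on the population's own X-ray measurements.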

Keywords: age determination, canines, central incisors, Egypt, lateral incisors, pulp/tooth ratio

Procedia PDF Downloads 171
30364 Robust Single/Multi-Bit Memristor-Based Memory

Authors: Ahmed Emara, Maged Ghoneima, Mohamed Dessouky

Abstract:

Demand for fast, low-power memories is increasing with the increase in IC complexity. In this paper, we introduce a proposal for a compact SRAM based on memristor devices. The compact size of the proposed cell (1T2M, compared with 6T in traditional SRAMs) allows denser memories in the same area. We discuss the proposed memristor memory cell for single-bit and multi-bit data storage configurations, along with the writing and reading operations. Stored-data stability across successive read operations is illustrated, and operational simulation results and a comparison of the proposed design with conventional SRAM and previously proposed memristor cells are provided.

Keywords: memristor, multi-bit, single-bit, circuits, systems

Procedia PDF Downloads 357
30363 The Locus of Action - Tinted Windows

Authors: Devleminck Steven, Debackere Boris

Abstract:

This research concerns the ways artists and scientists deal with (and endure) new meaning, and how they comprehend and construct the world. The project reflects on the intense connection between comprehension and construction and their place of creation, the 'locus of action'. It seeks to define a liquid form of understanding and analysis capable of approaching our complex liquid world as discussed by Zygmunt Bauman. The aim is to establish a multi-viewpoint theoretical approach based on the dynamic concept of the Flâneur as introduced by Baudelaire, replacing single-viewpoint categorization. This is coupled with the concept of thickening as proposed by Clifford Geertz, with its implication of interaction between multiple layers of meaning. Here, walking and looking are introduced as a method or strategy, a model or map, providing a framework of understanding in conditions of hybridity and change.

Keywords: action, art, liquid, locus, negotiation, place, science

Procedia PDF Downloads 266
30362 An Alternative Framework of Multi-Resolution Nested Weighted Essentially Non-Oscillatory Schemes for Solving Euler Equations with Adaptive Order

Authors: Zhenming Wang, Jun Zhu, Yuchen Yang, Ning Zhao

Abstract:

In the present paper, an alternative framework is proposed to construct a class of finite-difference multi-resolution nested weighted essentially non-oscillatory (WENO) schemes with increasingly higher orders of accuracy for solving the inviscid Euler equations. These WENO schemes first obtain a set of reconstruction polynomials from a hierarchy of nested central spatial stencils, and then recursively achieve a higher-order approximation through the lower-order WENO schemes. The linear weights of such WENO schemes can be set to any positive numbers whose sum equals one; they do not pollute the optimal order of accuracy in smooth regions and simultaneously suppress spurious oscillations near discontinuities. Numerical results indicate that these alternative finite-difference multi-resolution nested WENO schemes of different accuracies are very robust, with low dissipation, and use as few reconstruction stencils as possible while maintaining the same efficiency, achieving the high-resolution property without any equivalent multi-resolution representation. Besides, the finite volume form is easier to implement on unstructured grids.

Keywords: finite-difference, WENO schemes, high order, inviscid Euler equations, multi-resolution

Procedia PDF Downloads 128
30361 An Alternative Approach for Assessing the Impact of Cutting Conditions on Surface Roughness Using Single Decision Tree

Authors: S. Ghorbani, N. I. Polushin

Abstract:

In this study, an approach to identify the factors affecting surface roughness in a machining process is presented. The study is based on 81 observations of surface roughness over a wide range of cutting tools (conventional, with holes, with composite material), workpiece materials (AISI 1045 steel, AA2024 aluminum alloy, A48 class-30 gray cast iron), spindle speeds (630-1000 rpm), feed rates (0.05-0.075 mm/rev), depths of cut (0.05-0.15 mm), and tool overhangs (41-65 mm). A single decision tree (SDT) analysis was performed to identify the factors for predicting a surface roughness model, and the CART algorithm was employed for building and evaluating the regression tree. Results show that the single decision tree outperforms traditional regression models, with higher forecast accuracy.
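The core of a CART regression tree, as used above, is an exhaustive search for the split that minimizes the children's summed squared error. A sketch on hypothetical roughness data (the overhang threshold is invented for illustration):

```python
import numpy as np

def best_split(x, y):
    """Exhaustive CART-style search for the single split of one predictor
    that minimizes the summed squared error around the two child means."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_t, best_sse = None, np.inf
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue  # no valid threshold between equal values
        t = (xs[i] + xs[i - 1]) / 2
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t, best_sse

# Hypothetical roughness data: overhang above ~53 mm roughly doubles Ra.
overhang = np.array([41, 45, 47, 50, 53, 56, 60, 63, 65], dtype=float)
ra       = np.array([0.8, 0.9, 0.85, 0.95, 0.9, 1.8, 1.9, 2.0, 1.85])

t, sse = best_split(overhang, ra)
print(t, round(sse, 4))
```

CART grows the full tree by applying this search recursively over all predictors in each node and then pruning.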

Keywords: cutting condition, surface roughness, decision tree, CART algorithm

Procedia PDF Downloads 358
30360 Urban Landscape Composition and Configuration Dynamics and Expansion of Hawassa City, Ethiopia: Analysis Using Satellite Images and Spatial Metrics Approach

Authors: Berhanu Keno Terfa

Abstract:

To understand the consequences of urbanization, an accurate, long-term representation of urban dynamics is essential. Remote sensing data from multi-temporal satellite images, viz. TM (1987), TM (1995), ETM+ (2005), and OLI (2017), were used. An integrated method combining landscape metrics, built-up density, and urban growth type analysis was employed to analyze the pattern, process, and overall growth status of the city. The results showed that the built-up area increased by 541.3% between 1987 and 2017, at an average annual increment of 8.9%. The area of urban expansion in the city tripled during the 2005-2017 period compared with 1987-1995. The major growth took place in the east and southeast directions during the 1987-1995 period, whereas predominant built-up development was observed in the south and southeast directions during the 1995-2017 period. The analysis using landscape metrics and urban typologies showed that Hawassa experienced fragmented and irregular spatiotemporal urban growth patterns, mostly by extension, suggesting a strong tendency towards sprawl over the past three decades.

Keywords: Hawassa, spatial patterns, remote sensing, multi-temporal, urban sprawl

Procedia PDF Downloads 129
30359 Monitoring Blood Pressure Using Regression Techniques

Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim

Abstract:

Blood pressure gives physicians deep insight into the cardiovascular system, and the determination of individual blood pressure is a standard clinical procedure for cardiovascular problems. The conventional techniques to measure blood pressure (e.g., the cuff method) allow only a limited number of readings over a given period (e.g., every 5-10 minutes). Additionally, these systems disturb blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or for critically ill persons. In this paper, the most important statistical features of the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features with the highest correlation with blood pressure. The results show that the mean stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model relating both features to Systolic Blood Pressure (SBP) and Diastolic Blood Pressure (DBP) was obtained using a statistical regression technique. Surface fitting was used to best fit the series of data, and the results show an error of 4.95% in estimating the SBP and 3.99% in estimating the DBP.
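The PCA step described above can be sketched as an eigen-decomposition of the feature covariance matrix. The feature matrix below is synthetic, with two dominant columns standing in for the stiffness-index and heart-rate features:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical PPG feature matrix: 40 subjects x 12 features, where the first
# two columns (stand-ins for stiffness-index mean and beat-to-beat HR std)
# carry most of the variance.
n, p = 40, 12
F = rng.normal(0, 0.1, (n, p))
F[:, 0] += rng.normal(0, 3.0, n)
F[:, 1] += rng.normal(0, 2.0, n)

# PCA via eigen-decomposition of the covariance of the centered features.
Fc = F - F.mean(axis=0)
cov = Fc.T @ Fc / (n - 1)
eigval, eigvec = np.linalg.eigh(cov)          # eigenvalues in ascending order
explained = eigval[::-1] / eigval.sum()       # variance ratios, descending

print(np.round(explained[:3], 3))
```

The first two components absorbing nearly all the variance is the PCA signature of the two-feature result reported in the abstract.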

Keywords: blood pressure, noninvasive optical system, principal component analysis, PCA, continuous monitoring

Procedia PDF Downloads 147
30358 Multi-Criteria Evaluation of Integrated Renewable Energy Systems for Community-Scale Applications

Authors: Kuanrong Qiu, Sebnem Madrali, Evgueniy Entchev

Abstract:

To achieve satisfactory objectives in deploying integrated renewable energy systems, it is crucial to consider all parameters affecting design and decision-making. Multi-criteria evaluation is a reliable and efficient tool for reaching the most appropriate solution: the approach considers the influential factors and their relative importance in prioritizing alternatives. In this paper, a multi-criteria decision framework based on technical, economic, environmental, and reliability criteria is developed to evaluate and prioritize renewable energy technologies and configurations of their integrated systems for community applications, to identify their viability, and thus to support the adoption of clean energy technologies and decision-making regarding energy transitions and transition patterns. Case studies for communities in Canada show that resource availability and the configuration of the integrated systems significantly affect economic and environmental performance.

Keywords: multi-criteria, renewables, integrated energy systems, decision-making, model

Procedia PDF Downloads 72
30357 Empirical Investigations on Speed Differentiations of Traffic Flow: A Case Study on a Basic Freeway Segment of O-2 in Istanbul

Authors: Hamed Rashid Sarand, Kemal Selçuk Öğüt

Abstract:

Speed is one of the fundamental variables of road traffic flow and an important evaluation criterion for traffic analyses in several respects. In particular, varieties of the speed variable, such as average speed, free-flow speed, optimum (capacity) speed, and acceleration/deceleration speed, are explicitly considered in the analysis of both road safety and road capacity. With the purpose of examining the 'road speed versus maximum speed difference across lanes' and 'road flow rate versus maximum speed difference across lanes' relations in freeway traffic, this study presents a case study conducted on a basic freeway segment of the O-2 in Istanbul. The traffic data employed in this study were obtained from 5 remote traffic microwave sensors operated by the Istanbul Metropolitan Municipality. The study stretch is located between two successive freeway interchanges: Ümraniye and Kavacık. Daily traffic data from the summer months (July and August) of 4 years (2011-2014) are used. The speed data are analyzed in two main flow regimes: uncongested and congested flow. Regression analyses were carried out to examine the relationship between the maximum speed difference across lanes and road speed, implemented separately for uncongested and congested flows. Moreover, the relationship between the maximum speed difference across lanes and road flow rate was evaluated by regression analyses, again separately for uncongested and congested flows. It is concluded that there is a moderate relationship between the maximum speed difference across lanes and road speed in 50% of cases, and a moderate relationship between the maximum speed difference across lanes and road flow rate in 30% of cases. The maximum speed difference across lanes decreases as the road flow rate increases.

Keywords: maximum speed difference, regression analysis, remote traffic microwave sensor, speed differentiation, traffic flow

Procedia PDF Downloads 347
30356 Determinants of Poverty: A Logit Regression Analysis of Zakat Applicants

Authors: Zunaidah Ab Hasan, Azhana Othman, Abd Halim Mohd Noor, Nor Shahrina Mohd Rafien

Abstract:

Zakat is a portion of wealth contributed by financially able Muslims and distributed to predetermined recipients, chief among them the poor and the needy. The zakat fund is distributed with the objective of lifting recipients out of poverty. Because poverty is multidimensional and multifaceted, it is imperative that its causes are properly identified so that the assistance given by zakat authorities reaches the intended targets. Despite various studies undertaken to identify the poor correctly, there are reports of the poor not receiving the adequate assistance they require from zakat. Thus, this study examines the determinants of poverty among applicants for zakat assistance distributed by the State Islamic Religious Council in Malacca (SIRCM), a state in Malaysia. The respondents were drawn from the list of new zakat applicants for April and May 2014 provided by SIRCM. A binary logistic regression was estimated on these data, with the outcome of the zakat application (accepted or rejected) as the dependent variable and a set of demographic and health variables as the explanatory variables. Overall, the logistic model successfully predicted the factors behind acceptance of zakat applications. The independent variables gender, age, household size, and health significantly explain the likelihood of a successful zakat application. Among other things, the findings suggest the importance of providing education opportunities in helping the poor.
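The estimation step can be sketched as follows (a minimal sketch, not the SIRCM model; the single covariate and the data are hypothetical): a binary logistic regression fitted by gradient ascent on the Bernoulli log-likelihood.

```python
import math

# Hedged sketch: logistic regression with application outcome (1 = accepted)
# as the target and one standardized covariate (e.g. household size).

def fit_logistic(x, y, lr=0.1, epochs=2000):
    """Return (intercept, slope) by gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p           # gradient w.r.t. intercept
            g1 += (yi - p) * xi    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical data: larger households more likely to have accepted applications.
house_size = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
accepted   = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(house_size, accepted)
prob = 1.0 / (1.0 + math.exp(-(b0 + b1 * 2.0)))  # predicted acceptance probability
```

A positive fitted slope means the covariate raises the odds of a successful application, which is how the paper reads its significant coefficients.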

Keywords: logistic regression, zakat distribution, status of zakat applications, poverty, education

Procedia PDF Downloads 325
30355 A Geographic Information System Mapping Method for Creating Improved Satellite Solar Radiation Dataset Over Qatar

Authors: Sachin Jain, Daniel Perez-Astudillo, Dunia A. Bachour, Antonio P. Sanfilippo

Abstract:

The future of solar energy in Qatar is evolving steadily, so high-quality spatial solar radiation data is an essential requirement for any planning and commissioning of solar technology. Generally, two types of solar radiation data are available: satellite data and ground observations. Satellite solar radiation data are developed from physical and statistical models, while ground data are collected by solar radiation measurement stations. Ground data are of high quality, but they are limited to distributed point locations, with a high cost of installation and maintenance for the ground stations. Satellite solar radiation data, on the other hand, are continuous and available across geographical locations, but they are less accurate than ground data. To combine the advantages of both, a product has been developed here that provides spatial continuity and higher accuracy than either data source alone. The popular satellite database National Solar Radiation Database, NSRDB (PSM V3 model, spatial resolution 4 km), was chosen here for merging with ground-measured solar radiation in Qatar. The spatial distribution of ground measurement stations in Qatar is comprehensive, with a network of 13 ground stations. The monthly average of the daily total Global Horizontal Irradiation (GHI) component from ground and satellite data is used for the error analysis. Normalized root mean square error (NRMSE) values of 3.31%, 6.53%, and 6.63% were observed for October, November, and December 2019, respectively, when comparing in-situ and NSRDB data. The merging method is based on the Empirical Bayesian Kriging Regression Prediction model available in ArcGIS (ESRI). The workflow of the algorithm combines regression and kriging methods: a regression model (OLS, ordinary least squares) is fitted between the ground and NSRDB data points, a semi-variogram model is fitted to the experimental semi-variogram obtained from the residuals, and the kriged residuals are added to the NSRDB values predicted by the regression model to obtain the final predicted values. The NRMSE values obtained after merging are 1.84%, 1.28%, and 1.81% for October, November, and December 2019, respectively. One further explanatory variable, the ground elevation, has been incorporated into the regression and kriging methods to reduce the error and to provide higher spatial resolution (30 m). The final GHI maps have been created after merging, with NRMSE values of 1.24%, 1.28%, and 1.28% observed for October, November, and December 2019, respectively. The proposed merging method has proven highly accurate. An additional method is also proposed here to generate calibrated maps using the regression and kriging model, and then to use the calibrated model to generate solar radiation maps from the explanatory variable alone when not enough historical ground data are available for long-term analysis. The NRMSE values obtained after comparing the calibrated maps with ground data are 5.60% and 5.31% for November and December 2019, respectively.
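The regression-plus-interpolated-residuals workflow can be sketched in simplified form (ArcGIS's Empirical Bayesian Kriging Regression Prediction is considerably more elaborate; inverse-distance weighting stands in for kriging here, and all station coordinates and GHI values are hypothetical):

```python
# Simplified sketch of the merging idea: fit OLS of ground GHI on satellite
# GHI at station locations, interpolate the residuals spatially, and add the
# interpolated residual to the regression prediction at a target location.

def idw(target, points, values, power=2.0):
    """Inverse-distance-weighted interpolation (stand-in for kriging)."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return v
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

stations = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
sat_ghi = [5.8, 6.0, 6.2, 6.4]      # satellite monthly-mean GHI (kWh/m^2/day)
ground_ghi = [5.6, 5.9, 6.1, 6.4]   # co-located ground measurements

# OLS fit: ground = a + b * satellite
n = len(sat_ghi)
mx, my = sum(sat_ghi) / n, sum(ground_ghi) / n
b = sum((s - mx) * (g - my) for s, g in zip(sat_ghi, ground_ghi)) / \
    sum((s - mx) ** 2 for s in sat_ghi)
a = my - b * mx
residuals = [g - (a + b * s) for s, g in zip(sat_ghi, ground_ghi)]

# Merged prediction at an unmeasured point with satellite value 6.1:
target = (0.5, 0.5)
merged = a + b * 6.1 + idw(target, stations, residuals)
```

The residual correction is what pulls the continuous satellite surface toward the more accurate ground observations.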

Keywords: global horizontal irradiation, GIS, empirical Bayesian kriging regression prediction, NSRDB

Procedia PDF Downloads 75
30354 Impact Factor Analysis for Spatially Varying Aerosol Optical Depth in Wuhan Agglomeration

Authors: Wenting Zhang, Shishi Liu, Peihong Fu

Abstract:

As an indicator of air quality directly related to ground-level PM2.5 concentration, the spatial-temporal variation of Aerosol Optical Depth (AOD) and its impact factors have been a focus of air pollution research. This paper addresses the non-stationarity and spatial autocorrelation (Moran's I index of 0.75) of AOD in the Wuhan agglomeration (WHA), in central China, and uses geographically weighted regression (GWR) to identify the spatially varying relationships between AOD and its impact factors. The 3 km AOD product of the Moderate Resolution Imaging Spectroradiometer (MODIS) is used in this study. Beyond economic-social factors, land use density factors, vegetation cover, and elevation, a landscape metric is also considered as a factor. The results suggest that the GWR model is capable of capturing spatially varying relationships, with R-squared, corrected Akaike Information Criterion (AICc), and standardized residuals better than those of the ordinary least squares (OLS) model. The GWR results suggest that urban development, forest cover, the landscape metric, and elevation are the major driving factors of AOD. In general, higher AOD tends to occur in areas with more urban development, less forest cover, and flatter terrain.
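The core of GWR is that each location gets its own regression, with observations weighted by a kernel of their distance to that location. A minimal single-predictor sketch with a Gaussian kernel and hypothetical AOD data:

```python
import math

# Hedged GWR sketch (one predictor, hypothetical grid): local weighted least
# squares lets the urban-density coefficient vary across space.

def gwr_local_fit(loc, coords, x, y, bandwidth=1.0):
    """Weighted least squares of y on x with Gaussian spatial weights."""
    w = [math.exp(-((cx - loc[0]) ** 2 + (cy - loc[1]) ** 2) / (2 * bandwidth ** 2))
         for cx, cy in coords]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    slope = sxy / sxx
    return my - slope * mx, slope  # (local intercept, local slope)

# Hypothetical grid: AOD rises with urban density in the west cluster but
# falls with it in the east cluster (non-stationary relationship).
coords = [(0, 0), (1, 0), (0, 1), (1, 1), (4, 0), (5, 0), (4, 1), (5, 1)]
urban = [0.2, 0.4, 0.3, 0.5, 0.2, 0.4, 0.3, 0.5]
aod = [0.30, 0.40, 0.35, 0.45, 0.50, 0.40, 0.45, 0.35]

_, slope_west = gwr_local_fit((0.5, 0.5), coords, urban, aod)
_, slope_east = gwr_local_fit((4.5, 0.5), coords, urban, aod)
```

A global OLS fit would average these opposite-signed local slopes away, which is exactly the non-stationarity GWR is designed to expose.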

Keywords: aerosol optical depth, geographically weighted regression, land use change, Wuhan agglomeration

Procedia PDF Downloads 347
30353 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques

Authors: Jonathan Iworiso

Abstract:

Forecasting the equity premium out-of-sample is a major concern for researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has generated several controversies among scholars over the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding-window method. A broad category of sophisticated regression models involving model complexity was employed. The RT models, comprising Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. The models provide meaningful economic information on mean-variance portfolio investment for investors who time the market to earn future gains at minimal risk. Thus, the forecasting models appear to serve an investor who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using equity premium forecasts at minimal risk.
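The recursive expanding-window procedure can be sketched as follows (single predictor with a ridge-style penalty; the paper's models and data are far richer, and the series below is hypothetical):

```python
# Hedged sketch: at each step the model is re-fitted on all observations up to
# month t and produces an out-of-sample forecast for month t.

def ridge_fit(x, y, lam=0.1):
    """Ridge regression for one predictor: slope = Sxy / (Sxx + lam)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / (sxx + lam)
    return my - slope * mx, slope

# Hypothetical predictor (e.g. a valuation ratio) and monthly equity premium.
predictor = [1.0, 1.2, 0.9, 1.1, 1.3, 1.0, 1.2, 1.4, 1.1, 1.3]
premium = [0.5, 0.6, 0.4, 0.55, 0.65, 0.5, 0.6, 0.7, 0.55, 0.65]

forecasts = []
for t in range(5, len(premium)):            # expanding window: 5 obs minimum
    a, b = ridge_fit(predictor[:t], premium[:t])
    forecasts.append(a + b * predictor[t])  # out-of-sample forecast for month t

errors = [f - actual for f, actual in zip(forecasts, premium[5:])]
mse = sum(e * e for e in errors) / len(errors)
```

Comparing this MSE against that of the expanding historical average is the standard statistical-predictability test the paper builds on.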

Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains

Procedia PDF Downloads 86
30352 Naïve Bayes: A Classical Approach for the Epileptic Seizures Recognition

Authors: Bhaveek Maini, Sanjay Dhanka, Surita Maini

Abstract:

Electroencephalography (EEG) is used worldwide to detect epileptic seizures. Identifying an epileptic seizure by manual EEG analysis is a crucial and laborious task for the neurologist, as it takes considerable effort and time, and the risk of human error is high because signal acquisition requires manual intervention. Disease diagnosis using machine learning (ML) has been explored continuously since its inception, and where large numbers of records have to be analyzed, ML is proving a boon for doctors. In this research paper, the authors propose two ML models, logistic regression (LR) and Naïve Bayes (NB), to predict epileptic seizures based on general parameters. The two techniques are applied to the epileptic seizure recognition dataset available in the UCI ML repository. The algorithms are trained on an 80:20 train-test split (80% for training and 20% for testing), and the performance of the models was validated by 10-fold cross-validation. The proposed study reports accuracies of 81.87% and 95.49% for LR and NB, respectively.
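A Gaussian Naïve Bayes classifier of the kind used here can be sketched from scratch (the two-feature arrays are hypothetical stand-ins for the UCI dataset):

```python
import math

# Hedged sketch: Gaussian Naive Bayes assumes feature independence within a
# class and classifies by the highest log-posterior.

def fit_gnb(X, y):
    """Per-class feature means/variances plus class priors."""
    model = {}
    for c in set(y):
        rows = [x for x, yi in zip(X, y) if yi == c]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        vars_ = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
                 for col, m in zip(zip(*rows), means)]
        model[c] = (means, vars_, len(rows) / len(X))
    return model

def predict_gnb(model, x):
    """Return the class with the highest log-posterior."""
    best, best_lp = None, float("-inf")
    for c, (means, vars_, prior) in model.items():
        lp = math.log(prior)
        for v, m, s2 in zip(x, means, vars_):
            lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Hypothetical EEG summary features: [amplitude, variance]; 1 = seizure.
X = [[0.2, 0.1], [0.3, 0.2], [0.25, 0.15], [2.0, 1.5], [2.2, 1.8], [1.9, 1.6]]
y = [0, 0, 0, 1, 1, 1]
model = fit_gnb(X, y)
pred = predict_gnb(model, [2.1, 1.7])
```

Despite the strong independence assumption, NB often competes well with logistic regression on tabular signal features, consistent with the accuracies the paper reports.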

Keywords: epileptic seizure recognition, logistic regression, Naïve Bayes, machine learning

Procedia PDF Downloads 43
30351 An Experiential Learning of Ontology-Based Multi-document Summarization by Removal Summarization Techniques

Authors: Pranjali Avinash Yadav-Deshmukh

Abstract:

The remarkable development of the Internet, along with technological innovations such as high-speed systems and affordable large-scale storage, has led to a tremendous increase in the amount of digital documents and in access to them. For any person, reading through all of this data is extremely time-intensive, so there is a great need for effective multi-document summarization (MDS) systems that can condense the information found in several documents into a short, understandable summary. Our system provides a principled structure for the semantic representation of textual information in an ontology, and the suitability of ontologies for multi-document summarization problems in the disaster management domain motivates the proposed design. A saliency score is assigned to each sentence, sentences are ranked by that score, and the top-ranked sentences are selected as the summary. Regarding summary quality, extensive tests were carried out on a collection of news reports on the 2014 Jammu and Kashmir floods. Ontology-based multi-document summarization methods using NLP-based extraction outperform other baselines. Our contribution to the proposed component is to apply information extraction (NLP) techniques to improve the results.
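The saliency-ranking step can be sketched with a plain frequency-based score (the ontology and NLP layers of the proposed system are omitted; the documents are hypothetical):

```python
import re
from collections import Counter

# Hedged sketch of extractive MDS: score each sentence by the average corpus
# frequency of its words, rank, and keep the top sentences as the summary.

def summarize(documents, n_sentences=2):
    """Return the n top-scoring sentences across all documents."""
    sentences = [s.strip() for doc in documents
                 for s in re.split(r"(?<=[.!?])\s+", doc) if s.strip()]
    words = Counter(w for s in sentences
                    for w in re.findall(r"[a-z]+", s.lower()))
    def saliency(s):
        toks = re.findall(r"[a-z]+", s.lower())
        return sum(words[t] for t in toks) / max(len(toks), 1)
    ranked = sorted(sentences, key=saliency, reverse=True)
    return ranked[:n_sentences]

docs = [
    "Flood waters rose across the valley. Rescue teams reached stranded villages.",
    "The flood damaged roads and bridges. Officials coordinated the flood relief.",
]
summary = summarize(docs, n_sentences=2)
```

In the full system, ontology concepts (e.g. disaster-management classes) would replace raw word frequencies in the saliency score.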

Keywords: disaster management, extraction technique, k-means, multi-document summarization, NLP, ontology, sentence extraction

Procedia PDF Downloads 365
30350 NFResNet: Multi-Scale and U-Shaped Networks for Deblurring

Authors: Tanish Mittal, Preyansh Agrawal, Esha Pahwa, Aarya Makwana

Abstract:

Multi-scale and U-shaped networks are widely used in various image restoration problems, including deblurring. Keeping in mind the wide range of applications, we present a comparison of these architectures and their effects on image deblurring. We also introduce a new block, called NFResBlock, which consists of a Fast Fourier Transform layer and a series of modified Nonlinear Activation Free blocks. Based on these architectures and additions, we introduce NFResNet and NFResNet+, which are modified multi-scale and U-Net architectures, respectively. We use three different loss functions to train these architectures: Charbonnier loss, Edge loss, and Frequency Reconstruction loss. Extensive experiments on the Deep Video Deblurring dataset, along with ablation studies for each component, are presented in this paper. The proposed architectures achieve a considerable increase in Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) values.
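Of the three losses, the Charbonnier loss is the simplest to state: a smooth, differentiable variant of L1 that stays robust near zero error. A minimal sketch over flat lists of hypothetical pixel values:

```python
import math

# Hedged sketch of the Charbonnier loss; real training operates on tensors,
# which are stood in for here by flat lists of pixel intensities.

def charbonnier_loss(pred, target, eps=1e-3):
    """Mean of sqrt((pred - target)^2 + eps^2) over all pixels."""
    return sum(math.sqrt((p - t) ** 2 + eps ** 2)
               for p, t in zip(pred, target)) / len(pred)

restored = [0.50, 0.62, 0.48, 0.91]  # hypothetical deblurred pixels
sharp = [0.52, 0.60, 0.47, 0.90]     # hypothetical ground-truth pixels
loss = charbonnier_loss(restored, sharp)
```

For errors much larger than eps the loss behaves like L1; the eps term keeps the gradient well-defined at zero, unlike plain L1.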

Keywords: multi-scale, Unet, deblurring, FFT, resblock, NAF-block, nfresnet, charbonnier, edge, frequency reconstruction

Procedia PDF Downloads 108
30349 The AI Arena: A Framework for Distributed Multi-Agent Reinforcement Learning

Authors: Edward W. Staley, Corban G. Rivera, Ashley J. Llorens

Abstract:

Advances in reinforcement learning (RL) have resulted in recent breakthroughs in the application of artificial intelligence (AI) across many different domains. An emerging landscape of development environments is making powerful RL techniques more accessible for a growing community of researchers. However, most existing frameworks do not directly address the problem of learning in complex operating environments, such as dense urban settings or defense-related scenarios, that incorporate distributed, heterogeneous teams of agents. To help enable AI research for this important class of applications, we introduce the AI Arena: a scalable framework with flexible abstractions for distributed multi-agent reinforcement learning. The AI Arena extends the OpenAI Gym interface to allow greater flexibility in learning control policies across multiple agents with heterogeneous learning strategies and localized views of the environment. To illustrate the utility of our framework, we present experimental results that demonstrate performance gains due to a distributed multi-agent learning approach over commonly-used RL techniques in several different learning environments.
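The extended step/reset pattern for multiple agents with localized observations can be sketched as follows (a toy environment in the Gym style, not the AI Arena API itself):

```python
import random

# Hedged sketch: a Gym-like reset/step loop generalized to two agents with
# per-agent observations and heterogeneous (here, opposed) reward signals.

class TwoAgentTagEnv:
    """Toy pursuit on a line: each agent observes only its own position."""
    def reset(self):
        self.pos = [0, 5]
        self.t = 0
        return [self.pos[0], self.pos[1]]        # one local observation per agent

    def step(self, actions):                     # one action per agent
        for i, a in enumerate(actions):
            self.pos[i] = max(0, min(9, self.pos[i] + a))
        self.t += 1
        dist = abs(self.pos[0] - self.pos[1])
        rewards = [-dist, dist]                  # heterogeneous objectives
        done = dist == 0 or self.t >= 20
        return [self.pos[0], self.pos[1]], rewards, done

env = TwoAgentTagEnv()
obs = env.reset()
done = False
while not done:                                  # random policies for both agents
    actions = [random.choice([-1, 0, 1]) for _ in obs]
    obs, rewards, done = env.step(actions)
```

A distributed framework would place each agent's policy (and learner) in its own process and route only that agent's local observation and reward to it.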

Keywords: reinforcement learning, multi-agent, deep learning, artificial intelligence

Procedia PDF Downloads 143
30348 Development of Generalized Correlation for Liquid Thermal Conductivity of N-Alkane and Olefin

Authors: A. Ishag Mohamed, A. A. Rabah

Abstract:

The objective of this research is to develop a generalized correlation for predicting the thermal conductivity of n-alkanes and alkenes. Little research exists on this topic, and correlations for the thermal conductivity of liquids are scarce in the open literature. The available experimental data were collected, covering the n-alkane and alkene groups. The data were correlated to temperature using the Filippov correlation, and nonparametric regression with the Grace algorithm was used to develop the generalized correlation model. A spreadsheet program based on Microsoft Excel was used to plot the data and calculate the coefficients. The results were compared with the data found in Perry's Chemical Engineers' Handbook. The experimental data were correlated over the temperature range 273.15 to 673.15 K, with R² = 0.99. The developed correlation reproduced experimental data that were not included in the regression with an absolute average percent deviation (AAPD) of less than 7%, so the spreadsheet is quite accurate and produces reliable data.
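The fitting and validation steps can be sketched as a least-squares temperature fit scored by absolute average percent deviation (AAPD); the conductivity values below are hypothetical, not the paper's:

```python
# Hedged sketch: linear temperature correlation k = a + b*T for a liquid's
# thermal conductivity, validated by AAPD against the data.

def fit_linear(T, k):
    """Least-squares fit k = a + b*T; returns (a, b)."""
    n = len(T)
    mT, mk = sum(T) / n, sum(k) / n
    b = sum((t - mT) * (ki - mk) for t, ki in zip(T, k)) / \
        sum((t - mT) ** 2 for t in T)
    return mk - b * mT, b

def aapd(actual, predicted):
    """Absolute average percent deviation, in percent."""
    return 100 * sum(abs((x - p) / x)
                     for x, p in zip(actual, predicted)) / len(actual)

T = [280.0, 320.0, 360.0, 400.0, 440.0]   # temperature (K)
k = [0.140, 0.131, 0.121, 0.112, 0.104]   # W/(m.K), hypothetical n-alkane data
a, b = fit_linear(T, k)
pred = [a + b * t for t in T]
dev = aapd(k, pred)
```

The negative slope reflects the usual decrease of liquid thermal conductivity with temperature, and AAPD is the same held-out accuracy measure the paper quotes (under 7%).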

Keywords: n-alkanes, n-alkenes, nonparametric regression

Procedia PDF Downloads 644
30347 A Regression Analysis Study of the Applicability of Side Scan Sonar based Safety Inspection of Underwater Structures

Authors: Chul Park, Youngseok Kim, Sangsik Choi

Abstract:

This study developed an electric jig for underwater structure inspection to address the problems in applying side scan sonar to underwater inspection, and analyzed correlations in empirical data in order to enhance sonar data resolution. The electric jig was developed because it is difficult to inspect a cross-section with tow-type equipment during an inspection. With the electric jig, it was possible to shorten inspection time by over 20% compared to conventional tow-type side scan sonar, and to inspect the desired cross-section through accurate angle control. An indoor test conducted to enhance sonar data resolution showed that water depth, the distance from the underwater structure, and the imaging angle influence resolution and data quality. Based on the data accumulated through field experience, a multiple regression analysis was conducted on the correlations among the three variables. As a result, a relational equation for sonar operation as a function of water depth was derived.
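The multiple regression step can be sketched for two of the three variables (closed-form OLS with two centered predictors; the depth, distance, and resolution scores are hypothetical):

```python
# Hedged sketch: two-predictor multiple regression relating a sonar image
# quality score to water depth and stand-off distance.

def ols2(x1, x2, y):
    """OLS for y = a + b1*x1 + b2*x2 via centered normal equations."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [v - m1 for v in x1]
    c2 = [v - m2 for v in x2]
    cy = [v - my for v in y]
    s11 = sum(v * v for v in c1)
    s22 = sum(v * v for v in c2)
    s12 = sum(u * v for u, v in zip(c1, c2))
    s1y = sum(u * v for u, v in zip(c1, cy))
    s2y = sum(u * v for u, v in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    b1 = (s1y * s22 - s2y * s12) / det
    b2 = (s2y * s11 - s1y * s12) / det
    return my - b1 * m1 - b2 * m2, b1, b2

depth = [2.0, 4.0, 6.0, 2.0, 4.0, 6.0]              # water depth (m)
distance = [1.0, 1.0, 1.0, 3.0, 3.0, 3.0]           # stand-off distance (m)
resolution = [8.25, 7.25, 6.25, 6.75, 5.75, 4.75]   # quality score (hypothetical)
a, b_depth, b_dist = ols2(depth, distance, resolution)
```

Negative coefficients on both predictors match the indoor-test finding that greater depth and stand-off degrade resolution.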

Keywords: underwater structure, SONAR, safety inspection, resolution

Procedia PDF Downloads 250
30346 Agile Project Management: A Real Application in a Multi-Project Research and Development Center

Authors: Aysegul Sarac

Abstract:

The aim of this study is to analyze the impacts of integrating agile development principles and practices, in particular on reducing project lead time in a multi-project environment. We analyze the Arçelik Washing Machine R&D Center, in which multiple projects are conducted with shared resources. In the first part of the study, we illustrate the current waterfall system using a value stream map: we define all activities, from the first idea of the project through to the customer, and measure the process time and lead time of projects. In the second part of the study, we estimate potential improvements and select a set of them for integrating agile principles. We then develop a future state map and analyze the impacts of integrating lean principles on project lead time. The main contribution of this study is that we analyze and integrate agile product development principles in a real multi-project system.

Keywords: agile project management, multi project system, project lead time, product development

Procedia PDF Downloads 284
30345 An Approach To Flatten The Gain Of Fiber Raman Amplifiers With Multi-Pumping

Authors: Surinder Singh, Adish Bindal

Abstract:

The effects of the pump wavelengths and their powers on the gain flattening of a fiber Raman amplifier (FRA) are investigated. A multi-wavelength pumping scheme is utilized to achieve gain flatness in the FRA, and it is shown that gain flatness improves as the number of pump wavelengths increases. We have achieved a flat gain with 0.27 dB fluctuation over the spectral range 1475-1600 nm for a Raman fiber length of 10 km by using six pumps with wavelengths within the 1385-1495 nm interval. The effect of the multi-wavelength pumping scheme on gain saturation in the FRA is also studied; gain saturation behavior improves under this scheme, which is therefore more useful for longer spans of Raman fiber.
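The quoted flatness figure (0.27 dB) is a peak-to-peak gain ripple over the signal band, which can be computed directly from sampled gains (the gain values below are hypothetical, not the paper's simulated FRA results):

```python
# Hedged sketch of the gain-flatness metric: the fluctuation is the spread
# (max minus min) of the gain across the signal wavelengths.

def gain_fluctuation(gains_db):
    """Peak-to-peak gain ripple in dB across the signal band."""
    return max(gains_db) - min(gains_db)

# Gain (dB) sampled across 1475-1600 nm: one pump vs. six pumps (hypothetical).
single_pump = [10.2, 11.5, 12.4, 12.9, 12.1, 10.8]
six_pump = [12.10, 12.25, 12.30, 12.37, 12.22, 12.15]
ripple = gain_fluctuation(six_pump)
```

Spreading the pumps across the 1385-1495 nm band overlaps their individual Raman gain profiles, which is what shrinks this ripple relative to single-pump operation.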

Keywords: FRA, WDM, pumping, flat gain

Procedia PDF Downloads 461