Search results for: noise estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2973

693 Optimization-Based Design Improvement of Synchronizer in Transmission System for Efficient Vehicle Performance

Authors: Sanyka Banerjee, Saikat Nandi, P. K. Dan

Abstract:

Synchronizers, as an integral part of the gearbox, are key elements of the automotive transmission system. Synchronizer performance affects transmission efficiency and driving comfort, and the synchronizing mechanism, as a major component of the transmission system, must be capable of preventing vibration and noise in the gears. Improving gear-shifting efficiency to achieve smooth, quick, and energy-efficient power transmission remains a challenge for the automotive industry. Synchronizer performance depends on the features and characteristics of its sub-components, so the contribution of these characteristics must be analysed. An important exercise is to identify all such characteristics or factors associated with the modeling and analysis; for this purpose, the literature was reviewed rather extensively to study the mathematical models formulated around them. Certain factors are common across models, while a few factors have been selected specifically for individual models, as reported. To obtain a more realistic model, an attempt has been made here to identify and assimilate practically all factors that may be considered in formulating the model more comprehensively. A simulation study, formulated as a block model, has been carried out for this analysis in a reliable environment such as MATLAB. A lower synchronization time is desirable and has therefore been considered the output factor in the simulation model for evaluating transmission efficiency. An improved synchronizer model requires optimized values of the sub-component design parameters, so a parametric optimization using response data from a Taguchi design of experiments and their analysis has been carried out. The effectiveness of the optimized parameters for improved synchronizer performance has been validated by simulating the synchronizer block model with the improved parameter values as inputs, for better transmission efficiency and driver comfort.
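As an illustration of the smaller-the-better signal-to-noise ratio used in the Taguchi analysis described above, the following Python sketch computes S/N values for hypothetical synchronization-time responses; the trial values, number of replicates, and array layout are assumptions for illustration only, not data from the study.

```python
import numpy as np

# Hypothetical synchronization-time responses (s) for three Taguchi trials,
# three replicates each -- illustrative numbers only.
sync_times = np.array([
    [0.42, 0.45, 0.44],
    [0.38, 0.40, 0.39],
    [0.51, 0.49, 0.50],
])

# Smaller-the-better S/N ratio: S/N = -10 * log10(mean(y^2)) per trial.
sn_ratio = -10.0 * np.log10(np.mean(sync_times**2, axis=1))
print(sn_ratio)  # a higher S/N indicates a lower, more consistent synchronization time
```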

Keywords: design of experiments, modeling, parametric optimization, simulation, synchronizer

Procedia PDF Downloads 311
692 Estimation of Effective Radiation Dose Following Computed Tomography Urography at Aminu Kano Teaching Hospital, Kano, Nigeria

Authors: Idris Garba, Aisha Rabiu Abdullahi, Mansur Yahuza, Akintade Dare

Abstract:

Background: CT urography (CTU) is an efficient radiological examination for the evaluation of urinary system disorders. However, patients are exposed to a significant radiation dose, which is associated with increased cancer risks. Objectives: To determine the computed tomography dose index following CTU and to evaluate organ equivalent doses. Materials and Methods: A prospective cohort study was carried out at a tertiary institution located in Kano, northwestern Nigeria. Ethical clearance was sought and obtained from the research ethics board of the institution. Demographic, scan parameter, and CT radiation dose data were obtained from patients who underwent the CTU procedure. Effective dose, organ equivalent doses, and cancer risks were estimated using SPSS statistical software version 16 and CT dose calculator software. Result: A total of 56 patients were included in the study, consisting of 29 males and 27 females. The most common indication for CTU examination was found to be renal cyst, seen commonly among young adults (15-44 yrs). The CT radiation dose values for CTU were a DLP of 2320 mGy cm, a CTDIw of 9.67 mGy, and an effective dose of 35.04 mSv. The probability of cancer risk was estimated at 600 per million CTU examinations. Conclusion: In this study, the radiation dose for CTU is considered significantly high, with an increase in cancer risk probability. Wide variations between patient doses suggest that optimization has not yet been achieved. Patient radiation dose estimates should be taken into consideration when imaging protocols are established for CT urography.
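As a quick check of the reported effective dose, the effective dose for a body CT examination is commonly approximated as the dose-length product multiplied by a region-specific conversion coefficient; the sketch below assumes the widely tabulated adult abdomen/pelvis value of about 0.015 mSv/(mGy·cm), which is an assumption rather than the coefficient used by the authors.

```python
# Effective dose approximated from the dose-length product (DLP).
dlp_mgy_cm = 2320.0      # DLP reported in the study, mGy*cm
k_abdomen = 0.015        # assumed adult abdomen/pelvis conversion coefficient, mSv/(mGy*cm)

effective_dose_msv = k_abdomen * dlp_mgy_cm
print(f"Estimated effective dose: {effective_dose_msv:.1f} mSv")  # ~34.8 mSv, close to the reported 35.04 mSv
```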

Keywords: CT urography, cancer risks, effective dose, radiation exposure

Procedia PDF Downloads 345
691 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. Its influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed threshold using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
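A minimal sketch of the clustering idea, using scikit-learn's variational Dirichlet-process Gaussian mixture as a stand-in for the authors' Dirichlet Process Mixture of Generalized Linear regression; the simulated speed/occupancy values and the number of candidate components are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated 15-minute detector records: columns are speed (mph) and occupancy (%).
speed = np.concatenate([rng.normal(65, 3, 300), rng.normal(52, 4, 200), rng.normal(35, 6, 100)])
occ = np.concatenate([rng.normal(8, 2, 300), rng.normal(18, 4, 200), rng.normal(35, 8, 100)])
X = np.column_stack([speed, occ])

# Dirichlet-process prior: start with more components than expected and let the
# variational inference shrink the weights of the unused ones.
dpm = BayesianGaussianMixture(
    n_components=8,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

labels = dpm.predict(X)
print(np.round(dpm.weights_, 3))  # only a few components retain appreciable weight
```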

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 294
690 Analysis of Lift Arm Failure and Its Improvement for the Use in Farm Tractor

Authors: Japinder Wadhawan, Pradeep Rajan, Alok K. Saran, Navdeep S. Sidhu, Daanvir K. Dhir

Abstract:

Currently, the research focus in the development of agricultural equipment and tractor parts in India is on innovation and the use of alternative materials such as austempered ductile iron (ADI). The three-point linkage mechanism of the tractor is subjected to unpredictable load conditions in the field, and one of the critical components vulnerable to failure is the lift arm. Conventionally, the lift arm is manufactured either by forging or by casting (SG iron), and the main objective of the present work is to reduce failure occurrences in the lift arm by changing the manufacturing material to ADI without changing the existing design. The effect of four pertinent variables of ADI manufacturing, viz. austenitizing temperature, austenitizing time, austempering temperature, and austempering time, was investigated using the Taguchi method for design of experiments. To analyze the effect of the parameters on the mechanical properties, the mean and the signal-to-noise (S/N) ratio were calculated based on a design of experiments with an L9 orthogonal array and the linear graph. The best combination for achieving the desired mechanical properties of the lift arm is austenitization at 860°C for 90 minutes and austempering at 350°C for 60 minutes. Results showed that the developed component has a tensile strength of 925 MPa, 7.8 per cent elongation, and a toughness of 120 joules, making it a more suitable material for lift arm manufacturing. A confirmatory experiment was performed and showed good agreement between the predicted and experimental values. In addition, a CAD model of the existing design was developed in computer-aided design software, and structural loading calculations were performed with a commercial finite element analysis package. An optimized shape of the lift arm has also been proposed, resulting in a lighter and cheaper product than the existing design that can withstand the same loading conditions effectively.

Keywords: austempered ductile iron, design of experiment, finite element analysis, lift arm

Procedia PDF Downloads 233
689 Wood Energy in Bangladesh: An Overview of Status, Challenges and Development

Authors: Md. Kamrul Hassan, Ari Pappinen

Abstract:

Wood energy is the single most important form of renewable energy in many parts of the world, especially in least developed countries in South Asia such as Bangladesh. A large portion of the national population of this country depends on wood energy for its daily primary energy needs. This paper deals with the estimation of wood fuel use at the current level and identifies the challenges and strategies related to the development of this resource. Desk research, interactive research, and a field survey were conducted to gather and analyze data for this study. The study revealed that wood fuel plays a significant role in the total primary energy supply of Bangladesh; its contribution to final energy consumption in 2013 was about 24%. Trees in homestead areas, secondary plantations on off-forest lands, and forests are the main sources of wood fuel in the country. An insufficient supply of wood fuel against steadily rising demand is the main cause of concern for sustainable consumption, which eventually leads to deterioration and depletion of the resource. Inadequate afforestation programmes, a lack of initiatives towards the utilization of set-aside lands for wood energy plantations, and inefficient management of the existing resources have been identified as the major impediments to the development of wood energy in Bangladesh. The study argues that enhancing public-private-partnership afforestation programmes, planting waste and marginal lands with short-rotation tree species, and formulating biomass-based rural energy strategies at the regional level are relevant to the promotion of sustainable wood energy in the country.

Keywords: Bangladesh, challenge, supply, wood energy

Procedia PDF Downloads 188
688 Policy Effectiveness in the Situation of Economic Recession

Authors: S. K. Ashiquer Rahman

Abstract:

Proper policy handling may fail to attain its target during some recessions, e.g., pandemic-led crises, because of shocks to economic variables. In such a situation, the central bank implements monetary policy, increasing exogenous expenditure and the level of the money supply in turn to boost economic growth; the question is whether monetary policy is relatively more effective than fiscal policy in altering a country's real output growth, or whether both are relatively effective in driving the output growth of a country. The dispute over the relationship between monetary policy and fiscal policy is centered on the inflationary penalty of deficit financing by the fiscal authority. Facing the latest economic shocks as well as the pandemic-led crises, central banks around the world confronted a general dilemma between raising rates and lowering rates to sustain economic activity. While prices remained fundamentally unaffected, aggregate demand was also significantly depressed by the outbreak of the COVID-19 pandemic. To empirically investigate the effects of the economic shocks associated with the COVID-19 pandemic, the paper considers the effectiveness of monetary policy and fiscal policy as linked to the adjustment mechanisms of different economic variables. To examine the effects of these shocks on the effectiveness of monetary policy and fiscal policy in driving the output growth of a country, this paper uses a simultaneous equations model estimated by the Two-Stage Least Squares (2SLS) and Ordinary Least Squares (OLS) methods.
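A minimal sketch of the two-stage least squares idea on simulated data; the variable names (a fiscal variable instrumented by two exogenous series) and the coefficient values are assumptions, not the paper's specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
z = rng.normal(size=(n, 2))                           # instruments (assumed exogenous)
gov = z @ np.array([0.8, 0.3]) + rng.normal(size=n)   # endogenous fiscal variable
gdp = 0.5 * gov + rng.normal(size=n)                  # output growth, true slope 0.5

# Stage 1: regress the endogenous regressor on the instruments (plus a constant).
stage1 = sm.OLS(gov, sm.add_constant(z)).fit()
gov_hat = stage1.fittedvalues

# Stage 2: regress output growth on the first-stage fitted values.
# Note: this manual second stage gives the 2SLS point estimate, but its reported
# standard errors are not the correct 2SLS standard errors.
stage2 = sm.OLS(gdp, sm.add_constant(gov_hat)).fit()
print(stage2.params)  # slope should be close to 0.5
```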

Keywords: IS-LM framework, pandemic, economic variable shocks, simultaneous equations model, output growth

Procedia PDF Downloads 95
687 On the Accuracy of Basic Modal Displacement Method Considering Various Earthquakes

Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar

Abstract:

Time history seismic analysis is considered the most accurate method to predict the seismic demand of structures. On the other hand, its main deficiency is the computational time required to reach a result. When it is applied in an optimization process, in which the structure must be analyzed thousands of times, reducing the computational time of the seismic analysis makes the optimization algorithms more practical. Approximate methods inevitably produce some error in comparison with exact time history analysis, but methods such as the Complete Quadratic Combination (CQC) and the Square Root of the Sum of Squares (SRSS) drastically reduce the computational time by combining the peak responses of each mode. In the present research, the Basic Modal Displacement (BMD) method is introduced and applied to the estimation of the seismic demand of a main structure. The seismic demand of a sampled structure is estimated from the modal displacements of a basic structure (for which the modal displacements have been calculated). Shear steel sampled structures are selected as case studies. The error of the introduced method is calculated by comparing the estimated seismic demands with exact time history dynamic analysis. The efficiency of the proposed method is demonstrated by the application of three types of earthquakes (classified in view of the time of peak ground acceleration).
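For reference, the SRSS combination mentioned above reduces to taking the square root of the sum of the squared peak modal responses; the sketch below uses hypothetical peak modal displacements, not values from the case studies.

```python
import numpy as np

# Hypothetical peak roof displacements (m) from the first three modes of a shear frame.
peak_modal = np.array([0.120, 0.040, 0.015])

# SRSS: peak total response ~ sqrt(sum of squared modal peaks).
srss = np.sqrt(np.sum(peak_modal**2))
print(f"SRSS estimate of the peak displacement: {srss:.4f} m")
```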

Keywords: time history dynamic analysis, basic modal displacement, earthquake-induced demands, shear steel structures

Procedia PDF Downloads 355
686 Early Evaluation of Long-Span Suspension Bridges Using Smartphone Accelerometers

Authors: Ekin Ozer, Maria Q. Feng, Rupa Purasinghe

Abstract:

Structural deterioration of bridge systems poses an ongoing threat to transportation networks. Moreover, the integrity and safety of landmark bridges are about more than functionality alone, since these bridges provide a strong presence for society and nations. Therefore, an innovative and sustainable method to inspect landmark bridges is essential to ensure their resiliency in the long run. In this paper, a recently introduced concept, smartphone-based modal frequency estimation, is addressed, and the paper aims to validate the fidelity of smartphone-based vibration measurements gathered from three landmark suspension bridges. First, smartphones located at the bridge mid-span are adopted as portable and standalone vibration measurement devices. Their embedded accelerometers are then utilized to gather the vibration response under operational loads, and eventually the frequency domain characteristics are deduced. The preliminary analysis results are compared with reference publications and high-quality monitoring data to validate the usability of smartphones on long-span landmark suspension bridges. If technical challenges such as the long vibration periods, low-amplitude excitation, embedded smartphone sensor features, sampling, and citizen engagement are tackled, smartphones can provide a novel and cost-free crowdsourcing tool for the maintenance of these landmark structures. This study presents early-phase findings from three signature structures located in the United States.
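A minimal sketch of how a modal frequency could be picked from a smartphone acceleration record using Welch's power spectral density; the sampling rate, record length, and the synthetic 0.2 Hz fundamental below are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                               # assumed smartphone sampling rate, Hz
t = np.arange(0, 600, 1 / fs)            # ten minutes of ambient vibration
rng = np.random.default_rng(0)
# Synthetic record: a 0.2 Hz fundamental buried in broadband noise.
acc = 0.02 * np.sin(2 * np.pi * 0.2 * t) + 0.01 * rng.normal(size=t.size)

# Welch PSD with a long segment length to resolve the low fundamental frequency.
f, pxx = welch(acc, fs=fs, nperseg=2**14)
print(f"Estimated fundamental frequency: {f[np.argmax(pxx)]:.3f} Hz")
```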

Keywords: smart and mobile sensing, structural health monitoring, suspension bridges, vibration analysis

Procedia PDF Downloads 292
685 Possible Number of Dwelling Units Using Waste Plastic Bottle for Construction

Authors: Dibya Jivan Pati, Kazuhisa Iki, Riken Homma

Abstract:

Unlike other metro cities of India, Bhubaneswar, the capital city of Odisha, is expected to have reached the one-million population mark by now. The demand for dwelling units, mostly among the urban poor belonging to the Economically Weaker Section (EWS) and Low Income Groups (LIG), is becoming a challenge due to high housing costs and rents. It is also noted that, as the population increases, solid waste generation also increases, subsequently affecting the environment owing to inefficient waste collection by local government bodies. Methods of utilizing solid waste, especially in the form of plastic bottles, glass bottles, and metal cans (PGM), are now widely used as alternative materials for the construction of low-cost buildings by non-governmental organizations (NGOs) in developing countries like India to help the urban poor afford a shelter. The use of discarded plastic bottles in the construction of a single dwelling significantly reduces the overall construction cost, by as much as 14% compared to traditional construction materials. Therefore, considering this cost-benefit result, it is possible to provide housing to the EWS and LIG at an affordable price. In this paper, we estimated the quantity of plastic bottles generated in Bhubaneswar, which further helped to estimate the number of single dwelling units that can be constructed on a yearly basis so as to stem further housing shortage. The estimation results will be of practical use to the local government and NGOs in planning and managing low-cost housing.

Keywords: construction, dwelling unit, plastic bottle, solid waste generation, groups

Procedia PDF Downloads 475
684 Short-Term versus Long-Term Effect of Waterpipe Smoking Exposure on Cardiovascular Biomarkers in Mice

Authors: Abeer Rababa'h, Ragad Bsoul, Mohammad Alkhatatbeh, Karem Alzoubi

Abstract:

Introduction: Tobacco use is one of the main risk factors for cardiovascular disease (CVD) and atherosclerosis in particular. Waterpipe smoking (WPS) exposes users to several toxic materials such as nicotine, carcinogens, tar, carbon monoxide, and heavy metals. Thus, WPS is considered one of the toxic environmental factors that should be investigated intensively. Therefore, the aim of this study was to investigate the effect of WPS on several cardiovascular biological markers that may cause atherosclerosis in mice. The study was also designed to examine the temporal effects of WPS on atherosclerotic biomarkers upon short-term (2 weeks) and long-term (8 weeks) exposure. Methods: Mice were exposed to WPS, and heart homogenates were analyzed to elucidate the effects of WPS on matrix metalloproteinases (MMPs), endothelin-1 (ET-1), and myeloperoxidase (MPO). Following protein estimation, enzyme-linked immunosorbent assays were performed to measure the levels of MMP (isoforms 1, 3, and 9), MPO, and ET-1 protein expression. Results: Our data showed that acute exposure to WPS significantly enhances the levels of MMP-3, MMP-9, and MPO expression (p < 0.05) compared to the corresponding controls. However, the body was capable of normalizing the expression levels of these parameters following continuous exposure for 8 weeks (p > 0.05). Additionally, we showed that the level of ET-1 expression was significantly higher upon chronic exposure to WPS compared to both the control and acute exposure groups (p < 0.05). Conclusion: Waterpipe exposure has a significant adverse effect with respect to atherosclerosis, and the enhancement of atherosclerotic biomarker expression (MMP-3 and 9, MPO, and ET-1) might represent early compensatory efforts to maintain cardiac function after WP exposure.

Keywords: atherosclerotic biomarkers, cardiovascular disease, matrix metalloproteinase, waterpipe

Procedia PDF Downloads 352
683 Assessment of Forage Utilization for Pasture-Based Livestock Production in Udubo Grazing Reserve, Bauchi State

Authors: Mustapha Saidu, Bilyaminu Mohammed

Abstract:

The study was conducted in the Udubo Grazing Reserve between July 2019 and October 2019 to assess forage utilization for pasture-based livestock production in the reserve. The grazing land was divided into grids at one-kilometre intervals, and 15 coordinates were selected as sample points by systematically selecting one grid after every seven grids. A 1 × 1 m quadrat was laid at the coordinate of each selected grid for measurement, estimation, and sample collection. The results of the study indicated that Zornia glochidiata has the highest share of species composition (42%), while Mitracarpus hirtus has the lowest (0.1%). Urochloa mosambicensis had 48 percent of its height removed and 27 percent used by weight, Zornia glochidiata 60 percent of height removed and 57 percent used by weight, Alysicarpus vaginalis 55 percent of height removed and 40 percent used by weight, and Cenchrus biflorus 40 percent of height removed and 28 percent used by weight. The target is 50 percent utilization of forage by weight during a grazing period as well as at the end of the grazing season. The study found that Urochloa mosambicensis, Alysicarpus vaginalis, and Cenchrus biflorus had lower utilization by weight, which is normal, while Zornia glochidiata had higher utilization by weight, which is an indication of danger. The study recommends that identifying the key plant species in pasture and rangeland is critical to implementing a successful grazing management plan. There should be collective action and promotion of historically generated grazing knowledge through public and private advocacy.

Keywords: forage, grazing reserve, livestock, pasture, plant species

Procedia PDF Downloads 89
682 Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed SRCLoc method can significantly improve positioning performance and reduce radio map construction costs.
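As a small illustration of the t-SNE feature extraction step mentioned above, the sketch below embeds simulated hybrid RSS fingerprints into a low-dimensional feature space; the fingerprint dimensions and signal statistics are assumptions, not the paper's dataset.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Simulated hybrid WLAN + LTE RSS fingerprints: 1000 reference samples x 40 transmitters (dBm).
fingerprints = rng.normal(-80, 10, size=(1000, 40))

# t-SNE projects the high-dimensional fingerprints into a 2-D feature space in which
# samples with similar radio signatures stay close together.
embedded = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(fingerprints)
print(embedded.shape)  # (1000, 2)
```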

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 71
681 Community Forest Management and Ecological and Economic Sustainability: A Two-Way Street

Authors: Sony Baral, Harald Vacik

Abstract:

This study analyzes the sustainability of community forest management in two community forests in the Terai and Hills of Nepal, representing four forest types: 1) Shorea robusta, 2) Terai hardwood, 3) Schima-Castanopsis, and 4) other Hill forests. The sustainability goals for this region include maintaining and enhancing the forest stocks. Considering this, we analysed changes in species composition, stand density, growing stock volume, and the growth-to-removal ratio at 3-5 year intervals from 2005 to 2016 within 109 permanent forest plots (57 in the Terai and 52 in the Hills). To complement the inventory data, forest users, forest committee members, and forest officials were consulted. The results indicate that the relative representation of economically valuable tree species has increased. Based on trends in stand density, both forests are being sustainably managed. Pole-sized trees dominated the diameter distribution, however, with a limited number of mature trees and declining regeneration. In the Hills, the forests were over-harvested until 2013 but under-harvested in the recent period. In contrast, both forest types were under-harvested throughout the inventory period in the Terai. We found that the ecological dimension of sustainable forest management is strongly achieved while the economic dimension lags behind the current potential. Thus, we conclude that maintaining a large number of trees in the forest does not necessarily ensure both ecological and economic sustainability. Instead, priority should be given to a rational estimation of the annual harvest rates to enhance forest resource conditions together with regular benefits to the local communities.

Keywords: community forests, diversity, growing stock, forest management, sustainability, Nepal

Procedia PDF Downloads 97
680 Study on the Process of Detumbling Space Target by Laser

Authors: Zhang Pinliang, Chen Chuan, Song Guangming, Wu Qiang, Gong Zizheng, Li Ming

Abstract:

The active removal of space debris and asteroid defense are important issues in human space activities. Both of them need a detumbling process, for almost all space debris and asteroids are in a rotating state, and it is hard and dangerous to capture or remove a target with a relatively high tumbling rate. It is therefore necessary to find a method to reduce the angular rate first. The laser ablation method is an efficient way to tackle this detumbling problem, for it is a contactless technique and can work at a safe distance. In existing research, a laser rotational control strategy based on the estimation of the instantaneous angular velocity of the target has been presented. However, the calculation of the control torque produced by the laser, which is very important in the detumbling operation, is not accurate enough, for the method used is only suitable for planar or regularly shaped targets and does not consider the influence of an irregular shape or the size of the spot. In this paper, based on a triangulated reconstruction of the target surface, we propose a new method to calculate the impulse on an irregularly shaped target under both covered irradiation and spot irradiation of the laser, and we verify its accuracy by theoretical formula calculation and an impulse measurement experiment. We then use it to study the process of detumbling a cylinder and an asteroid by laser. The results show that the new method is universally practical and has high precision; it will take more than 13.9 hours to stop the rotation of Bennu with 1×10⁵ kJ of laser pulse energy; and the speed of the detumbling process depends on the distance between the spot and the centroid of the target, for which an optimal value can be found in every particular case.
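A rough sketch of the per-facet impulse summation implied by the triangulated-surface approach; the momentum-coupling coefficient, fluence, and geometry below are illustrative assumptions, not the paper's parameters or its exact formulation.

```python
import numpy as np

C_M = 5e-5                               # assumed momentum coupling coefficient, N*s/J
FLUENCE = 2.0e4                          # assumed delivered energy per unit area, J/m^2
BEAM_DIR = np.array([0.0, 0.0, -1.0])    # unit vector of the incoming beam

def facet_impulse(v0, v1, v2):
    """Impulse and torque (about the body origin) contributed by one illuminated triangle."""
    normal = np.cross(v1 - v0, v2 - v0)
    area = 0.5 * np.linalg.norm(normal)
    n_hat = normal / np.linalg.norm(normal)
    cos_inc = max(0.0, -np.dot(n_hat, BEAM_DIR))   # facets facing away from the beam are shadowed
    # Ablated material jets out along the outward normal, so the recoil on the target
    # acts along -n_hat, scaled by the fluence intercepted by the projected facet area.
    impulse = -C_M * FLUENCE * area * cos_inc * n_hat
    centroid = (v0 + v1 + v2) / 3.0
    return impulse, np.cross(centroid, impulse)

# Example: a single unit triangle facing the beam.
J, T = facet_impulse(np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
print(J, T)  # summing these over all lit facets gives the net impulse and detumbling torque
```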

Keywords: detumbling, laser ablation drive, space target, space debris removal

Procedia PDF Downloads 84
679 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia

Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih

Abstract:

The Human Development Index (HDI) is a standard measure of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between the HDI and the influencing factors show that the data do not follow a specific pattern or form. Therefore, the HDI data for Indonesia can be modelled with a nonparametric regression model. The estimate of the regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline. Truncated spline regression is a nonparametric approach based on a modification of segmented polynomial functions. The estimator of a truncated spline regression model is affected by the selection of the optimal knot points, which are the focal points of the truncated spline functions. The optimal knot points were determined by the minimum value of the generalized cross validation (GCV). In this article, the Human Development Index data were modelled with a truncated spline nonparametric regression model. The best truncated spline regression model for the HDI data in Indonesia was obtained with the combination of optimal knot points 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to the HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data for Indonesia well enough.
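A minimal sketch of truncated spline fitting with GCV-based knot selection on simulated one-predictor data; the basis construction and candidate knots below are illustrative assumptions, not the article's four-predictor HDI model.

```python
import numpy as np

def truncated_design(x, knots, degree=1):
    """Truncated power basis: polynomial terms plus (x - k)_+^degree for each knot."""
    cols = [x**d for d in range(degree + 1)]
    cols += [np.clip(x - k, 0, None)**degree for k in knots]
    return np.column_stack(cols)

def gcv(x, y, knots, degree=1):
    X = truncated_design(x, knots, degree)
    hat = X @ np.linalg.pinv(X)              # hat matrix H = X (X'X)^-1 X'
    resid = y - hat @ y
    n = len(y)
    return (np.sum(resid**2) / n) / (1 - np.trace(hat) / n)**2

# Simulated data with a slope change at x = 5; pick the single knot minimizing GCV.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 120))
y = np.where(x < 5, 2 + 0.5 * x, 4.5 + 1.5 * (x - 5)) + rng.normal(0, 0.3, 120)
candidates = np.linspace(1, 9, 17)
best = min(candidates, key=lambda k: gcv(x, y, [k]))
print(f"Optimal knot by GCV: {best:.2f}")
```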

Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline

Procedia PDF Downloads 339
678 The Effect of Female Access to Healthcare and Educational Attainment on Nigerian Agricultural Productivity Level

Authors: Esther M. Folarin, Evans Osabuohien, Ademola Onabote

Abstract:

Agriculture constitutes an important part of development and poverty mitigation in lower-middle-income countries like Nigeria. The level of agricultural productivity in the Nigerian economy, relative to the level of demand necessary to meet the expectations of the Nigerian populace, threatens the attainment of the United Nations (UN) Sustainable Development Goals (SDGs), including SDG-2 (achieving food security through agricultural productivity). The overall objective of the study is to reveal the performance of the interaction variable in the model, among other factors, in helping to achieve greater Nigerian agricultural productivity. The study makes use of Wave 4 (2018/2019) of the Living Standard Measurement Studies, Integrated Survey on Agriculture (LSMS-ISA). Qualitative analysis of the information was also used to provide complementary answers to the quantitative analysis in the study. The study employed human capital theory and Grossman's theory of health demand to explain the relationships that exist between the variables in the model. The study engages the instrumental variable regression technique for the broad objective, among other techniques for the other specific objectives. The estimation results show that there is a positive relationship between female healthcare and the level of female agricultural productivity in Nigeria. In conclusion, the study emphasises the need for more provision and empowerment towards greater female access to healthcare and educational attainment, which aids higher female agricultural productivity and consequently an improvement in the total agricultural productivity of the Nigerian economy.

Keywords: agricultural productivity, education, female, healthcare, investment

Procedia PDF Downloads 81
677 Critical Success Factors in Quality Requirement Change Management

Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan

Abstract:

Managing software quality requirement change is a difficult task in the field of software engineering. Rejecting incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is widely considered the primary cause of software failure, and it becomes even more challenging in global software outsourcing. Addressing the success factors in quality requirement change management is needed today due to the frequent change requests from end-users. In this research study, success factors are recognized and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that Proper Requirement Change Management, Rapid Delivery, Quality Software Product, Access to Market, Project Management, Skills and Methodologies, Low Cost/Effort Estimation, Clear Plan and Road Map, Agile Processes, Low Labor Cost, User Satisfaction, Communication/Close Coordination, Proper Scheduling and Time Constraints, Frequent Technological Changes, Robust Model, and Geographical Distribution/Cultural Differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments. These factors were then scrutinized by continent, database, company size, and period of time. Based on these findings, requirement changes can be implemented in a better way.

Keywords: global software development, requirement engineering, systematic literature review, success factors

Procedia PDF Downloads 197
676 IoT and Deep Learning Approach for Growth Stage Segregation and Harvest Time Prediction of Aquaponic and Vermiponic Swiss Chards

Authors: Praveen Chandramenon, Andrew Gascoyne, Fideline Tchuenbou-Magaia

Abstract:

Aquaponics offers a simple, conclusive solution to the world's food and environmental crises. This approach combines the idea of aquaculture (growing fish) with hydroponics (growing vegetables and plants by a soilless method). Smart aquaponics explores the use of smart technology, including artificial intelligence and IoT, to assist farmers with better decision making and online monitoring and control of the system. Identifying the different growth stages of Swiss chard plants and predicting their harvest time are important in aquaponic yield management. This paper presents a comparative analysis of a standard aquaponic system and a vermiponic system (aquaponics with worms), grown in a controlled environment, by implementing IoT and deep learning-based growth stage segregation and harvest time prediction of Swiss chards before and after applying an optimal freshwater replenishment. Data collection, growth stage classification, and harvest time prediction were performed with and without water replenishment. The paper discusses the experimental design, the IoT and sensor communication architecture, the data collection process, image segmentation, the various regression and classification models, and the error estimation used in the project. The paper concludes with a comparison of the results, including the best models for growth stage segregation and harvest time prediction of the aquaponic and vermiponic testbeds with and without freshwater replenishment.

Keywords: aquaponics, deep learning, internet of things, vermiponics

Procedia PDF Downloads 71
675 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing based on an R-tree and the domain-range entropy is proposed to improve the fault tolerance and load balancing algorithms by accounting for connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 490
674 Improving Student Performance Prediction Using a Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in educational data mining techniques to find new hidden information in students' learning behavior, particularly to uncover early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying the features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data were gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters for the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
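A compact sketch of a hard majority-vote ensemble over heterogeneous base learners like those listed above; the synthetic feature table stands in for the merged student-information and e-learning data, and the base-learner settings are assumptions rather than the tuned hyperparameters of the study.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the merged feature table (labels: at-risk vs. not at-risk).
X, y = make_classification(n_samples=4413, n_features=20, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Hard majority vote over decision tree, SVM, neural network, and random forest.
vote = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("svm", SVC(random_state=0)),
        ("ann", MLPClassifier(max_iter=500, random_state=0)),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    voting="hard",
)
vote.fit(X_tr, y_tr)
print(f"Hold-out accuracy: {vote.score(X_te, y_te):.3f}")
```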

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 107
673 An Estimating Equation for Survival Data with Possibly Time-Varying Covariates under Semiparametric Transformation Models

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

An estimating equation technique is an alternative to the widely used maximum likelihood methods and enables us to ease some of the complexity due to the complex characteristics of time-varying covariates. When both time-varying covariates and left-truncation are considered in the model, the maximum likelihood estimation procedure becomes much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from researchers, under a semiparametric transformation model. The purpose of this article was to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample behaviour of the estimators for the proposed model was illustrated via simulation studies and the Stanford heart transplant real data example. To sum up, the bias for covariates has been adjusted by estimating the density function of the truncation time variable, and the effect of possibly time-varying covariates was then evaluated in some special semiparametric transformation models.

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariates

Procedia PDF Downloads 152
672 Identification of Vehicle Dynamic Parameters Using an Optimized Exciting Trajectory on a 3-DOF Parallel Manipulator

Authors: Di Yao, Gunther Prokop, Kay Buttner

Abstract:

Dynamic parameters, including the center of gravity, mass, and moments of inertia of a vehicle, play an essential role in vehicle simulation, collision testing, and the real-time control of vehicle active systems. To identify the important vehicle dynamic parameters, a systematic parameter identification procedure is studied in this work. In the first step of the procedure, a conceptual parallel manipulator (virtual test rig) that possesses three rotational degrees of freedom is proposed. To realize the kinematic characteristics of the conceptual parallel manipulator, a kinematic analysis consisting of inverse kinematics and singularity architecture is carried out. Based on Euler's rotation equations for rigid body dynamics, the dynamic model of the parallel manipulator and the derivation of the measurement matrix for parameter identification are presented subsequently. In order to reduce the sensitivity of the parameter identification to measurement noise and other unexpected disturbances, a parameter optimization process that searches for the optimal exciting trajectory of the parallel manipulator is conducted in the following section. For this purpose, the 3-2-1 Euler angles defined by a parameterized finite Fourier series are primarily used to describe the general exciting trajectory of the parallel manipulator. To minimize the condition number of the measurement matrix and thereby achieve better parameter identification accuracy, the unknown coefficients of the parameterized finite Fourier series are estimated by employing an iterative algorithm based on MATLAB®. Meanwhile, the iterative algorithm ensures that the parallel manipulator remains in an achievable working state during the execution of the optimal exciting trajectory. It is shown that the proposed procedure and methods can effectively identify the vehicle dynamic parameters and could be an important application of parallel manipulators in the fields of parameter identification and test rig development.

Keywords: parameter identification, parallel manipulator, singularity architecture, dynamic modelling, exciting trajectory

Procedia PDF Downloads 265
671 An Analysis of the Impact of Government Budget Deficits on Economic Performance: A Zimbabwean Perspective

Authors: Tafadzwa Shumba, Rose C. Nyatondo, Regret Sunge

Abstract:

This research analyses the impact of budget deficits on the economic performance of Zimbabwe. The study employs the autoregressive distributed lag (ARDL) bounds testing approach to co-integration and long-run estimation using time series data from 1980-2018. The Augmented Dickey-Fuller (ADF) test and the Granger approach were used to test for stationarity and causality among the factors. The co-integration test results affirm a long-term association between the GDP growth rate and the explanatory factors. The causality test results show unidirectional causality from the budget deficit to GDP growth and bi-directional causality between debt and the budget deficit. This study also found unidirectional causality from debt to the GDP growth rate. The ARDL estimates indicate a significantly positive long-term and a significantly negative short-term impact of the budget deficit on GDP. This suggests that budget deficits have a short-run growth-retarding effect and a long-run growth-inducing effect. The long-run results follow the Keynesian theory, which posits that fiscal deficits result in an increase in GDP growth; the short-run outcomes follow the neoclassical theory. In light of these findings, the government is recommended to minimize the financing of recurrent expenditure through the budget deficit. To achieve sustainable growth and development, the government needs to direct an absorbable budget deficit towards capital projects such as the development of human capital and infrastructure.
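As a small illustration of the unit-root and causality pre-tests that precede the ARDL bounds test, the sketch below runs ADF and Granger tests on simulated annual series; the series are simulated stand-ins, not the Zimbabwean data, and the bounds test itself would follow these steps.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

rng = np.random.default_rng(0)
years = np.arange(1980, 2019)
# Simulated stand-ins: budget deficit (% of GDP) and GDP growth (%).
deficit = pd.Series(rng.normal(5, 2, years.size), index=years, name="deficit")
gdp_growth = pd.Series(0.3 * deficit.shift(1).fillna(deficit.mean()).to_numpy()
                       + rng.normal(2, 1, years.size), index=years, name="gdp_growth")

# Step 1: Augmented Dickey-Fuller unit-root test on each series.
for s in (deficit, gdp_growth):
    stat, pval, *_ = adfuller(s)
    print(f"ADF {s.name}: statistic={stat:.2f}, p-value={pval:.3f}")

# Step 2: Granger causality (does the deficit help predict GDP growth?).
grangercausalitytests(pd.concat([gdp_growth, deficit], axis=1), maxlag=2)
```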

Keywords: ARDL, budget deficit, economic performance, long run

Procedia PDF Downloads 97
670 Sustainable Land Use Evaluation Based on Preservative Approach: Neighborhoods of Susa City

Authors: Somaye Khademi, Elahe Zoghi Hoseini, Mostafa Norouzi

Abstract:

Since it determines the manner of land use and the spatial structure of cities on the one hand, and the economic value of each piece of land on the other, land-use planning is always considered a main part of urban planning. In this regard, by emphasizing the efficient use of land, the sustainable development approach has presented a new perspective on urban planning and consequently on its most important pillar, i.e., land-use planning. In order to evaluate urban land use, this paper attempts to select the most significant indicators affecting urban land use that match sustainable development indicators. Because preserving ancient monuments and their surroundings is one of the main pillars of achieving sustainability, the sustainability indicators in this research have been selected with an emphasis on the preservation of the ancient monuments and historical fabric of the city of Susa, one of the historical cities of Iran, and an attempt has been made to integrate these criteria with other land-use sustainability indicators. For this purpose, kernel density estimation (KDE) has been used to provide maps displaying spatial density, and the AHP model has been used to combine layers and provide the final maps. Moreover, the sustainability rating of different districts of the city of Shush is studied so as to evaluate the status of land sustainability in different parts of the city. The results of the study show that different neighborhoods of Shush do not have the same land-use sustainability: neighborhoods located in the eastern half of the city, i.e., the new neighborhoods, have higher sustainability than those of the western half. It seems that the allocation of a high percentage of these areas to arid lands and historical areas is one of the main reasons for their sustainability.
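A brief sketch of the AHP weighting step used to combine the criterion layers: the priority vector is the normalized principal eigenvector of a pairwise comparison matrix. The criteria and comparison values below are hypothetical, not the matrix used in the study.

```python
import numpy as np

# Hypothetical Saaty pairwise comparisons for three criteria
# (e.g., proximity to historical monuments, access, land compatibility).
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
])

# AHP priority weights: principal eigenvector of the comparison matrix, normalized.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI / RI (RI = 0.58 for n = 3).
ci = (eigvals.real[k] - A.shape[0]) / (A.shape[0] - 1)
print(weights, ci / 0.58)
```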

Keywords: city of Susa, historical heritage, land-use evaluation, urban sustainable development

Procedia PDF Downloads 379
669 Spectroscopic Relation between Open Clusters and Globular Clusters

Authors: Robin Singh, Mayank Nautiyal, Priyank Jain, Vatasta Koul, Vaibhav Sharma

Abstract:

The curiosity to investigate space and its mysteries has always been the main impetus of human interest, as living matter has existed since the "début de l'Univers" (beginning of the Universe) together with few other living things. The sharp drive to uncover the secrets of stars and their unusual behaviour has always ignited stellar investigation. Just as humankind lives in civilizations and states, stars likewise live in colonies named 'clusters'. Clusters are separated into two types, i.e., open clusters and globular clusters. An open cluster is a gathering of a thousand stars that were formed from the same giant molecular cloud and, for the most part, contains Population I (extremely metal-rich) and Population II (mildly metal-rich) stars, whereas a globular cluster is a gathering of more than thirty thousand stars that orbits a galactic centre and basically contains Population III (extremely metal-poor) stars. The contribution of this paper lies in the spectroscopic investigation of globular clusters like M92 and NGC419 and open clusters like M34 and IC2391 in different color bands by using software such as the VIREO virtual observatory, Aladin, CMUNIWIN, and MS-Excel. The resulting Hertzsprung-Russell (HR) diagram is assessed against exemplary cosmological models like the Einstein model, the de Sitter model, and the Planck survey for a better age estimation of the respective clusters. The colour-magnitude diagrams of these clusters were obtained by photometric analysis in the g and r bands and further transformed into the B and V bands, which unravels the nature of the stars present in the individual clusters.

Keywords: color magnitude diagram, globular clusters, open clusters, Einstein model

Procedia PDF Downloads 226
668 GAILoc: Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed SRCLoc method can significantly improve positioning performance and reduce radio map construction costs.

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 74
667 Modeling and Numerical Simulation of Heat Transfer and Internal Loads at Insulating Glass Units

Authors: Nina Penkova, Kalin Krumov, Liliana Zashcova, Ivan Kassabov

Abstract:

Insulating glass units (IGU) are widely used in advanced and renovated buildings in order to reduce the energy needed for heating and cooling. Rules for the choice of IGU to ensure energy efficiency and thermal comfort in the indoor space are well known. The existence of internal loads - gauge or vacuum pressure in the hermetically sealed gas space - requires additional attention in the design of the facades. The internal loads appear with variations of the altitude, meteorological pressure, and gas temperature relative to their values at the time of sealing. The gas temperature depends on the presence of coatings, the coating position in the transparent multi-layer system, the IGU geometry and space orientation, and its fixing on the facades, and it varies with the climate conditions. An algorithm for the modeling and numerical simulation of the thermal fields and internal pressure in the gas cavity of insulating glass units as a function of the meteorological conditions is developed. It includes models of the radiation heat transfer in the solar and infrared wavelengths, indoor and outdoor convection heat transfer, and free convection in the hermetically sealed gas space, treating the gas as compressible. The algorithm allows the prediction of temperature and pressure stratification in the gas domain of the IGU for different fixing systems. The models are validated by comparison of the numerical results with experimental data obtained by hot-box testing. Numerical calculations and estimation of the 3D temperature and fluid flow fields, thermal performance, and internal loads of IGUs in window systems are implemented.
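For intuition about the internal loads mentioned above, a rigid, sealed cavity filled with an ideal gas develops an overpressure proportional to the rise in absolute gas temperature; the sealing conditions and service temperature below are illustrative assumptions, not values from the study, and real panes deflect and partly relieve this pressure.

```python
# Isochoric (constant-volume) estimate of the internal load in a sealed IGU cavity.
p_seal = 101_325.0   # Pa, assumed atmospheric pressure at the moment of sealing
t_seal = 293.15      # K, assumed gas temperature at sealing (20 °C)
t_service = 323.15   # K, cavity gas heated to 50 °C, e.g. behind an absorbing coating

p_service = p_seal * t_service / t_seal      # Gay-Lussac's law at constant volume
print(f"Internal overpressure: {p_service - p_seal:.0f} Pa")  # roughly 10 kPa of gauge pressure
```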

Keywords: insulating glass units, thermal loads, internal pressure, CFD analysis

Procedia PDF Downloads 273
666 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance in its application to real data, with the use of convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: according to the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.

Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, two-parameter Weibull density function

Procedia PDF Downloads 307
665 Comparative Analysis of Third-Generation Reanalysis Data for the Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources depend on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in the last decades, there is still a considerable lack of meteorological observations that form long-period series. Reanalysis is a data assimilation system built on general atmospheric circulation models and based on the combination of data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), whose data have a spatial resolution of 0.5° x 0.5°. In order to overcome these difficulties, this study aims to evaluate the performance of solar radiation estimation from alternative databases, such as reanalysis data and data from meteorological satellites, which satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The results of the analysis of the solar radiation data indicated that the reanalysis data of the CFSR model performed well in relation to the observed data, with a coefficient of determination around 0.90. Therefore, it is concluded that these data have the potential to be used as an alternative source in locations without stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.

Keywords: climate, reanalysis, renewable energy, solar radiation

Procedia PDF Downloads 209
664 Transfer Function Model-Based Predictive Control for Nuclear Core Power Control in PUSPATI TRIGA Reactor

Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha

Abstract:

The 1 MWth PUSPATI TRIGA Reactor (RTP) at the Malaysian Nuclear Agency has been operating for more than 35 years. The existing core power control uses a conventional controller known as the Feedback Control Algorithm (FCA). It is technically challenging to keep the core power output stable and operating within the acceptable error bands demanded by the safety requirements of the RTP. Currently, the system's power tracking performance could be considered unsatisfactory, and there is significant room for improvement. Hence, a new core power control design is very important for improving the current tracking and regulation of reactor power by controlling the movement of the control rods to suit the demands of the highly sensitive nuclear reactor power control. In this paper, the proposed Model Predictive Control (MPC) law is applied to control the core power. The model for core power control is based on mathematical models of the reactor core, MPC, and a control rod selection algorithm. The mathematical models of the reactor core are based on the point kinetics model, thermal-hydraulic models, and reactivity models. The proposed MPC is presented in a transfer function model of the reactor core according to perturbation theory. The transfer function model-based predictive control (TFMPC) is developed to design the core power control with predictions based on a T-filter, towards the real-time implementation of MPC on hardware. This paper introduces sensitivity functions for the TFMPC feedback loop to reduce the impact on the input actuation signal and demonstrates the behaviour of TFMPC in terms of disturbance and noise rejection. Comparisons of both the tracking and regulating performance between the conventional controller and TFMPC were made using MATLAB and analysed. In conclusion, the proposed TFMPC has satisfactory performance in tracking and regulating core power for controlling a nuclear reactor with high reliability and safety.

Keywords: core power control, model predictive control, PUSPATI TRIGA reactor, TFMPC

Procedia PDF Downloads 241