Search results for: performance prism model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25968


17688 Experimental Investigation on Performance of Beam Column Frames with Column Kickers

Authors: Saiada Fuadi Fancy, Fahim Ahmed, Shofiq Ahmed, Raquib Ahsan

Abstract:

The worldwide use of reinforced concrete construction stems from the wide availability of reinforcing steel as well as concrete ingredients. However, concrete construction requires a certain level of technology, expertise, and workmanship, particularly in the field during construction. As a supporting technology for concrete column or wall construction, a kicker is cast as part of the slab or foundation to provide a convenient starting point for a wall or column, ensuring integrity at this important junction. For that reason, a comprehensive study was carried out here to investigate the behavior of reinforced concrete frames with different kicker parameters. To achieve this objective, six half-scale specimens of portal reinforced concrete frames with kickers and one portal frame without a kicker were constructed according to common practice in the industry and subjected to cyclic incremental horizontal loading with sustained gravity load. In this study, the experimental data, obtained over four deflection-controlled cycles, were used to evaluate the behavior of the kickers. Load-displacement characteristics were obtained; maximum loads and deflections were measured and assessed. Finally, the test results of frames constructed with three different kicker thicknesses were compared with the kickerless frame. Similar crack patterns were observed for all the specimens. From this investigation, specimens with a kicker thickness of 3″ showed better results than specimens with a kicker thickness of 1.5″, as indicated by maximum load, stiffness, initiation of the first crack and residual displacement. Despite its better performance, it could not be firmly concluded that the 4.5″ kicker thickness is the most appropriate one, because the dial gauge had to be detached during the test of that specimen. Finally, compared with the kicker specimens, the kickerless specimen performed relatively better.

Keywords: crack, cyclic, kicker, load-displacement

Procedia PDF Downloads 304
17687 5G Future Hyper-Dense Networks: An Empirical Study and Standardization Challenges

Authors: W. Hashim, H. Burok, N. Ghazaly, H. Ahmad Nasir, N. Mohamad Anas, A. F. Ismail, K. L. Yau

Abstract:

Future communication networks require devices that are able to work on a single platform but support heterogeneous operations, which leads to service diversity and functional flexibility. This paper proposes two cognitive mechanisms, termed the cognitive hybrid function, which are applied in multiple broadband user terminals in order to maintain reliable connectivity and prevent unnecessary interference. By employing such mechanisms, especially for future hyper-dense networks, we can observe their performance in terms of optimized speed and power-saving efficiency. Results were obtained from several empirical laboratory studies. It was found that selecting a reliable network improved the optimized speed performance by up to 37% compared with operation without such a function. In terms of power adjustment, our evaluation shows that this mechanism can reduce the transmit power by 5 dB while maintaining the same level of throughput as at higher power. We also discuss the issues impacting future telecommunication standards once such devices are in place.

Keywords: dense network, intelligent network selection, multiple networks, transmit power adjustment

Procedia PDF Downloads 363
17686 An Empirical Examination of the Determinant of the Financial CEOs’ Compensation for the Post-Financial Crisis Period

Authors: Eunsup Daniel Shim, Jooh Lee

Abstract:

The US financial crisis of 2008 and the subsequent Global Financial Crisis were considered by many economists to be the worst financial crisis since the Great Depression of the 1930s. As a result, the Dodd-Frank Act was passed, which aims '(1) to promote the financial stability of the United States by improving accountability and transparency in the financial system, to end "too big to fail", (2) to protect the American taxpayer by ending bailouts, (3) to protect consumers from abusive financial services practices, and for other purposes.' The enactment of the Dodd-Frank Act was intended, in part, to significantly strengthen accountability on executive compensation, especially for financial institutions. This paper empirically investigates the changes in financial CEOs’ compensation since the Financial Crisis of 2008. Our findings show that in the post-financial-crisis period, financial leverage is a significant factor influencing CEOs’ total compensation. In addition, market-based performance measures such as stock price and market-to-book ratio show a significant positive relationship with CEO compensation. This change can be interpreted as an attempt to reduce opportunistic behavior of top executives after the financial crisis and the enactment of the Dodd-Frank Act.
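
As a rough illustration of this kind of determinant analysis, the sketch below regresses log CEO compensation on leverage and market-based performance using ordinary least squares. It uses synthetic data and illustrative variable names, not the paper's sample or exact specification.

```python
# Illustrative sketch (not the paper's dataset or specification): regressing log CEO
# total compensation on financial leverage and market-based performance measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical sample of financial-firm CEO-year observations

df = pd.DataFrame({
    "leverage": rng.uniform(0.70, 0.95, n),          # total debt / total assets
    "stock_return": rng.normal(0.08, 0.25, n),       # annual stock return
    "market_to_book": rng.lognormal(0.0, 0.4, n),    # market-to-book ratio
    "firm_size": rng.normal(10.0, 1.2, n),           # log of total assets (control)
})
# Synthetic compensation generated so the example has a known structure.
df["log_total_comp"] = (8.0 + 1.5 * df["leverage"] + 0.8 * df["stock_return"]
                        + 0.3 * df["market_to_book"] + 0.25 * df["firm_size"]
                        + rng.normal(0, 0.3, n))

X = sm.add_constant(df[["leverage", "stock_return", "market_to_book", "firm_size"]])
model = sm.OLS(df["log_total_comp"], X).fit(cov_type="HC1")  # robust standard errors
print(model.summary())
```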

Keywords: financial CEO compensation, firm performance, financial crisis of 2008, dodd-frank act

Procedia PDF Downloads 506
17685 Hybrid Model of an Increasing Unique Consumer Value on Purchases that Influences the Consumer Loyalty and the Pursuit of a Sustainable Competitive Advantage from the Institutions in Jakarta

Authors: Wilhelmus Hary Susilo

Abstract:

The marketplace would have at least some resources that are unique (e.g., good communication, knowledgeable employees, consumer value, effective transactions, efficient production processes and institutional branding). Institutions with an advantage in such resources can attain positions of competitive advantage. The major challenge addressed here is increasing unique consumer value on reliable purchases, which influences loyalty and the pursuit of a sustainable competitive advantage for institutions in Jakarta. The research was conducted with a quantitative method and a confirmatory strategic research design. The confirmatory factor analyses (first-order and second-order CFA) across the variables yielded χ2/df values of 9.30, 4.38, 6.95, 2.76, 2.97, 2.91, 2.32 and 6.90, GFI values of 0.72, 0.82, 0.82, 0.81, 0.78, 0.84, 0.89 and 0.70, and CFI values of 0.90, 0.95, 0.93, 0.92, 0.95, 0.91, 0.96 and 0.89, which indicates a good model. Furthermore, the hybrid model fits well, with χ2/df = 1.84, P value = 0.00, RMSEA = 0.076, GFI = 0.76, NNFI = 0.95, PNFI = 0.82, IFI = 0.96, RFI = 0.91, AGFI = 0.71 and CFI = 0.96. The hypotheses were significant: communitization marketing 3.0 and price perception influenced unique consumer value, with t-values of 4.46 and 5.89. Furthermore, consumer value influenced purchasing, with a t-value of 5.94. Additionally, loyalty, ‘communitization’, and character-building marketing 3.0 affect the pursuit of a sustainable competitive advantage from institutions, with t-values of 7.57, -2.12, and 2.04. Finally, the test between the most superior variable dimensions shows significant correlations between INOV and WDES and between RESPON and ATT, with covariance values of 0.72 and 0.71. Thus, ‘communitization’ and character-building marketing 3.0, with the dimensions of responsibility and technologies, would increase competitive advantage through the dimensions of innovation and job design from the institutions.

Keywords: consumer loyalty, marketing 3.0, unique consumer value, purchase, sustainable competitive advantage

Procedia PDF Downloads 276
17684 Influence of Flexible Plate's Contour on Dynamic Behavior of High Speed Flexible Coupling of Combat Aircraft

Authors: Dineshsingh Thakur, S. Nagesh, J. Basha

Abstract:

A lightweight High Speed Flexible Coupling (HSFC) is used to connect the Engine Gear Box (EGB) with an Accessory Gear Box (AGB) of a combat aircraft. The HSFC transmits power at high speeds ranging from 10000 to 18000 rpm from the EGB to the AGB. The HSFC also accommodates larger misalignments resulting from thermal expansion of the aircraft engine and the mounting arrangement. The HSFC has a series of metallic contoured annular thin cross-sectioned flexible plates to accommodate the misalignments. The flexible plates accommodate the misalignment through elastic flexure of the material. As the HSFC operates at higher speeds, the flexural and axial resonance frequencies are to be kept away from the operating speed, and proper prediction is required to prevent failure in the transmission line of a single-engine fighter aircraft. To study the influence of the flexible plate’s contour on the lateral critical speed (LCS) of the HSFC, a mathematical model of the HSFC as an eleven-rotor system is developed. The flexible plate being the bending member of the system, its bending stiffness, which results from the contour, governs the LCS. Using the transfer matrix method, the influence of various flexible plate contours on the critical speed is analyzed. In the above analysis, the support bearing flexibility is also considered in the critical speed prediction. Based on the study, a model is built with the optimum contour of the flexible plate for validation by experimental modal analysis. A good correlation between the theoretical prediction and the model behavior is observed. From the study, it is found that the flexible plate’s contour plays a vital role in modifying the system’s dynamic behavior, and the present model can be extended for the development of similar flexible couplings owing to its computational simplicity and reliability.
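
To give a flavor of lateral critical speed estimation, the sketch below uses a simplified lumped-mass model of a simply supported shaft solved with flexibility influence coefficients and an eigenvalue problem. This is a stand-in for, not a reproduction of, the paper's eleven-rotor transfer matrix model, and all numerical values are illustrative rather than HSFC data.

```python
# Simplified lateral critical speed sketch: lumped masses on a simply supported
# shaft, flexibility influence coefficients, and an eigenvalue problem.
import numpy as np

E = 210e9           # Pa, steel
d = 0.03            # m, shaft diameter (illustrative)
I = np.pi * d**4 / 64
L = 0.5             # m, bearing span
masses = np.array([1.2, 0.8, 1.2])        # kg, lumped at the stations below
x = np.array([0.125, 0.25, 0.375])        # m, station positions

def delta(xi, xj):
    """Deflection at xi due to a unit load at xj for a simply supported beam."""
    a, b = (xi, xj) if xi <= xj else (xj, xi)   # symmetry: delta(i, j) = delta(j, i)
    return a * (L - b) * (L**2 - a**2 - (L - b)**2) / (6 * L * E * I)

A = np.array([[delta(xi, xj) for xj in x] for xi in x])   # flexibility matrix
D = A @ np.diag(masses)                                   # dynamic matrix
eigvals = np.sort(np.linalg.eigvals(D).real)[::-1]        # largest eigenvalue -> 1st mode
omega = 1.0 / np.sqrt(eigvals)                            # rad/s natural frequencies
print("lateral critical speeds [rpm]:", omega * 60 / (2 * np.pi))
```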

Keywords: flexible rotor, critical speed, experimental modal analysis, high speed flexible coupling (HSFC), misalignment

Procedia PDF Downloads 200
17683 Improving an Automotive Bumper Structure for Pedestrian Protection

Authors: Mohammad Hassan Shojaeefard, Abolfazl Khalkhali, Khashayar Ghadirinejad

Abstract:

In the present study, first, a three-dimensional finite element model of the lower legform impactor according to the pedestrian protection regulation EC 78/2009 is developed. The FE model of the lower legform impactor is then validated against static and dynamic tests using three main criteria: bending angle, shear displacement and upper tibia acceleration. In the second step, the validated impactor is employed to evaluate the bumper of a B-class automobile based on the pedestrian protection criteria defined in the EC regulation. Finally, based on these investigations, an improved bumper design is presented and compared with the base design. The results show that a very good improvement in meeting the pedestrian protection criteria is achieved.

Keywords: pedestrian protection, legform impactor, automotive bumper, finite element method

Procedia PDF Downloads 238
17682 Preconcentration and Determination of Cyproheptadine in Biological Samples by Hollow Fiber Liquid Phase Microextraction Coupled with High Performance Liquid Chromatography

Authors: Sh. Najari Moghadam, M. Qomi, F. Raofie, J. Khadiv

Abstract:

In this study, liquid phase microextraction by hollow fiber (HF-LPME) combined with high performance liquid chromatography with UV detection was applied to preconcentrate and determine trace levels of Cyproheptadine in human urine and plasma samples. Cyproheptadine was extracted from 10 mL of alkaline aqueous solution (pH: 9.81) into an organic solvent (n-octanol) immobilized in the wall pores of a hollow fiber. Then, it was back-extracted into an acidified aqueous solution (pH: 2.59) located inside the lumen of the hollow fiber. This method is simple, efficient and cost-effective. It is based on the pH gradient between the two aqueous phases. In order to optimize the HF-LPME, the affecting parameters, including the pH of the donor and acceptor phases, the type of organic solvent, ionic strength, stirring rate, extraction time and temperature, were studied and optimized. Under optimal conditions, the enrichment factor, limit of detection (LOD) and relative standard deviation (RSD %, n=3) were up to 112, 15 μg L−1 and 2.7%, respectively.
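
The sketch below shows how the reported figures of merit (enrichment factor, RSD% and LOD from a calibration curve) are typically computed. The numbers are illustrative placeholders, not the measured data of this study.

```python
# Minimal worked example of HF-LPME figures of merit (illustrative values only).
import numpy as np

# Enrichment factor: analyte concentration in the acceptor phase after extraction
# divided by its initial concentration in the donor (sample) phase.
c_acceptor, c_donor_initial = 560.0, 5.0        # e.g. in ug/L
EF = c_acceptor / c_donor_initial
print(f"enrichment factor = {EF:.0f}")

# Repeatability: relative standard deviation of replicate extractions (n = 3).
replicates = np.array([98.2, 101.5, 96.9])      # e.g. peak areas
rsd = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"RSD% (n=3) = {rsd:.1f}")

# LOD from a calibration curve: 3.3 * (residual standard deviation) / slope.
conc = np.array([25, 50, 100, 200, 400], float)       # ug/L standards
signal = np.array([10.4, 21.1, 40.9, 82.3, 163.5])    # detector response
slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.sqrt(np.sum((signal - (slope * conc + intercept))**2) / (len(conc) - 2))
lod = 3.3 * residual_sd / slope
print(f"LOD = {lod:.1f} ug/L")
```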

Keywords: biological samples, cyproheptadine, hollow fiber, liquid phase microextraction

Procedia PDF Downloads 273
17681 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance

Authors: Abdullah Al Farwan, Ya Zhang

Abstract:

In today’s educational arena, it is critical to understand educational data and be able to evaluate important aspects, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. Teachers, if they are able to predict their students' class performance, can use this information to improve their teaching. It has evolved into valuable knowledge that can be used for a wide range of objectives; for example, it can feed a strategic plan for generating high-quality education. Based on previous data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest, with wrapper feature selection, were used on two datasets relating to Portuguese language and mathematics lessons. The results showed the effectiveness of using data mining learning methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms lies in the range of 80-94%. Among the selected classification algorithms, the lowest accuracy is achieved by the Multi-layer Perceptron algorithm, which is close to 70.45%, and the highest accuracy is achieved by the Random Forest algorithm, which is close to 94.10%. This proposed work can assist educational administrators in identifying poor-performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent educational dropout.
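
A minimal sketch of the workflow described above is given below: wrapper-based forward feature selection wrapped around each classifier, evaluated with cross-validation in scikit-learn. A synthetic dataset stands in for the Portuguese and mathematics student data, and JRip is omitted because it has no scikit-learn equivalent.

```python
# Wrapper-based feature selection + classifier comparison sketch (synthetic data).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
    "Multi-layer Perceptron": MLPClassifier(max_iter=2000, random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
}

for name, clf in models.items():
    # Wrapper selection: greedily add features judged by the classifier's own CV score.
    selector = SequentialFeatureSelector(clf, n_features_to_select=8,
                                         direction="forward", cv=3)
    pipe = make_pipeline(selector, clf)
    scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```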

Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students’ academic performance

Procedia PDF Downloads 151
17680 A Two Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters for analyzing the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancing approach that combines the join-idle-queue and join-shortest-queue approaches. The authors used the Cloud Analyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms, and as observed, the proposed work is one step ahead of existing techniques.
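
The sketch below illustrates the dispatch logic of such a two-level balancer in plain Python: level one applies join-idle-queue (send the task to a VM that has reported itself idle), and level two falls back to join-shortest-queue. This is assumed conceptual logic, not the authors' Cloud Analyst configuration.

```python
# Conceptual two-level dispatcher: join-idle-queue first, join-shortest-queue fallback.
from collections import deque

class TwoLevelBalancer:
    def __init__(self, n_vms):
        self.queues = [deque() for _ in range(n_vms)]  # pending tasks per VM
        self.idle = deque(range(n_vms))                # VMs that reported themselves idle

    def dispatch(self, task):
        if self.idle:                                  # level 1: join-idle-queue
            vm = self.idle.popleft()
        else:                                          # level 2: join-shortest-queue
            vm = min(range(len(self.queues)), key=lambda i: len(self.queues[i]))
        self.queues[vm].append(task)
        return vm

    def task_finished(self, vm):
        self.queues[vm].popleft()
        if not self.queues[vm]:                        # VM reports itself idle again
            self.idle.append(vm)

# Example: dispatch a burst of ten tasks across four VMs.
lb = TwoLevelBalancer(n_vms=4)
assignments = [lb.dispatch(f"task-{i}") for i in range(10)]
print(assignments)
```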

Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling

Procedia PDF Downloads 414
17679 Modified Bat Algorithm for Economic Load Dispatch Problem

Authors: Daljinder Singh, J.S.Dhillon, Balraj Singh

Abstract:

According to the no-free-lunch theorem, a single search technique cannot perform best in all conditions. A metaheuristic optimization method can be an attractive choice for solving an optimization problem, offering advantages such as robust and reliable performance, global search capability, little information requirement, ease of implementation, parallelism, and no requirement for a differentiable and continuous objective function. In order to synergize exploration and exploitation and to further enhance the performance of the bat algorithm, this paper proposes a modified bat algorithm that adds an additional search procedure based on the bat’s previous experience. The proposed algorithm is used for solving the economic load dispatch (ELD) problem. Practical constraints such as valve-point loading, along with power balance constraints and generator limits, are considered. To take care of the power demand constraint, the variable elimination method is exploited. The proposed algorithm is tested on various ELD problems. The results obtained show that the proposed algorithm is capable of performing better in the majority of the ELD problems considered and is on par with existing algorithms for some of the problems.
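
For context, the sketch below shows the commonly used ELD cost model with valve-point loading and a simple penalty term for the power balance constraint. The generator coefficients are illustrative, not the test systems of the paper, and the penalty shown here is an alternative to the variable elimination method the authors use to satisfy demand exactly.

```python
# ELD cost with valve-point loading plus a power-balance penalty (illustrative data).
import numpy as np

# Fuel cost coefficients a, b, c and valve-point terms e, f for three generators.
a = np.array([0.00028, 0.00056, 0.00324])
b = np.array([8.10, 8.10, 7.74])
c = np.array([550.0, 309.0, 240.0])
e = np.array([300.0, 200.0, 150.0])
f = np.array([0.035, 0.042, 0.063])
p_min = np.array([100.0, 50.0, 80.0])    # MW
demand = 700.0                            # MW (losses neglected in this sketch)

def total_cost(p):
    # Valve-point loading adds a rectified-sine ripple to the quadratic fuel cost.
    return np.sum(a * p**2 + b * p + c + np.abs(e * np.sin(f * (p_min - p))))

def penalized_cost(p, penalty=1e4):
    # Penalty method: quadratic penalty for violating the power balance constraint.
    violation = np.sum(p) - demand
    return total_cost(p) + penalty * violation**2

p_trial = np.array([300.0, 150.0, 250.0])
print("cost:", total_cost(p_trial), " penalized:", penalized_cost(p_trial))
```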

Keywords: bat algorithm, economic load dispatch, penalty method, variable elimination method

Procedia PDF Downloads 450
17678 Understanding the Impact of Li-bis(trifluoromethanesulfonyl)imide Doping on Spiro-OMeTAD Properties and Perovskite Solar Cell Performance

Authors: Martin C. Eze, Gao Min

Abstract:

Lithium bis(trifluoromethanesulfonyl)imide (Li-TFSI) dopant is beneficial in improving the properties of the 2,2′,7,7′-tetrakis(N,N-di-p-methoxyphenylamino)-9,9′-spirobifluorene (Spiro-OMeTAD) transport layer used in perovskite solar cells (PSCs). Properties such as electrical conductivity, band energy mismatch, and refractive index of Spiro-OMeTAD layers are believed to play key roles in PSC performance, but only the dependence of electrical conductivity on Li-TFSI doping has been extensively studied. In this work, the effect of the Li-TFSI doping level on the highest occupied molecular orbital (HOMO) energy, electrical conductivity, and refractive index of the Spiro-OMeTAD film and on PSC performance was demonstrated. The Spiro-OMeTAD films were spin-coated at 4000 rpm for 30 seconds from solutions containing 73.4 mM of Spiro-OMeTAD, 23.6 mM of 4-tert-butylpyridine, 7.6 mM of tris(2-(1H-pyrazol-1-yl)-4-tert-butylpyridine)cobalt(III) tri[bis(trifluoromethane)sulfonimide] (FK209) dopant and Li-TFSI dopant varying from 37 to 62 mM in 1 ml of chlorobenzene. From ultraviolet photoelectron spectroscopy (UPS), ellipsometry, and 4-probe studies, the results show that films deposited from the Spiro-OMeTAD solution doped with 40 mM of Li-TFSI exhibit the highest electrical conductivity of 6.35×10⁻⁶ S/cm, a refractive index of 1.87 at 632.32 nm, a HOMO energy of -5.22 eV and the lowest HOMO energy mismatch of 0.21 eV relative to the HOMO energy of the perovskite layer. The fabricated PSCs show the best power conversion efficiency, open-circuit voltage, and fill factor of 17.10%, 1.1 V, and 70.12%, respectively, for devices based on the Spiro-OMeTAD solution doped with 40 mM of Li-TFSI. This study demonstrates that a Spiro-OMeTAD/Li-TFSI molar ratio of 1.84 is the optimum doping level for Spiro-OMeTAD layer preparation.
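
The optimum ratio quoted above follows directly from the stated concentrations, as the quick check below shows.

```python
# Quick check of the reported optimum molar doping ratio from the concentrations above.
spiro_mM, li_tfsi_mM = 73.4, 40.0
print(f"Spiro-OMeTAD : Li-TFSI = {spiro_mM / li_tfsi_mM:.2f}")   # -> 1.84
```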

Keywords: electrical conductivity, HOMO energy mismatch, lithium bis(trifluoromethanesulfonyl)imide, power conversion efficiency, refractive index

Procedia PDF Downloads 111
17677 Numerical Simulation of Fiber Bragg Grating Spectrum for Mode-І Delamination Detection

Authors: O. Hassoon, M. Tarfoui, A. El Malk

Abstract:

Fiber Bragg grating (FBG) optic sensors are embedded in composite materials to detect and monitor the damage which occurs in composite structures. In this paper, we deal with mode-Ι delamination to determine the resistance of the material to crack propagation, and use coupled mode theory and the T-matrix method to simulate the FBG spectrum for both uniform and non-uniform strain distributions. The double cantilever beam (DCB) test is modeled in FEM to determine the longitudinal strain; two models are used, the first being a global half model and the second a sub-model representing the FBG with a refined mesh. This method can simulate the damage in the composite structure and convert the strain to the wavelength shift of the FBG spectrum.
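
A hedged sketch of the piecewise-uniform T-matrix approach for an FBG reflection spectrum under a non-uniform axial strain is given below. The grating parameters and the strain profile are illustrative only, not the values extracted from the DCB finite element model of the paper.

```python
# Piecewise-uniform transfer-matrix (T-matrix) sketch of an FBG reflection spectrum
# under a non-uniform strain profile (illustrative parameters).
import numpy as np

n_eff = 1.447            # effective index
lambda_b0 = 1550e-9      # unstrained Bragg wavelength [m]
L = 10e-3                # grating length [m]
kappa = 300.0            # coupling coefficient [1/m]
p_e = 0.22               # effective photo-elastic coefficient
M = 100                  # number of uniform sub-sections
dz = L / M
strain = 500e-6 * np.linspace(0, 1, M)   # example non-uniform strain (linear ramp)

wavelengths = np.linspace(1549e-9, 1552e-9, 800)
R = np.zeros_like(wavelengths)

for i, lam in enumerate(wavelengths):
    F = np.eye(2, dtype=complex)
    for eps in strain:
        lam_b = lambda_b0 * (1 + (1 - p_e) * eps)          # local Bragg wavelength
        sigma = 2 * np.pi * n_eff * (1 / lam - 1 / lam_b)   # local detuning
        gamma = np.sqrt(complex(kappa**2 - sigma**2))
        Fi = np.array([[np.cosh(gamma * dz) - 1j * (sigma / gamma) * np.sinh(gamma * dz),
                        -1j * (kappa / gamma) * np.sinh(gamma * dz)],
                       [1j * (kappa / gamma) * np.sinh(gamma * dz),
                        np.cosh(gamma * dz) + 1j * (sigma / gamma) * np.sinh(gamma * dz)]])
        F = Fi @ F
    # Reflectivity; the single uniform section limit reduces to tanh^2(kappa*L).
    R[i] = abs(F[1, 0] / F[0, 0])**2

print("peak reflectivity:", R.max())
```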

Keywords: fiber bragg grating, delamination detection, DCB, FBG spectrum, structure health monitoring

Procedia PDF Downloads 350
17676 Evaluation of the Impact of Reducing the Traffic Light Cycle for Cars to Improve Non-Vehicular Transportation: A Case of Study in Lima

Authors: Gheyder Concha Bendezu, Rodrigo Lescano Loli, Aldo Bravo Lizano

Abstract:

In the big urbanized cities of Latin America, motor vehicles have priority over non-motorized vehicles and pedestrians. This is an important problem that affects people's health and quality of life; the lack of inclusion of pedestrians makes it difficult for them to move smoothly and safely, since the city has been planned for the transit of motor vehicles. Faced with the new trend towards sustainable and economical transport, the city is forced to develop infrastructure in order to incorporate pedestrians and users of non-motorized vehicles into the transport system. The present research studies the influence of non-motorized vehicles on an avenue and the optimization of the traffic light cycle, based on simulation in Synchro software, to improve the flow of non-motorized vehicles. The evaluation is of the microscopic type; for this reason, field data were collected, such as vehicular, pedestrian, and non-motorized vehicle user demand. The speed and travel time values are used to represent the current scenario, which contains the existing problem. These data allow the creation of a microsimulation model in Vissim software, which is later calibrated and validated so that its behavior is similar to reality. The results of this model are compared with the efficiency parameters of the proposed model; these parameters are the queue length, the travel speed, and mainly the travel times of the users at this intersection. The results reflect a reduction of 27% in travel time, that is, an improvement of the proposed model over the current one for this major avenue. The queue length of motor vehicles is also reduced by 12.5%, a considerable improvement. All this represents an improvement in the level of service and in the quality of life of users.

Keywords: bikeway, microsimulation, pedestrians, queue length, traffic light cycle, travel time

Procedia PDF Downloads 147
17675 Use of Cassava Waste and Its Energy Potential

Authors: I. Inuaeyen, L. Phil, O. Eni

Abstract:

Fossil fuels have been the main source of global energy for many decades, accounting for about 80% of global energy need. This is beginning to change, however, with increasing concern about greenhouse gas emissions, which come mostly from fossil fuel combustion. Greenhouse gases such as carbon dioxide are responsible for driving climate change. As a result, there has been a shift towards cleaner and renewable sources of energy as a strategy for stemming greenhouse gas emissions into the atmosphere. The production of bio-products such as bio-fuel, bio-electricity, bio-chemicals, and bio-heat using biomass materials in accordance with the bio-refinery concept holds great potential for reducing the high dependence on fossil fuels and their resources. The bio-refinery concept promotes efficient utilisation of biomass material for the simultaneous production of a variety of products in order to minimize or eliminate waste materials. This will ultimately reduce greenhouse gas emissions into the environment. In Nigeria, cassava solid waste from cassava processing facilities has been identified as a vital feedstock for the bio-refinery process. Cassava is a staple food in Nigeria and one of the crops most widely cultivated by farmers across Nigeria. As a result, there is an abundant supply of cassava waste in Nigeria. In this study, the aim is to explore opportunities for converting cassava waste into a range of bio-products such as butanol, ethanol, electricity, heat, methanol and furfural using a combination of biochemical, thermochemical and chemical conversion routes. The best process scenario will be identified through the evaluation of economic analysis, energy efficiency, life cycle analysis and social impact. The study will be carried out by developing a model representing different process options for cassava waste conversion to useful products. The model will be developed using the Aspen Plus process simulation software. Process economic analysis will be done using the Aspen Icarus software. So far, a comprehensive survey of the literature has been conducted. This includes studies on the conversion of cassava solid waste to a variety of bio-products using different conversion techniques, cassava waste production in Nigeria, and the modelling and simulation of waste conversion to useful products, among others. Also, the statistical distribution of cassava solid waste production in Nigeria has been established, and key literature with useful parameters for developing the different cassava waste conversion processes has been identified. In future work, detailed modelling of the different process scenarios will be carried out, and the models will be validated using data from the literature and demonstration plants. A techno-economic comparison of the various process scenarios will be carried out to identify the best scenario, using process economics, life cycle analysis, energy efficiency and social impact as the performance indexes.

Keywords: bio-refinery, cassava waste, energy, process modelling

Procedia PDF Downloads 354
17674 Numerical Simulation of Unsteady Cases of Fluid Flow Using Modified Dynamic Boundary Condition (mDBC) in Smoothed Particle Hydrodynamics Models

Authors: Exa Heydemans, Jessica Sjah, Dwinanti Rika Marthanty

Abstract:

This paper presents numerical simulations using an open boundary algorithm with the modified dynamic boundary condition (mDBC) for weakly compressible smoothed particle hydrodynamics models from the particle-based code DualSPHysics. The algorithm is studied with a view towards problems of piping erosion in dams and dikes. A 2D model of unsteady fluid flow past a fixed cylinder is simulated, where various Reynolds numbers (Re = 40, 60, 80, and 100) and different model resolutions are considered. A constant velocity with different viscosity values is used to generate the various Reynolds numbers, and different numbers of particles over the cylinder are used for the resolutions. The interaction between the solid particles of the cylinder and the fluid particles is considered. The cylinder is affected by the hydrodynamic force caused by the flow of fluid particles. The solid particles of the cylinder are the observation points used to obtain the force and pressure due to the hydrodynamic forces. To demonstrate the capability to model 2D unsteady flow at various Reynolds numbers, the pressure coefficient, drag coefficient, lift coefficient, and Strouhal number obtained from the simulations are compared with previous work from the literature.
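
The post-processing sketch below shows how such non-dimensional coefficients and the Strouhal number are typically extracted from force histories on the cylinder. The synthetic signals stand in for DualSPHysics output and all parameter values are illustrative.

```python
# Drag/lift coefficients and Strouhal number from cylinder force histories (illustrative).
import numpy as np

rho, U, D = 1000.0, 0.1, 0.02        # fluid density, inflow velocity, cylinder diameter
dt = 1e-3
t = np.arange(0, 20, dt)
f_shed = 0.5                          # Hz, synthetic shedding frequency for the example
drag = 0.008 + 0.0005 * np.sin(2 * np.pi * 2 * f_shed * t)   # N per unit span
lift = 0.003 * np.sin(2 * np.pi * f_shed * t)                # N per unit span

Cd = 2 * drag.mean() / (rho * U**2 * D)          # mean drag coefficient (2D, per unit span)
Cl_amp = 2 * np.abs(lift).max() / (rho * U**2 * D)

# Strouhal number from the dominant frequency of the lift signal.
spectrum = np.abs(np.fft.rfft(lift - lift.mean()))
freqs = np.fft.rfftfreq(len(lift), dt)
St = freqs[np.argmax(spectrum)] * D / U

print(f"Cd = {Cd:.2f}, Cl amplitude = {Cl_amp:.2f}, St = {St:.3f}")
```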

Keywords: hydrodynamics, internal erosion, dualsphysics, viscous fluid flow

Procedia PDF Downloads 145
17673 Technical Parameters Evaluation for Caps to Apucarana/Parana - Brazil APL

Authors: Cruz, G. P., Nagamatsu, R. N., Scacchetti, F. A. P., Merlin, F. K.

Abstract:

This study aims to assess a set of technical parameters that ensure quality products for the companies that produce caps in the Apucarana/PR APL, the city that produces most Brazilian caps, in order to verify the potential of Brazilian caps to compete with international brands recognized as the standard of excellence when it comes to the quality of their products. The technical parameters were determined from ABNT textile standards: a total of six technical parameters, providing eight tests for cotton caps. For the evaluation, we used as a reference a leading brand recognized worldwide (based on its sales volume in $) for comparison with three companies of the Apucarana APL. The results showed that, in the eight tests, the Apucarana companies did not perform better than the competitor: they obtained the same results in three tests and lower performance in five. Given these values, it is concluded that the local caps are not far from reaching the quality of the leading brand. It is recommended that the APL companies use the parameters to evaluate their products, using this information to support decision-making that seeks to improve both the product design and its production process, paving the way for faster international recognition. Thus, they may gain an edge over their main competitor.

Keywords: technical parameters, making caps, quality, evaluation

Procedia PDF Downloads 329
17672 Seismic Response Mitigation of Structures Using Base Isolation System Considering Uncertain Parameters

Authors: Rama Debbarma

Abstract:

The present study deals with the performance of a linear base isolation system in mitigating the seismic response of structures characterized by random system parameters. This involves optimization of the tuning ratio and damping properties of the base isolation system considering uncertain system parameters. The efficiency of a base isolator may be reduced if it is not tuned to the vibration mode it is designed to suppress, due to the unavoidable presence of system parameter uncertainty. With the aid of matrix perturbation theory and a first-order Taylor series expansion, the total probability concept is used to evaluate the unconditional response of the primary structures considering random system parameters. For this, the conditional second-order information of the response quantities is obtained in a random vibration framework using the state space formulation. Subsequently, the maximum unconditional root mean square displacement of the primary structures is used as the objective function to obtain the optimum damping parameters. A numerical study is performed to elucidate the effect of parameter uncertainties on the optimization of the linear base isolator parameters and on system performance.

Keywords: linear base isolator, earthquake, optimization, uncertain parameters

Procedia PDF Downloads 409
17671 Development of Recycled-Modified Asphalt Using Basalt Aggregate

Authors: Dong Wook Lee, Seung Hyun Kim, Jeongho Oh

Abstract:

With the strengthened regulation on the mandatory use of recycled aggregate, the development of construction materials using recycled aggregate has recently increased. This study aimed to secure the performance of asphalt concrete mixture by developing a recycled-modified asphalt using recycled basalt aggregate from the Jeju area. The strength of the basalt aggregate from the Jeju area used in this study was similar to that of general aggregate, while the specific surface area was larger due to the development of pores. The modified asphalt was developed using a general aggregate to recycled aggregate ratio of 7:3, and the results indicated that the Marshall stability increased by 27% compared to that of the asphalt concrete mixture using only general aggregate, while the flow values showed similar levels. Also, the indirect tensile strength increased by 79%, and the toughness increased by more than 100%. In addition, the TSR for examining moisture resistance was 0.95, indicating that the reduction in the indirect tensile strength due to moisture was very low (around 5%), and the developed recycled-modified asphalt satisfied all the quality standards for asphalt concrete mixtures.

Keywords: asphalt concrete mixture, performance grade, recycled basalt aggregate, recycled-modified asphalt

Procedia PDF Downloads 342
17670 A Dynamic Model for Circularity Assessment of Nutrient Recovery from Domestic Sewage

Authors: Anurag Bhambhani, Jan Peter Van Der Hoek, Zoran Kapelan

Abstract:

The food system depends on the availability of phosphorus (P) and nitrogen (N). A growing population, depleting phosphorus reserves and energy-intensive industrial nitrogen fixation are threats to their future availability. Recovering P and N from domestic sewage water offers a solution. Recovered P and N can be applied to agricultural land, replacing virgin P and N. Thus, recovery from sewage water offers a solution befitting a circular economy. To ensure minimum waste and maximum resource efficiency, a circularity assessment method is crucial to optimize nutrient flows and minimize losses. The Material Circularity Indicator (MCI) is a useful method to quantify the circularity of materials. It was developed for materials that remain within the market and was recently extended to include biotic materials that may be composted or used for energy recovery after end-of-use. However, the MCI has not been used in the context of nutrient recovery. Besides, the MCI is time-static, i.e., it cannot account for dynamic systems such as the terrestrial nutrient cycles. Nutrient application to agricultural land is a highly dynamic process wherein flows and stocks change with time. The rate of recycling of nutrients in nature can depend on numerous factors such as prevailing soil conditions, local hydrology, the presence of animals, etc. Therefore, a dynamic model of nutrient flows with indicators is needed for the circularity assessment. A simple substance flow model of P and N will be developed with the help of flow equations and transfer coefficients that incorporate the nutrient recovery step along with the agricultural application, the volatilization and leaching processes, plant uptake and subsequent animal and human uptake. The model is then used for calculating the proportions of linear and restorative flows (coming from reused/recycled sources). The model will simulate the adsorption process based on the quantity of adsorbent and the nutrient concentration in the water. Thereafter, the application of the adsorbed nutrients to agricultural land will be simulated based on adsorbate release kinetics, local soil conditions, hydrology, vegetation, etc. Based on the model, the restorative nutrient flow (returning to the sewage plant following human consumption) will be calculated. The developed methodology will be applied to a case study of resource recovery from wastewater. In this case study, located in Italy, biochar or zeolite is to be used for the recovery of P and N from domestic sewage through adsorption and thereafter used as a slow-release fertilizer in agriculture. Using this model, information regarding the efficiency of nutrient recovery and application can be generated, which can help to optimize the recovery process and the application of the nutrients and, consequently, reduce the dependence of the food system on the virgin extraction of P and N.
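
A minimal dynamic substance-flow sketch in the spirit of the model described above is given below for P, using placeholder transfer coefficients (not the Italian case-study data). Each year the recovered fraction returns to agriculture, losses occur at each step, and a simple circularity ratio is computed as the restorative share of the total flow.

```python
# Toy dynamic substance flow model for P with transfer coefficients (placeholder values).
import numpy as np

years = 20
influent = 100.0          # P entering the sewage plant each year [t/yr]
tc_recovery = 0.6         # fraction of influent P captured by the adsorbent
tc_release = 0.8          # fraction of applied P released in plant-available form per year
tc_uptake = 0.7           # fraction of released P taken up by crops (rest leaches/runs off)
tc_to_food = 0.9          # fraction of crop P reaching human consumption

soil_stock = 0.0
restorative, linear = [], []
for _ in range(years):
    recovered = tc_recovery * influent
    soil_stock += recovered
    released = tc_release * soil_stock
    soil_stock -= released
    uptake = tc_uptake * released
    returned = tc_to_food * uptake                    # P returning via food to the sewage plant
    restorative.append(returned)
    linear.append(influent - returned)                # demand still met by virgin P

circularity = np.sum(restorative) / (np.sum(restorative) + np.sum(linear))
print(f"share of restorative P flow over {years} years: {circularity:.2f}")
```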

Keywords: circular economy, dynamic substance flow, nutrient cycles, resource recovery from water

Procedia PDF Downloads 187
17669 A Model of Empowerment Evaluation of Knowledge Management in Private Banks Using Fuzzy Inference System

Authors: Nazanin Pilevari, Kamyar Mahmoodi

Abstract:

The purpose of this research is to provide a model based on a fuzzy inference system for evaluating knowledge management empowerment. The first prototype of the research was developed based on a study of the literature. In the next step, the model was provided to experts and, after consensus-based revisions based on the views of the experts and the fuzzy Delphi technique, the components and indices of the research model were finalized. Culture, structure, IT and leadership were considered as the dimensions of empowerment. Then, in order to collect and extract data for the fuzzy inference system based on knowledge and experience, the experts were interviewed. The values obtained from the designed fuzzy inference system made review and assessment of the organization's knowledge management empowerment possible. After the design and validation of the system to measure the indices (knowledge management empowerment and the inputs into the fuzzy inference system) at AYANDEH Bank, a questionnaire was used. In the case of this bank, the system output indicates that knowledge management empowerment, culture, organizational structure and leadership are at a moderate level, while information technology empowerment is relatively high. Based on these results, the status of knowledge management empowerment in AYANDEH Bank was moderate. Finally, some suggestions for improving the current situation of the bank were provided. According to the review of previous research, the use of a fuzzy inference system as a powerful tool for assessing knowledge management and knowledge management empowerment, and conducting such an assessment in the field of banking, are the innovations of this research.
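
To illustrate the mechanics of such an evaluation, the sketch below implements a minimal Mamdani-style fuzzy inference step in plain Python with two inputs (culture and IT, on 0-10 scales) and one output (knowledge management empowerment). The membership functions and rule base are illustrative assumptions, not the study's actual model.

```python
# Minimal Mamdani fuzzy inference sketch: min for AND, clipping, max-aggregation,
# centroid defuzzification (illustrative memberships and rules).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def km_empowerment(culture, it, n=201):
    out = np.linspace(0, 10, n)                    # output universe (empowerment score)
    cult = {"low": tri(culture, 0, 2, 5), "high": tri(culture, 5, 8, 10)}
    tech = {"low": tri(it, 0, 2, 5), "high": tri(it, 5, 8, 10)}
    agg = np.zeros(n)
    # Rule 1: culture high AND IT high -> empowerment high
    agg = np.maximum(agg, np.minimum(min(cult["high"], tech["high"]), tri(out, 6, 8, 10)))
    # Rule 2: culture low AND IT low -> empowerment low
    agg = np.maximum(agg, np.minimum(min(cult["low"], tech["low"]), tri(out, 0, 2, 4)))
    # Rule 3: one high, one low -> empowerment moderate
    mixed = max(min(cult["high"], tech["low"]), min(cult["low"], tech["high"]))
    agg = np.maximum(agg, np.minimum(mixed, tri(out, 3, 5, 7)))
    return float(np.sum(out * agg) / (np.sum(agg) + 1e-9))  # centroid defuzzification

print(f"KM empowerment score: {km_empowerment(culture=5.5, it=8.0):.2f}")
```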

Keywords: knowledge management, knowledge management empowerment, fuzzy inference system, fuzzy Delphi

Procedia PDF Downloads 346
17668 Effects of Aging on Auditory and Visual Recall Abilities

Authors: Rashmi D. G., Aishwarya G., Niharika M. K.

Abstract:

Purpose: Free recall tasks target cognitive and linguistic processes like episodic memory, lexical access and retrieval. Consequently, the free recall paradigm is suitable for assessing memory deterioration caused by aging; this also depends on linguistic factors, including the use of first and second languages and their relative proficiency. Hence, the present study aimed to determine whether aging has an effect on visual and auditory recall abilities. Method: Twenty young adults (mean age: 25.4±0.99) and older adults (mean age: 63.3±3.51) participated in the study. Participants performed a free recall task under two conditions (related and unrelated) and two modalities (visual and auditory), where they were instructed to recall as many items as possible with no specific order or time limit. Results: Free recall performance was calculated as the mean number of correctly recalled items. Although younger participants recalled a higher number of items, performance across conditions and modalities was variable. Conclusion: In summary, the findings of the present study revealed an age-related decline in the efficiency of episodic memory, which is crucial for remembering recent events.

Keywords: recall, episodic memory, aging, modality

Procedia PDF Downloads 80
17667 A Biomechanical Model for the Idiopathic Scoliosis Using the Antalgic-Trak Technology

Authors: Joao Fialho

Abstract:

The mathematical modelling of idiopathic scoliosis has been studied throughout the years. The models presented in those papers are based on orthotic stabilization of idiopathic scoliosis, in which a transversal force is applied to the human spine in a continuous manner. When considering the ATT (Antalgic-Trak Technology) device, the existing models cannot be used, as the forces applied are neither transversal nor applied in a continuous manner; in this device, vertical traction is applied. In this study, we propose to model idiopathic scoliosis using the ATT (Antalgic-Trak Technology) device and, with the parameters obtained from the mathematical modelling, set up an individualized, case-by-case therapy plan for each patient.

Keywords: idiopathic scoliosis, mathematical modelling, human spine, Antalgic-Trak technology

Procedia PDF Downloads 254
17666 Optimization of Hydraulic Fracturing for Horizontal Wells in Enhanced Geothermal Reservoirs

Authors: Qudratullah Muradi

Abstract:

Geothermal energy is a renewable energy source that can be found in abundance on our planet. Only a small fraction of it is currently converted to electrical power, though in recent years installed geothermal capacity has increased considerably all over the world. In this paper, we assume a model for the design of an Enhanced Geothermal System (EGS). We used the Computer Modelling Group (CMG) reservoir simulation software to create a typical Hot Dry Rock (HDR) reservoir. In this research, two wells, one injecting cold water and one producing hot water, are included in the model. Several hydraulic fractures are created by the mentioned software, and cold water is injected in order to produce energy from the reservoir. The results of injecting cold water into the reservoir and extracting geothermal energy are presented in graphs at the end of this research. The production of energy is quantified over a period of 10 years.

Keywords: geothermal energy, EGS, HDR, hydraulic fracturing

Procedia PDF Downloads 178
17665 Balancing and Synchronization Control of a Two Wheel Inverted Pendulum Vehicle

Authors: Shiuh-Jer Huang, Shin-Ham Lee, Sheam-Chyun Lin

Abstract:

A two-wheel inverted pendulum (TWIP) vehicle is built with two hub DC motors for motion control evaluation. An Arduino Nano microcontroller is chosen as the control kernel for this electric test plant. Accelerometer and gyroscope sensors are built in to measure the tilt angle and angular velocity of the inverted pendulum vehicle. Since the TWIP has a significant hub motor dead zone and nonlinear system dynamics, the vehicle system is difficult to control with a traditional model-based controller. The intelligent, model-free fuzzy sliding mode controller (FSMC) was employed as the main control algorithm. Then, intelligent controllers are designed for TWIP balance control and two-wheel synchronization control purposes.

Keywords: balance control, synchronization control, two-wheel inverted pendulum, TWIP

Procedia PDF Downloads 375
17664 Testing Plastic-Sand Construction Blocks Made from Recycled Polyethylene Terephthalate (rPET)

Authors: Cassi Henderson, Lucia Corsini, Shiv Kapila, Egle Augustaityte, Tsemaye Uwejamomere Zinzan Gurney, Aleyna Yildirim

Abstract:

Plastic pollution is a major threat to human and planetary health. In low- and middle-income countries, plastic waste poses a major problem for marginalized populations who lack access to formal waste management systems. This study explores the potential for converting waste plastic into construction blocks. It is the first study to analyze the use of polyethylene terephthalate (PET) as a binder in plastic-sand bricks. Unlike previous studies of plastic-sand bricks, this research tests the properties of bricks that were made using a low-cost kiln technology that was co-designed with a rural, coastal community in Kenya. The mechanical strength, fire resistance and water absorption properties of the bricks are tested in this study. The findings show that the bricks meet structural standards for mechanical performance, fire resistance and water absorption. It was found that a 30:70 PET-to-sand ratio demonstrated the best overall performance.

Keywords: recycling, PET, plastic, sustainable construction, sustainable development

Procedia PDF Downloads 111
17663 A New Framework for ECG Signal Modeling and Compression Based on Compressed Sensing Theory

Authors: Siavash Eftekharifar, Tohid Yousefi Rezaii, Mahdi Shamsi

Abstract:

The purpose of this paper is to exploit the compressed sensing (CS) method in order to model and compress electrocardiogram (ECG) signals at a high compression ratio. In order to obtain a sparse representation of the ECG signals, first a suitable basis matrix with Gaussian kernels, which are shown to fit the ECG signals nicely, is constructed. Then the sparse model is extracted by applying an optimization technique. Finally, CS theory is utilized to obtain a compressed version of the sparse signal. Reconstruction of the ECG signal from the compressed version is also carried out to demonstrate the reliability of the algorithm. At this stage, a greedy optimization technique is used to reconstruct the ECG signal, and the Mean Square Error (MSE) is calculated to evaluate the precision of the proposed compression method.
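
A hedged sketch of this pipeline is shown below: a Gaussian-kernel dictionary gives a sparse representation of an ECG-like signal, a random matrix compresses it, and a greedy solver (orthogonal matching pursuit) reconstructs it. The synthetic beat, dictionary widths and sizes are illustrative, not the paper's settings.

```python
# CS-style ECG compression sketch with a Gaussian-kernel dictionary and OMP recovery.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
N = 256
t = np.arange(N)

# Gaussian-kernel dictionary: kernels of several widths centred on every sample.
widths = [2.0, 5.0, 12.0]
Psi = np.column_stack([np.exp(-0.5 * ((t - c) / w)**2) for w in widths for c in t])
Psi /= np.linalg.norm(Psi, axis=0)

# Synthetic ECG-like beat: a sharp "QRS" spike plus broader "P" and "T" waves.
ecg = (1.0 * np.exp(-0.5 * ((t - 128) / 3)**2)
       + 0.25 * np.exp(-0.5 * ((t - 95) / 9)**2)
       + 0.35 * np.exp(-0.5 * ((t - 175) / 14)**2))

M = 64                                   # number of CS measurements (4:1 compression)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ ecg                            # compressed measurements

# Greedy reconstruction: solve y ~= (Phi @ Psi) x with a sparse x, then map back.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10, fit_intercept=False)
omp.fit(Phi @ Psi, y)
ecg_hat = Psi @ omp.coef_

mse = np.mean((ecg - ecg_hat)**2)
print(f"reconstruction MSE: {mse:.2e}")
```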

Keywords: compressed sensing, ECG compression, Gaussian kernel, sparse representation

Procedia PDF Downloads 445
17662 An Adaptive Oversampling Technique for Imbalanced Datasets

Authors: Shaukat Ali Shahee, Usha Ananthakumar

Abstract:

A data set exhibits the class imbalance problem when one class has very few examples compared to the other class; this is also referred to as between-class imbalance. Traditional classifiers fail to classify the minority class examples correctly due to their bias towards the majority class. Apart from between-class imbalance, within-class imbalance, where classes are composed of different numbers of sub-clusters containing different numbers of examples, also deteriorates the performance of the classifier. Previously, many methods have been proposed for handling the imbalanced dataset problem. These methods can be classified into four categories: data preprocessing, algorithm-based methods, cost-based methods and ensembles of classifiers. Data preprocessing techniques have shown great potential as they attempt to improve the data distribution rather than the classifier. A data preprocessing technique handles class imbalance either by increasing the minority class examples or by decreasing the majority class examples. Decreasing the majority class examples leads to loss of information, and when the minority class is absolutely rare, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address both between-class imbalance and within-class imbalance simultaneously. In this paper, we propose a method that handles between-class imbalance and within-class imbalance simultaneously for the binary classification problem. Removing between-class imbalance and within-class imbalance simultaneously eliminates the biases of the classifier towards bigger sub-clusters by minimizing the error domination of bigger sub-clusters in the total error. The proposed method uses model-based clustering to find the presence of sub-clusters or sub-concepts in the dataset. The number of examples oversampled among the sub-clusters is determined based on the complexity of the sub-clusters. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Lowner-John ellipsoid to increase the accuracy of the classifier. In this study, a neural network is used, as it is a classifier in which the total error is minimized, and removing the between-class imbalance and within-class imbalance simultaneously helps the classifier give equal weight to all the sub-clusters irrespective of the classes. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods in terms of various accuracy measures. Thus, the proposed method can serve as a good alternative for handling various problem domains, such as credit scoring, customer churn prediction and financial distress, that typically involve imbalanced data sets.
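
The simplified sketch below conveys the core idea of sub-cluster-aware oversampling: fit a Gaussian mixture (model-based clustering) to the minority class, then draw more synthetic points from the smaller sub-clusters so each sub-cluster ends up equally represented. The paper's complexity-based allocation and Lowner-John ellipsoid step are not reproduced here.

```python
# Simplified sub-cluster-aware oversampling of the minority class via a Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

def cluster_aware_oversample(X_min, n_target, n_components=3, seed=0):
    """Return synthetic minority samples so the class reaches n_target examples."""
    rng = np.random.default_rng(seed)
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(X_min)
    labels = gmm.predict(X_min)
    counts = np.bincount(labels, minlength=n_components)

    per_cluster_target = n_target // n_components       # equalise sub-cluster sizes
    synthetic = []
    for k in range(n_components):
        need = max(per_cluster_target - counts[k], 0)
        if need == 0:
            continue
        # Sample new points from the fitted Gaussian of sub-cluster k.
        samples = rng.multivariate_normal(gmm.means_[k], gmm.covariances_[k], size=need)
        synthetic.append(samples)
    return np.vstack(synthetic) if synthetic else np.empty((0, X_min.shape[1]))

# Example: an imbalanced minority class of 60 points in 2-D, oversampled to 300.
rng = np.random.default_rng(0)
X_min = np.vstack([rng.normal([0, 0], 0.3, (40, 2)),
                   rng.normal([3, 3], 0.3, (15, 2)),
                   rng.normal([0, 4], 0.3, (5, 2))])
X_new = cluster_aware_oversample(X_min, n_target=300)
print("synthetic samples generated:", X_new.shape[0])
```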

Keywords: classification, imbalanced dataset, Lowner-John ellipsoid, model based clustering, oversampling

Procedia PDF Downloads 402
17661 Efficient Credit Card Fraud Detection Based on Multiple ML Algorithms

Authors: Neha Ahirwar

Abstract:

In the contemporary digital era, the rise of credit card fraud poses a significant threat to both financial institutions and consumers. As fraudulent activities become more sophisticated, there is an escalating demand for robust and effective fraud detection mechanisms. Advanced machine learning algorithms have become crucial tools in addressing this challenge. This paper conducts a thorough examination of the design and evaluation of a credit card fraud detection system, utilizing four prominent machine learning algorithms: random forest, logistic regression, decision tree, and XGBoost. The surge in digital transactions has opened avenues for fraudsters to exploit vulnerabilities within payment systems. Consequently, there is an urgent need for proactive and adaptable fraud detection systems. This study addresses this imperative by exploring the efficacy of machine learning algorithms in identifying fraudulent credit card transactions. The selection of random forest, logistic regression, decision tree, and XGBoost for scrutiny in this study is based on their documented effectiveness in diverse domains, particularly in credit card fraud detection. These algorithms are renowned for their capability to model intricate patterns and provide accurate predictions. Each algorithm is implemented and evaluated for its performance in a controlled environment, utilizing a diverse dataset comprising both genuine and fraudulent credit card transactions.
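
As a rough illustration of the comparison described above, the sketch below trains the listed classifier families on a synthetic, heavily imbalanced dataset (standing in for real transaction data) and reports accuracy and fraud-class recall; scikit-learn's GradientBoostingClassifier is used as a stand-in for XGBoost so the example stays dependency-free.

```python
# Illustrative fraud-detection comparison harness on synthetic imbalanced data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

X, y = make_classification(n_samples=20000, n_features=20, weights=[0.98, 0.02],
                           random_state=7)  # ~2% "fraud" class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, test_size=0.3, random_state=7)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000, class_weight="balanced"),
    "Decision Tree": DecisionTreeClassifier(class_weight="balanced", random_state=7),
    "Random Forest": RandomForestClassifier(n_estimators=200, class_weight="balanced",
                                            random_state=7),
    # Stand-in for XGBoost; xgboost.XGBClassifier would slot in the same way.
    "Gradient Boosting (XGBoost stand-in)": GradientBoostingClassifier(random_state=7),
}

for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(f"{name}: accuracy={accuracy_score(y_te, pred):.3f}, "
          f"fraud recall={recall_score(y_te, pred):.3f}")
```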

Keywords: efficient credit card fraud detection, random forest, logistic regression, XGBoost, decision tree

Procedia PDF Downloads 44
17660 Mathematical Modeling and Analysis of COVID-19 Pandemic

Authors: Thomas Wetere

Abstract:

Background: The coronavirus disease 2019 (COVID-19) pandemic, caused by a severe infectious virus with highly transmissible variants, has become a global public health threat. It has taken the lives of more than 4 million people so far. What makes the disease the worst of all is that no specific effective treatment is available, and its dynamics are not well researched and understood. Methodology: To end the global COVID-19 pandemic, implementation of multiple population-wide strategies, including vaccination, environmental factors, government action, testing, and contact tracing, is required. In this article, a new mathematical model incorporating both temperature and government action to study the dynamics of the COVID-19 pandemic has been developed and comprehensively analysed. The model considers eight stages of infection: susceptible (S), infected asymptomatic and undetected (IAU), infected asymptomatic and detected (IAD), infected symptomatic and undetected (ISU), infected symptomatic and detected (ISD), hospitalized or threatened (H), recovered (R) and dead (D). Results: The existence as well as the non-negativity of the solution to the model is verified, and the basic reproduction number is calculated. Besides, stability conditions are checked, and finally, simulation results are compared with real data. The results demonstrate that effective government action will need to be combined with vaccination to end the ongoing COVID-19 pandemic. Conclusion: Vaccination and government action are the crucial measures for controlling the COVID-19 pandemic. Besides, as the cost of vaccination might be high, we recommend an optimal control approach to reduce the cost and the number of infected individuals. Moreover, the analysis of the model shows that, in order to prevent the COVID-19 pandemic, the government must strictly manage and carry out its COVID-19 policy. This, in turn, supports health campaigning and raises health literacy, which plays a role in controlling the rapid spread of the disease. We finally strongly believe that our study will play its own role in the current effort of controlling the pandemic.
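
For intuition, the sketch below solves a reduced SIRD-style model in which a government-action factor scales the transmission rate. It is a stand-in for, not a reproduction of, the paper's full eight-compartment model, and all parameter values are illustrative.

```python
# Reduced SIRD-style model with a government-action factor (illustrative parameters).
from scipy.integrate import solve_ivp

beta0, gamma, mu = 0.35, 0.10, 0.005   # transmission, recovery, death rates [1/day]
alpha = 0.6                             # strength of government action (0 = none, 1 = full)
N = 1e6

def rhs(t, y):
    S, I, R, D = y
    beta = beta0 * (1 - alpha)          # government action suppresses transmission
    dS = -beta * S * I / N
    dI = beta * S * I / N - (gamma + mu) * I
    dR = gamma * I
    dD = mu * I
    return [dS, dI, dR, dD]

sol = solve_ivp(rhs, (0, 300), [N - 100, 100, 0, 0], max_step=1.0)
R0_effective = beta0 * (1 - alpha) / (gamma + mu)
print(f"effective reproduction number: {R0_effective:.2f}")
print(f"peak infections: {sol.y[1].max():.0f}, total deaths: {sol.y[3, -1]:.0f}")
```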

Keywords: modeling, COVID-19, MCMC, stability

Procedia PDF Downloads 95
17659 Fast and Scale-Adaptive Target Tracking via PCA-SIFT

Authors: Yawen Wang, Hongchang Chen, Shaomei Li, Chao Gao, Jiangpeng Zhang

Abstract:

As the main challenges in target tracking are accounting for target scale change and running in real time, we combine the Mean-Shift and PCA-SIFT algorithms to solve the problem. We introduce a similarity comparison method to determine how the target scale changes and take different strategies according to the situation. Since a growing target scale causes location error, we employ backward tracking to reduce the error. The Mean-Shift algorithm performs poorly when tracking a scale-changing target due to the fixed bandwidth of its kernel function. In order to overcome this problem, we introduce PCA-SIFT matching. Through keypoint matching between the target and the template, the scale of the tracking window can be adjusted adaptively. Because this algorithm is sensitive to wrong matches, we introduce RANSAC to reduce mismatches as far as possible. Furthermore, target relocation is triggered when the number of matches is too small. In addition, we comprehensively consider target deformation and error accumulation to put forward a new template update method. Experiments on five image sequences and comparisons with six other algorithms demonstrate the favorable performance of the proposed tracking algorithm.
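
The sketch below illustrates the keypoint-matching and RANSAC step described above using plain SIFT in OpenCV (PCA-SIFT itself is not shipped with OpenCV): match the template against the current frame, reject wrong matches with RANSAC, and estimate the scale change used to resize the tracking window. The image file names in the usage comment are assumptions.

```python
# SIFT matching + RANSAC scale-change estimate for adaptive tracking-window resizing.
import cv2
import numpy as np

def estimate_scale_change(template_gray, frame_gray, ratio=0.75):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(template_gray, None)
    kp2, des2 = sift.detectAndCompute(frame_gray, None)
    if des1 is None or des2 is None:
        return None  # too few keypoints: trigger target relocation

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < ratio * n.distance]          # Lowe ratio test
    if len(good) < 8:
        return None  # too few matches: trigger target relocation

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # RANSAC rejects wrong matches while fitting a similarity transform.
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC,
                                             ransacReprojThreshold=3.0)
    if M is None:
        return None
    scale = np.sqrt(M[0, 0]**2 + M[0, 1]**2)   # isotropic scale of the similarity transform
    return scale  # used to resize the Mean-Shift tracking window adaptively

# Usage (assumed file names):
# scale = estimate_scale_change(cv2.imread("template.png", 0), cv2.imread("frame.png", 0))
```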

Keywords: target tracking, PCA-SIFT, mean-shift, scale-adaptive

Procedia PDF Downloads 421