Search results for: Radial Basis Functions (RBF) neural networks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9465

2835 Affordable Aerodynamic Balance for Instrumentation in a Wind Tunnel Using Arduino

Authors: Pedro Ferreira, Alexandre Frugoli, Pedro Frugoli, Lucio Leonardo, Thais Cavalheri

Abstract:

The teaching of fluid mechanics in engineering courses is, in general, a source of great difficulty for students. The use of experiments with didactic wind tunnels can facilitate the education of future professionals. The objective of this proposal is the development of a low-cost aerodynamic balance to be used in a didactic wind tunnel. The set comprises an Arduino microcontroller, programmed with open-source software, linked to load cells built by students from another project. The didactic wind tunnel is 5.0 m long and the test area is 90.0 cm x 90.0 cm x 150.0 cm. The WEG® electric motor, model W-22 of 9.2 HP, moves a fan with nine blades, each blade 32.0 cm long. The WEG® frequency inverter, model CFW 08 (vector inverter), is responsible for wind speed control and also for reversing the motor's direction of rotation. A flat-convex airfoil prototype was tested by measuring the drag and lift forces at certain angles of attack; the airflow conditions remained constant, monitored by a Pitot tube connected to an EXTECH® Instruments digital differential pressure manometer, model HD755. The results indicate good agreement with theory. The choice of components for this proposal resulted in a low-cost product that provides a high level of specific knowledge of fluid mechanics, which may be a good alternative for teaching in countries with scarce educational resources. The system also allows expansion to measure other parameters, such as fluid velocity, temperature, and pressure, as well as the possibility of automating other functions.

Keywords: aerodynamic balance, wind tunnel, strain gauge, load cell, Arduino, low-cost education

Procedia PDF Downloads 445
2834 Load-Enabled Deployment and Sensing Range Optimization for Lifetime Enhancement of WSNs

Authors: Krishan P. Sharma, T. P. Sharma

Abstract:

Wireless sensor nodes are resource-constrained, battery-powered devices usually deployed in hostile and inhospitable areas to cooperatively monitor physical or environmental conditions. Due to their limited power supply, the major challenge for researchers is to utilize the available battery power to enhance the lifetime of the whole network. Communication and sensing are the two major sources of energy consumption in sensor networks. In this paper, we propose a deployment strategy for enhancing the average lifetime of a sensor network by effectively utilizing communication and sensing energy to provide full coverage. The proposed scheme is based on the fact that, due to heavy relaying load, sensor nodes near the sink drain energy at a much faster rate than other nodes in the network and consequently die much earlier. To counter this imbalance, the proposed scheme finds optimal communication and sensing ranges according to the effective load at each node and uses a non-uniform deployment strategy with a comparatively high density of nodes near the sink. The probable relaying load factor at a particular node is calculated, and the optimal communication distance and sensing range for each sensor node are adjusted accordingly. Thus, sensor nodes are placed at locations that optimize energy use during network operation. A formal mathematical analysis for calculating the optimized locations is reported in the present work.
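The load imbalance described in this abstract can be illustrated with a simple concentric-ring model: nodes one hop from the sink must relay the traffic of every outer ring in addition to their own, so the relay load factor, and hence the deployment density needed to balance it, grows toward the sink. The ring-area load model and function names below are illustrative assumptions, not the authors' formulation:

```python
def relay_load_factor(ring: int, num_rings: int) -> float:
    """Illustrative relaying load at a ring of unit width around the sink:
    traffic generated by the ring itself plus all outer rings, divided by
    the ring's own traffic (proportional to its area, pi*(2k - 1))."""
    own = 2 * ring - 1
    outer = sum(2 * k - 1 for k in range(ring + 1, num_rings + 1))
    return (own + outer) / own

def adjusted_density(ring: int, num_rings: int, base_density: float = 1.0) -> float:
    # Non-uniform deployment: scale node density with the relay load,
    # so rings near the sink receive proportionally more nodes.
    return base_density * relay_load_factor(ring, num_rings)
```

In this toy model the innermost ring of a three-ring network carries nine times its own traffic, while the outermost carries only its own, which is why a uniform deployment exhausts nodes near the sink first.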

Keywords: load factor, network lifetime, non-uniform deployment, sensing range

Procedia PDF Downloads 383
2833 Hydrological Evaluation of Satellite Precipitation Products Using IHACRES Rainfall-Runoff Model over a Basin in Iran

Authors: Mahmoud Zakeri Niri, Saber Moazami, Arman Abdollahipour, Hossein Ghalkhani

Abstract:

The objective of this research is the hydrological evaluation of four widely used satellite precipitation products, namely PERSIANN, TMPA-3B42V7, TMPA-3B42RT, and CMORPH, over the Zarinehrood basin in Iran. To this end, the daily streamflow of the Sarough-cahy river of the Zarinehrood basin was first simulated using the IHACRES rainfall-runoff model with daily rain gauge and temperature data as input from 1988 to 2008. The model was then calibrated over two different periods by comparing the simulated discharge with the observed discharge at hydrometric stations. Moreover, in order to evaluate the performance of the satellite precipitation products in streamflow simulation, the calibrated model was validated using daily satellite rainfall estimates for the period 2003 to 2008. The results indicated that TMPA-3B42V7, with a CC of 0.69, an RMSE of 5.93 mm/day, an MAE of 4.76 mm/day, and an RBias of -5.39%, simulates streamflow better than PERSIANN and CMORPH over the study area. It is noteworthy that in Iran the availability of ground measuring station data is very limited because of the sparse density of hydro-meteorological networks. Moreover, the large spatial and temporal variability of precipitation and the lack of a reliable and extensive observing system are the most important challenges for rainfall analysis, flood prediction, and other hydrological applications in this country.
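The four evaluation statistics quoted for TMPA-3B42V7 (CC, RMSE, MAE, RBias) have standard definitions; the abstract does not list the formulas, so the sketch below assumes the conventional ones (Pearson correlation, root-mean-square error, mean absolute error, and relative bias in percent):

```python
import math

def evaluation_stats(sim, obs):
    """Return (CC, RMSE, MAE, RBias%) between simulated and observed series,
    using the conventional definitions of each statistic."""
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    var_s = sum((s - ms) ** 2 for s in sim)
    var_o = sum((o - mo) ** 2 for o in obs)
    cc = cov / math.sqrt(var_s * var_o)          # Pearson correlation
    rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / n)
    mae = sum(abs(s - o) for s, o in zip(sim, obs)) / n
    rbias = 100.0 * sum(s - o for s, o in zip(sim, obs)) / sum(obs)
    return cc, rmse, mae, rbias
```

A negative RBias, as reported for TMPA-3B42V7 (-5.39%), indicates a slight overall underestimation of the observed flows.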

Keywords: hydrological evaluation, IHACRES, satellite precipitation product, streamflow simulation

Procedia PDF Downloads 241
2832 Physical Characterization of a Watershed for Correlation with Parameters of the Thomas Hydrological Model and Its Application in the Iber Hydrodynamic Model

Authors: Carlos Caro, Ernest Blade, Nestor Rojas

Abstract:

This study determined the relationship between basic geotechnical parameters and the parameters of the Thomas hydrological model for the water balance of rural watersheds, as a methodological calibration application applicable to distributed models such as the Iber model, a distributed simulation model for unsteady free-surface flow. An exploration was carried out at 25 points (over 15 sub-basins) of the Rio Piedras (Boy.) basin, obtaining soil samples that were geotechnically characterized through laboratory tests. The Thomas model physically characterizes the input area with only four parameters (a, b, c, d). Establishing a measurable relationship between the geotechnical parameters and these four hydrological parameters helps to determine subsurface, underground, and surface flow in a more agile manner. The intention is thus to reach some conclusions regarding the limits of the initial model parameters on the basis of the geotechnical characterization. In hydrogeological models of rural watersheds, calibration is an important step in the characterization of the study area. This step can require significant computational cost and time, especially if the initial parameter values before calibration are outside of the geotechnical reality. A better choice of these initial values means optimizing the process through the geotechnical characterization of the materials in the area, which yields an important approximation for the study in the form of a starting range of variation for the calibration parameters.

Keywords: distributed hydrology, hydrological and geotechnical characterization, Iber model

Procedia PDF Downloads 522
2831 The Effect of Career Decision Self Efficacy on Coping with Career Indecision among Young Adults

Authors: Yuliya Lipshits-Braziler

Abstract:

For many young adults, career decision making is a difficult and complex process that may lead to indecision. Indecision is frequently associated with great psychological distress and low levels of well-being. One important resource for dealing with indecision is career decision self-efficacy (CDSE), which refers to people’s beliefs about their ability to successfully accomplish certain tasks involved in career choice. Drawing on Social Cognitive Theory, it has been hypothesized that CDSE correlates with (a) people’s likelihood of engaging in or avoiding career decision making tasks, (b) the amount of effort put into the decision making process, (c) people’s persistence in decision making efforts when faced with difficulties, and (d) the eventual success in arriving at career decisions. Based on these assumptions, the present study examines the associations between CDSE and 14 strategies for coping with career indecision among young adults. Using structural equation modeling (SEM), the results showed that CDSE is positively associated with the use of productive coping strategies, such as information-seeking, problem-solving, positive thinking, and self-regulation. In addition, CDSE was negatively associated with nonproductive coping strategies, such as avoidance, isolation, ruminative thinking, and blaming others. Contrary to our expectations, CDSE was not significantly correlated with instrumental help-seeking, while it was negatively correlated with emotional help-seeking. The results of this study can be used to facilitate the development of interventions aimed at reinforcing young adults’ career decision making self-efficacy, which may provide them with a basis for overcoming career indecision more effectively.

Keywords: career decision self-efficacy, career indecision, coping strategies, career counseling

Procedia PDF Downloads 256
2830 Classification on Statistical Distributions of a Complex N-Body System

Authors: David C. Ni

Abstract:

Contemporary models for N-body systems are based on temporal, two-body, mass-point representations of Newtonian mechanics. Other mainstream models include 2D and 3D Ising models based on local neighborhoods of lattice structures. In quantum mechanics, theories of collective modes address superconductivity and long-range quantum entanglement. However, these models are still mainly for specific phenomena with a set of designated parameters. We are therefore motivated to develop a new construction directly from complex-variable N-body systems based on the extended Blaschke functions (EBF), which represent a non-temporal and nonlinear extension of the Lorentz transformation on the complex plane – the normalized momentum space. A point on the complex plane represents a normalized state of particle momenta observed from a reference frame in the theory of special relativity. There are only two key parameters for modelling: normalized momentum and nonlinearity. An algorithm similar to the Jenkins-Traub method is adopted for solving the EBF iteratively. Through iteration, the solution sets take the form σ + i[-t, t], where σ and t are real numbers, and the interval [-t, t] shows various distributions, such as one-peak, two-peak, and three-peak distributions, some of which are analogous to the canonical distributions. The results of the numerical analysis demonstrate continuum-to-discreteness transitions, evolutional invariance of distributions, phase transitions with conjugate symmetry, etc., which establish the construction as a potential candidate for the unification of statistics. We hereby classify the observed distributions on the finite convergent domains. Continuous and discrete distributions both exist and are predictable for given partitions in different regions of the parameter pair. We further compare these distributions with canonical distributions and address the impacts on existing applications.

Keywords: Blaschke functions, Lorentz transformation, complex variables, continuous, discrete, canonical, classification

Procedia PDF Downloads 309
2829 Assessing Climate-Induced Species Range Shifts and Their Impacts on the Protected Seascape on Canada’s East Coast Using Species Distribution Models and Future Projections

Authors: Amy L. Irvine, Gabriel Reygondeau, Derek P. Tittensor

Abstract:

Marine protected areas (MPAs) within Canada’s exclusive economic zone help ensure the conservation and sustainability of marine ecosystems and the continued provision of ecosystem services to society (e.g., food, carbon sequestration). With ongoing and accelerating climate change, however, MPAs may become less effective at fulfilling these outcomes. Many populations of species, especially those at their thermal range limits, may shift to cooler waters or become extirpated due to climate change, resulting in new species compositions and ecological interactions within static MPA boundaries. While Canadian MPA management follows international guidelines for marine conservation, no consistent approach exists for adapting MPA networks to climate change and the resulting altered ecosystem conditions. To fill this gap, projected climate-driven shifts in species distributions on Canada’s east coast were analyzed to identify when native species emigrate from, and novel species immigrate into, the network, and how high-mitigation and high-emission scenarios influence these timelines. Indicators of the ecological changes caused by these species shifts in the biological community were also developed. Overall, our research provides projections of climate change impacts and helps to guide adaptive management responses within the Canadian east coast MPA network.

Keywords: climate change, ecosystem modeling, marine protected areas, management

Procedia PDF Downloads 101
2828 A BERT-Based Model for Financial Social Media Sentiment Analysis

Authors: Josiel Delgadillo, Johnson Kinyua, Charles Mutigwe

Abstract:

The purpose of sentiment analysis is to determine the sentiment strength (e.g., positive, negative, neutral) of a textual source to support good decision-making. Natural language processing in domains such as financial markets requires knowledge of the domain ontology, and pre-trained language models, such as BERT, have made significant breakthroughs in various NLP tasks by training on large-scale unlabeled generic corpora such as Wikipedia. However, sentiment analysis is a strongly domain-dependent task. The rapid growth of social media has given users a platform to share their experiences and views about products, services, and processes, including financial markets. StockTwits and Twitter are social networks that allow the public to express their sentiments in real time. Hence, leveraging the success of unsupervised pre-training and the large amount of financial text available on social media platforms could potentially benefit a wide range of financial applications. This work focuses on sentiment analysis of social media text from platforms such as StockTwits and Twitter. To meet this need, SkyBERT, a domain-specific language model pre-trained and fine-tuned on financial corpora, has been developed. The results show that SkyBERT outperforms current state-of-the-art models in financial sentiment analysis. Extensive experimental results demonstrate the effectiveness and robustness of SkyBERT.
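The abstract does not describe SkyBERT's classification head. As a generic illustration only, a fine-tuned BERT-style classifier emits one logit per sentiment class, and the predicted label is the class with the highest softmax probability; the three-class label set and function names below are assumptions, not details from the paper:

```python
import math

# Assumed three-way sentiment label set (positive / negative / neutral,
# as mentioned in the abstract); the ordering here is arbitrary.
LABELS = ("negative", "neutral", "positive")

def softmax(logits):
    """Convert raw classifier logits into probabilities."""
    m = max(logits)                              # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Map per-class logits to (label, confidence)."""
    probs = softmax(logits)
    i = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[i], probs[i]
```

For example, a logit vector strongly favoring the third class yields the "positive" label with high confidence; the actual model-specific step, producing the logits from text, is what SkyBERT's pre-training and fine-tuning provide.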

Keywords: BERT, financial markets, Twitter, sentiment analysis

Procedia PDF Downloads 152
2827 Hybrid Sol-Gel Coatings for Corrosion Protection of AA6111-T4 Aluminium Alloy

Authors: Shadatul Hanom Rashid, Xiaorong Zhou

Abstract:

Hybrid sol-gel coatings, which blend the advantages of inorganic and organic networks, have been reported as an environmentally friendly anti-corrosion surface pre-treatment for several metals, including aluminum alloys. In the current study, Si-Zr hybrid sol-gel coatings were synthesized from (3-glycidoxypropyl)trimethoxysilane (GPTMS), tetraethyl orthosilicate (TEOS), and zirconium(IV) propoxide (TPOZ) precursors and applied to AA6111 aluminum alloy by the dip coating technique. Hybrid sol-gel coatings doped with different concentrations of cerium nitrate (Ce(NO3)3) as a corrosion inhibitor were also prepared, and the effect of the Ce(NO3)3 concentration on the morphology and corrosion resistance of the coatings was examined. The surface chemistry and morphology of the hybrid sol-gel coatings were analyzed by Fourier transform infrared (FTIR) spectroscopy and scanning electron microscopy (SEM). The corrosion behavior of the coated aluminum alloy samples was evaluated by electrochemical impedance spectroscopy (EIS). Results revealed that hybrid sol-gel coatings with good corrosion resistance were prepared from hydrolysis and condensation reactions of the GPTMS, TEOS, and TPOZ precursors deposited on AA6111 aluminum alloy. When the coating was doped with cerium nitrate, its properties improved significantly, and the hybrid sol-gel coatings containing the lower concentration of cerium nitrate offered the best inhibition performance. A proper doping concentration of Ce(NO3)3 can effectively improve the corrosion resistance of the alloy, while an excessive concentration would reduce the corrosion protection properties, which is associated with a defective morphology and instability of the sol-gel coatings.

Keywords: AA6111, Ce(NO3)3, corrosion, hybrid sol-gel coatings

Procedia PDF Downloads 158
2826 Comparative Study on Structural Behaviour of Circular Hollow Steel Tubular, Concrete Filled Steel Tubular, and Reinforced Cement Concrete Stub Columns under Pure Axial Compression

Authors: Niladri Roy, M. Longshithung Patton

Abstract:

This paper is aimed at studying the structural response of circular hollow steel tubular (HST), concrete filled steel tubular (CFST), and reinforced cement concrete (RCC) stub columns when subjected to only axial compressive forces, and at examining their comparative behaviour using finite element (FE) models. These results are further compared with the respective experimental results. The FE software package ABAQUS 6.14 was used for further parametric studies, in which a total of 108 FE models were developed. The diameters of the HST, CFST, and RCC stub columns are kept as 100, 140, 180, and 220, with the length to diameter ratio fixed at 3 to avoid end effects and flexural failure. To keep the same percentage of steel (by volume), the thicknesses of the steel tubes in the HST and CFST columns were varied in response to the change in diameter of the main reinforcement bar in the RCC columns. M25 grade concrete was used throughout. The objective is to compare the structural behaviour of HST, CFST, and RCC stub columns on the basis of their axial compressive load carrying capacity and failure modes. The studies show that filling the circular HST columns with concrete increases the Pu of the CFST columns by 2.97 times. It was also observed that Pu (HST) is about 0.72 times Pu (RCC) on average, and Pu (CFST) is about 2.08 times Pu (RCC) on average. The analysis and comparison show that CFST columns have a much higher load carrying capacity than HST and RCC columns and provide the same strength at a much smaller sectional size.

Keywords: HST columns, stub columns, CFST columns, RCC columns, finite element modeling, ABAQUS

Procedia PDF Downloads 100
2825 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions and statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs for each of these three aspects, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature. A traffic-light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems that use multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission’s proposed Regulation on Artificial Intelligence.
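The PoCE KPI is named but not defined in the abstract. One plausible reading, used here purely for illustration and not taken from the paper, is the share of features for which the explainer's attributed contribution agrees in sign with the intrinsic model's contribution:

```python
def poce(true_contrib, explained_contrib):
    """Hypothetical Percentage of Correct Explanations: the share of
    features whose explained contribution matches the sign (positive,
    negative, or zero) of the intrinsic model's contribution."""
    assert len(true_contrib) == len(explained_contrib)
    correct = sum(
        1 for t, e in zip(true_contrib, explained_contrib)
        if (t > 0) == (e > 0) and (t < 0) == (e < 0)
    )
    return 100.0 * correct / len(true_contrib)
```

Under this reading, an explainer that flips the direction of a feature's effect (the failure mode the abstract reports for SHAP and LIME) directly lowers the PoCE score for that model.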

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 95
2824 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in Educational Data Mining techniques to find new hidden information in students' learning behavior, particularly to uncover the early symptoms of at-risk pupils. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data was gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
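The Majority Vote combiner the abstract singles out is the simplest of the listed ensemble techniques: each base learner predicts a class, and the most frequent prediction wins. A minimal sketch (function names are ours, not the authors'):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class predictions from several base learners by taking
    the most frequent label."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_predict(models, x):
    """Apply each base model to the input, then vote on the results."""
    return majority_vote([m(x) for m in models])
```

For example, if two of three base classifiers flag a student as at-risk, the ensemble flags the student; in practice an odd number of base learners is preferred so that ties cannot occur.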

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 108
2823 Coastal Adaptation to Climate Change: A Review of EU Tools, Legislation, National Strategies and Projects in the Mediterranean Basin

Authors: Dimitris Kokkinos, Panagiotis Prinos

Abstract:

In the last three decades, climate change has been studied extensively by the scientific community, and its consequences are more than clear all around the world. Most countries have made a great effort to reduce global warming rates with the ratification and implementation of several international treaties. Moreover, many of them have already adopted national plans in order to adapt to climate change effects and mitigate human and economic losses. Coastal environments, with their inherent physical sensitivity, will face important challenges as a result of projected changes in climate conditions, and hundreds of millions of people will be affected. Coastal zones are of high social and economic value, and this research focuses on the Mediterranean basin, which is a densely populated and highly urbanized area. With 40% of its land used for human activity and the impacts of climate change inevitable, it is obvious that some form of adaptation measures will be necessary. In this regard, the EU tools, policies, and legislation concerning adaptation to climate change are presented. Additionally, the National Adaptation Strategies of member states of the Mediterranean basin are compared and analyzed with respect to coastal areas, along with an overview of the results of projects and programs focused on coastal issues at different spatial scales. The purpose of this research is to stress the differences between Mediterranean member states in the methodologies implemented, to highlight possible gaps in coordination, and to emphasize research initiatives that the EU can build upon in moving towards integrated adaptation planning on a region-wide basis.

Keywords: coastal adaptation, Mediterranean Basin, climate change, coastal environments

Procedia PDF Downloads 308
2822 Patient Service Improvement in Public Emergency Department Using Discrete Event Simulation

Authors: Dana Mohammed, Fatemah Abdullah, Hawraa Ali, Najat Al-Shaer, Rawan Al-Awadhi, Magdy Helal

Abstract:

We study patient service performance at the emergency department of a major Kuwaiti public hospital, using discrete event simulation and lean concepts. In addition to the problems common in such health care systems (overcrowding, facilities planning and usage, scheduling and staffing, capacity planning), the emergency department suffered from several cultural and patient behavioural issues. These contributed significantly to the system problems and constituted major obstacles to keeping performance under control, leading to overly long waiting times and the risk of delays in providing help to critical cases. We utilized visual management tools to mitigate the impact of patients’ behaviours and attitudes and to improve the logistics inside the system. In addition, a proposal is made to automate data collection and communication within the department using an RFID-based barcoding system. Discrete event simulation models were developed as decision support systems to study the operational problems and assess the achieved improvements. The simulation analysis resulted in cutting patient delays to about 35% of their current values by reallocating and rescheduling the medical staff. Combined with the application of the visual management concepts, this provided the basis for improving patient service without any major investment.
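The staffing effect that the simulation captures can be illustrated with a toy single-queue model: with the same arrival stream, adding or reallocating a staff member shortens the average wait. This deterministic sketch is our illustration, not the authors' simulation model:

```python
import heapq

def simulate_ed(arrivals, service_time, num_staff):
    """Toy emergency-department queue: patients arrive at the given times
    (sorted), each is served by the first free staff member for a fixed
    service_time; returns the average waiting time before service starts."""
    free_at = [0.0] * num_staff      # time at which each staff member frees up
    heapq.heapify(free_at)
    waits = []
    for t in arrivals:
        soonest = heapq.heappop(free_at)   # earliest-available staff member
        start = max(t, soonest)            # wait if nobody is free yet
        waits.append(start - t)
        heapq.heappush(free_at, start + service_time)
    return sum(waits) / len(waits)
```

With three simultaneous arrivals and a 10-minute service time, one staff member yields an average wait of 10 minutes, while two staff members cut it to about 3.3 minutes, which is the kind of reallocation effect the study's simulation quantifies at full scale.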

Keywords: simulation, visual management, health care system, patient

Procedia PDF Downloads 475
2821 Human Resource Information System: Role in HRM Practices and Organizational Performance

Authors: Ejaz Ali, M. Phil.

Abstract:

Enterprise Resource Planning (ERP) systems play a vital role in the effective management of business functions in large and complex organizations. The Human Resource Information System (HRIS) is a core module of ERP, providing concrete solutions to implement Human Resource Management (HRM) practices in an innovative and efficient manner. Over the last decade, there has been a considerable increase in studies on HRIS. Nevertheless, previous studies have largely failed to examine the moderating role of HRIS in performing HRM practices that may affect firm performance. The current study was carried out to examine the impact of HRM practices (training, performance appraisal) on perceived organizational performance, with the moderating role of HRIS where the system is in place. The study is based on the Resource Based View (RBV) and Ability Motivation Opportunity (AMO) theories, which advocate that strengthening human capital enables an organization to achieve and sustain competitive advantage, leading to improved organizational performance. Data were collected through a structured questionnaire based upon adopted instruments after establishing reliability and validity. Structural equation modeling (SEM) was used to assess model fitness, test the hypotheses, and establish the validity of the instruments through Confirmatory Factor Analysis (CFA). A total of 220 employees of 25 firms in the corporate sector were sampled through a non-probability sampling technique. Path analysis revealed that HRM practices and HRIS have a significant positive impact on organizational performance. The results further showed that HRIS moderated the relationships between training, performance appraisal, and organizational performance. The interpretation of the findings, limitations, and theoretical and managerial implications are discussed.

Keywords: enterprise resource planning, human resource, information system, human capital

Procedia PDF Downloads 396
2820 Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method

Authors: Amira Mabrouk, Chokri Abdennadher

Abstract:

The reduction of road congestion, which is inherent to the use of vehicles, is an obvious priority for public authorities. Therefore, assessing an individual's willingness to pay in order to save trip time is akin to estimating the change in price that would result from setting up a new transport policy to increase network fluidity and improve the level of social welfare. This study takes an innovative perspective: it initiates an economic calculation with the objective of estimating the monetized value of time for trips made in Sfax. The research is founded on a three-fold approach. The aims of this study are to i) estimate the monetized value of time, namely an hour dedicated to trips, ii) determine whether or not consumers consider the environmental variables to be significant, and iii) analyze the impact of public management of congestion via the taxation of city tolls on urban dwellers. This article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the "declared time preferences" of 450 drivers during rush hours. Giving due consideration to the biases attributed to the applied method, we highlight the delicacy of this approach with regard to the revelation mode and the interrogation techniques, following the NOAA panel recommendations (with the exception of the valuation point) and other similar studies on the estimation of transportation externalities.

Keywords: willingness to pay, contingent valuation, time value, city toll

Procedia PDF Downloads 434
2819 Analysis and Experimental Research on the Influence of Lubricating Oil on the Transmission Efficiency of New Energy Vehicle Gearbox

Authors: Chen Yong, Bi Wangyang, Zang Libin, Li Jinkai, Cheng Xiaowei, Liu Jinmin, Yu Miao

Abstract:

New energy vehicle power transmission systems continue to develop in the direction of high torque, high speed, and high efficiency. The cooling and lubrication of the motor and the transmission system are integrated, placing new requirements on the lubricants for the transmission system. The effects of traditional lubricants and special lubricants for new energy vehicles on transmission efficiency were studied through experiments and simulation methods. A mathematical model of the transmission efficiency of the lubricating oil in the gearbox was established, and the power loss of each part was analyzed according to the working conditions. The relationship between speed, the characteristics of different lubricating oil products, and oil churning power loss was discussed, and the minimum oil film thickness required for the life of the gearbox was verified. The accuracy of the calculation results was confirmed by the transmission efficiency test conducted on a two-motor integrated test bench. The results show that efficiency first increases and then decreases with increasing speed, and decreases with increasing kinematic viscosity of the lubricant. An increase in kinematic viscosity amplifies the transmission power loss caused by high speed. Special lubricants for new energy vehicles show less attenuation of transmission efficiency in the range above mid-speed. The research results provide a theoretical basis and guidance for the evaluation and selection of gearbox lubricants for new energy vehicles.

Keywords: new energy vehicles, lubricants, transmission efficiency, kinematic viscosity, test and simulation

Procedia PDF Downloads 131
2818 Comparing the Sequence and Effectiveness of Teaching the Four Basic Operations and Mathematics in Primary Schools

Authors: Abubakar Sadiq Mensah, Hassan Usman

Abstract:

The study compared the effectiveness of the Addition, Multiplication, Subtraction and Division (AMSD) and Addition, Subtraction, Multiplication and Division (ASMD) sequences of teaching the four basic operations in mathematics to primary one pupils in Katsina Local Government, Katsina State. The study determined which sequence was more effective and which was most commonly adopted by teachers of the operations. One hundred (100) teachers and sixty (60) primary one pupils were used for the study. The pupils were divided into two equal groups, and the researcher taught the operations to each group separately for four (4) weeks: group one was taught using the AMSD sequence, while group two was taught using the ASMD sequence. To generate the needed data, questionnaires and tests were administered to the samples. Analysis of the data yielded the following major findings: (i) two primary mathematics textbooks were used in all the primary schools in the area; (ii) each of the textbooks followed the ASMD sequence; (iii) 73% of the teachers sampled adopted the ASMD sequence in teaching the operations; and (iv) group one (taught using the AMSD sequence) performed significantly better than their counterparts in group two (taught using the ASMD sequence). On this basis, the researcher concluded that the AMSD sequence was more effective for teaching the operations than the ASMD sequence, and recommended that primary school teachers, authors of primary mathematics textbooks, and curriculum planners adopt the AMSD sequence.

Keywords: mathematics, primary school, four basic operations, effectiveness of teaching

Procedia PDF Downloads 253
2817 3D Multiuser Virtual Environments in Language Teaching

Authors: Hana Maresova, Daniel Ecler

Abstract:

The paper focuses on the use of 3D multi-user virtual environments (MUVE) in language teaching and presents the results of four years of research at the Faculty of Education, Palacký University Olomouc (Czech Republic). In the form of an experiment, mother-tongue language teaching was implemented in the 3D virtual worlds Second Life and Kitely (experimental group), in parallel with traditional teaching of identical topics based on the teacher's interpretation using a textbook (control group). A didactic test, presented to the experimental and control groups in identical form before and after instruction, verified the effect of the instruction by comparing the results obtained by both groups. Within the three components of mother-tongue teaching (vocabulary, literature, and style and communication education), the students in the experimental group achieved partially better results in literature (statistically significant for items devoted to visualization of the learning topic), while in grammar and style education the respondents of the control group achieved better results. On the basis of these results, we conclude that MUVE are most appropriately used for teaching topics that allow dramatization, experiential learning, and group involvement and cooperation. Conversely, given the need to divide students' attention between the topic taught and the control of the avatar and movement in virtual reality, they are less suitable for teaching aimed at memorization of topics or concepts.

Keywords: distance learning, 3D virtual environments, online teaching, language teaching

Procedia PDF Downloads 163
2816 A Short Survey of Integrating Urban Agriculture and Environmental Planning

Authors: Rayeheh Khatami, Toktam Hanaei, Mohammad Reza Mansouri Daneshvar

Abstract:

The growth of the agricultural sector is known as an essential way to achieve development goals in developing countries. Urban agriculture is a way to reduce the vulnerability of the world's urban populations toward global environmental change. It is a sustainable and efficient system for responding to the environmental, social, and economic needs of the city, leading to urban sustainability. Today, many local and national governments are developing urban agriculture as an effective tool for responding to challenges such as poverty, food security, and environmental problems. In this study, we follow a perspective based on the urban agriculture literature in order to indicate the benefits of urban agriculture for environmental planning strategies in non-Western countries like Iran. The methodological approach is qualitative and based on documentary studies. A total of 35 articles (mixed quantitative and qualitative studies) published in relevant journals focusing on this subject were included in the final analysis. The studies show a wide range of positive effects of urban agriculture on food security, nutrition outcomes, health outcomes, environmental outcomes, and social capital; however, no definitive conclusion could be drawn about its negative effects. This paper provides a conceptual and theoretical basis for understanding urban agriculture and its roles in environmental planning, and summarizes its benefits for researchers, practitioners, and policymakers who seek to create spaces in cities for implementing urban agriculture in the future.

Keywords: urban agriculture, environmental planning, urban planning, literature

Procedia PDF Downloads 144
2815 Measurement of Project Success in Construction Using Performance Indices

Authors: Annette Joseph

Abstract:

Background: The construction industry is dynamic in nature owing to increasing uncertainties in technology, budgets, and development processes, making projects more complex and project performance difficult to predict. The goal of all parties involved in construction projects is to complete them successfully on schedule, within the planned budget, with the highest quality, and in the safest manner; however, the concept of project success remains ambiguously defined in the minds of construction professionals. Purpose: This paper aims to analyze projects in terms of their performance and to measure their success. Methodology: The parameters for evaluating project success and the indices for measuring the success/performance of a project are identified through a literature study. Through questionnaire surveys aimed at project stakeholders, data on overall performance in terms of success/failure are collected from two live case studies (an ongoing and a completed project). Finally, with the help of the SPSS tool, the survey data are analyzed and applied to the selected performance indices. Findings: The scores calculated using the indices and models help assess the overall performance of a project and interpret whether it will be a success or a failure. This study acts as a reference for firms to carry out performance evaluation and success measurement on a regular basis, helping projects identify the areas that are performing well and those that require improvement. Originality & Value: The study shows that by measuring project performance, a project's deviation towards success or failure can be assessed, allowing early remedial measures to bring it back on track and ensure successful completion.
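The paper identifies its performance indices from the literature. As one standard example of such indices (not necessarily the set the study uses), earned-value management defines a cost performance index CPI = EV/AC and a schedule performance index SPI = EV/PV; the figures below are hypothetical.

```python
# Standard earned-value performance indices, shown as one concrete example
# of the kind of indices the study applies. Input figures are hypothetical.
def performance_indices(earned_value, actual_cost, planned_value):
    cpi = earned_value / actual_cost     # cost efficiency: >1 means under budget
    spi = earned_value / planned_value   # schedule efficiency: >1 means ahead of schedule
    return cpi, spi

cpi, spi = performance_indices(earned_value=480.0, actual_cost=500.0,
                               planned_value=450.0)
print(round(cpi, 3), round(spi, 3))  # 0.96 (over budget), 1.067 (ahead of schedule)
```

Interpreting the scores as the abstract describes: a CPI below 1 flags cost overrun as an area needing improvement, while an SPI above 1 indicates schedule performance is on track.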

Keywords: project, performance, indices, success

Procedia PDF Downloads 191
2814 Divergences in Interpreters’ Oral Interpretation among Pentecostal Churches: Sermonic Reflections

Authors: Rufus Olufemi Adebayo, Sylvia Phiwani Zulu

Abstract:

Interpreting in a setting of diverse languages and multicultural congregants is often understood as integrating the content of the message. Preaching, like any communication, takes people's multiple contexts seriously. Traditionally speaking, the one who provides the best insight into understanding “the other” in a multilingual context could be the interpreter. Nonetheless, there are reflections on the loss of spiritual communication in translation and interpretive dialogue. No matter how eloquent the preacher is, an interpreter can make or mar the sermon (speech): the sermon that the preacher preaches is not always the one the congregation hears from the interpreter. In other instances, interpreting can not only distort messages but also leave audiences dissatisfied and the preacher overshadowed by the antics of the interpreter. Using qualitative methodology, this paper explores the challenges and conventional assumptions about preachers' interpreters as influenced by spirituality, culture, and language, from empirical and theoretical perspectives. Emphasis is placed on biased translation and on the grounds on which spiritual communication is suppressed or devalued. The results indicate that the interpretation of declarations of guilt, the history of the congregation, spirituality, attitudes, morals, customs, the specific practices of a preacher, education, and the environment can become entangled and lead to misinterpretation. The article concludes by re-examining these qualities and rearticulating them into a preliminary theory for practice, as distinguished from theory, which could enhance the development of more sustainable multilingual interpretation in South African Pentecostal churches.

Keywords: congregants, divergences, interpreting/translation, language & communication, sermon/preaching

Procedia PDF Downloads 166
2813 Financial Liberalization and Allocation of Bank Credit in Malaysia

Authors: Chow Fah Yee, Eu Chye Tan

Abstract:

The main purpose of developing a modern and sophisticated financial system is to mobilize and allocate the country's resources to productive uses and, in the process, contribute to economic growth. Financial liberalization, introduced in Malaysia in 1978, was said to be a step towards this goal. According to McKinnon and Shaw, the deregulation of a country's financial system will create a more efficient and competitive market-driven financial sector, with savings being channelled to the most productive users. This paper assesses whether financial liberalization resulted in bank credit being allocated to the more productive users in Malaysia by: firstly, using a chi-square test to determine whether there exists a relationship between financial liberalization and bank lending; secondly, analyzing, on a comparative basis, the share of loans received by nine major economic sectors, using data on bank loans from 1975 to 2003; and lastly, using present value analysis and rank correlation to determine whether the recipients of bigger loans were the more efficient users. The chi-square test confirmed the generally observed trend of an increase in bank credit with the adoption of financial liberalization, while the comparative analysis of loans showed that the bulk of credit was allocated to service sectors, consumer loans, and property-related sectors, at the expense of industry. The rank correlation analysis showed no relationship between the more productive users and the amount of loans obtained, implying that the sectors that received more loans were not the more efficient ones.
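The rank-correlation step described above can be sketched with Spearman's rho: rank the nine sectors by loan share and by an efficiency measure, then check whether the rankings agree. The sector figures below are hypothetical illustrations, not the paper's data, and the simple rank formula assumes no ties.

```python
# Spearman rank correlation between loan allocation and sector efficiency.
# The loan_share and productivity figures are hypothetical, not from the paper.
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r  # assumes no tied values

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

loan_share   = [28.0, 19.0, 15.0, 12.0, 9.0, 7.0, 5.0, 3.0, 2.0]  # nine sectors, % of credit
productivity = [1.1, 0.9, 1.4, 0.8, 1.6, 1.0, 1.3, 1.5, 1.2]      # hypothetical efficiency index
print(round(spearman_rho(loan_share, productivity), 3))  # -0.367
```

A rho near zero (or negative, as in this toy case) would support the paper's finding that the sectors receiving the most credit were not the most efficient ones.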

Keywords: allocation of resources, bank credit, financial liberalization, economics

Procedia PDF Downloads 446
2812 Stochastic Edge Based Anomaly Detection for Supervisory Control and Data Acquisitions Systems: Considering the Zambian Power Grid

Authors: Lukumba Phiri, Simon Tembo, Kumbuso Joshua Nyoni

Abstract:

In Zambia, recent initiatives by power operators like ZESCO and CEC, and by consumers like the mines, to upgrade power systems into smart grids target an even tighter integration with information technologies to enable the integration of renewable energy sources, local and bulk generation, and demand response. For the reliable operation of smart grids, the information infrastructure must therefore be secure and reliable in the face of both failures and cyberattacks. Due to the nature of these systems, ICS/SCADA cybersecurity and governance face additional challenges compared to corporate networks, and critical systems may be left exposed. Control frameworks such as the NIST framework exist internationally; however, they are generic and do not meet the domain-specific needs of SCADA systems. Zambia also lags in cybersecurity awareness and adoption, so there is concern about securing the ICS controlling infrastructure critical to the Zambian economy, as few facts are known about the true security posture. In this paper, we introduce a Stochastic Edge-based Anomaly Detection for SCADA systems (SEADS) framework for threat modeling and risk assessment. SEADS enables the calculation of steady-state probabilities, which are then applied to establish metrics such as system availability, maintainability, and reliability.
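The steady-state-probability step can be illustrated with a tiny Markov availability model: iterate a transition matrix to its stationary distribution and sum the probabilities of the operational states. The three states and the transition probabilities below are hypothetical, not taken from the SEADS paper.

```python
# Steady-state probabilities of a small Markov availability model.
# States and transition probabilities are hypothetical illustrations.
def stationary(P, iters=10_000):
    """Power-iterate a row-stochastic transition matrix to its steady state."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# States: 0 = healthy, 1 = degraded (fault/attack detected), 2 = failed
P = [[0.95, 0.04, 0.01],
     [0.60, 0.30, 0.10],
     [0.50, 0.00, 0.50]]

pi = stationary(P)
availability = pi[0] + pi[1]   # system remains usable unless fully failed
print([round(p, 4) for p in pi], round(availability, 4))
```

The same stationary vector also feeds reliability and maintainability metrics, e.g. the long-run fraction of time spent in the failed state versus the repair transitions out of it.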

Keywords: anomaly, availability, detection, edge, maintainability, reliability, stochastic

Procedia PDF Downloads 110
2811 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences about users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), whose computational cost grows more slowly with data size but which requires more data to reach an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
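The bandit loop FAB-COST operates in can be illustrated with the simplest Thompson-sampling setting. The sketch below is a context-free Beta-Bernoulli toy, not FAB-COST itself: FAB-COST handles the logistic *contextual* case, where exact posteriors are unavailable and EP/ADF Gaussian approximations are needed, whereas this conjugate example needs no approximation at all. The click-through rates and horizon are made up.

```python
import random

# Context-free Beta-Bernoulli Thompson sampling: a minimal illustration of
# the explore/exploit loop underlying contextual bandit algorithms.
def thompson_sampling(true_ctrs, horizon, seed=0):
    rng = random.Random(seed)
    k = len(true_ctrs)
    alpha, beta = [1] * k, [1] * k          # Beta(1, 1) priors per arm
    clicks = 0
    for _ in range(horizon):
        # Sample a plausible CTR for each arm from its posterior, play the best.
        arm = max(range(k), key=lambda a: rng.betavariate(alpha[a], beta[a]))
        reward = 1 if rng.random() < true_ctrs[arm] else 0
        clicks += reward
        alpha[arm] += reward                # exact conjugate posterior update
        beta[arm] += 1 - reward
    return clicks

print(thompson_sampling([0.02, 0.05, 0.10], horizon=5000))
```

The posterior sampling concentrates play on the best arm as evidence accumulates; FAB-COST's contribution is making the analogous posterior update cheap and accurate in the contextual logistic case by switching from EP to ADF as data grows.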

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 108
2810 The Need for a Consistent Regulatory Framework for CRISPR Gene-Editing in the European Union

Authors: Andrew Thayer, Courtney Rondeau, Paraskevi Papadopoulou

Abstract:

The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) gene-editing technologies have generated considerable discussion about the applications and ethics of their use. However, no consistent guidelines for using CRISPR technologies have been developed, nor has common legislation been passed in the European Union related to gene editing, especially as it is connected to genetically modified organisms (GMOs). The recent announcement that the first babies with CRISPR-edited genes were born, along with new studies exploring CRISPR's applications in treating thalassemia, sickle-cell anemia, cancer, and certain forms of blindness, has demonstrated that the technology is developing faster than the policies needed to control it. A reasonable and coherent regulatory framework for the use of CRISPR in human somatic and germline cells is therefore necessary to ensure the ethical use of the technology in future years. The European Union is a unique region of interconnected countries without a standard set of regulations or legislation for CRISPR gene editing. We posit that the EU would serve as a suitable model for comparing the legislation of its member countries in order to understand the practicality and effectiveness of adopting majority-approved practices. Additionally, we present a proposed set of guidelines that could serve as the basis for a consistent regulatory framework for EU countries to implement, and also as a good example for other countries to follow. Finally, an additional, multidimensional framework of smart solutions is proposed, one that engages all stakeholders to become better-informed citizens.

Keywords: CRISPR, ethics, regulatory framework, European legislation

Procedia PDF Downloads 135
2809 A Hybrid-Evolutionary Optimizer for Modeling the Process of Obtaining Bricks

Authors: Marius Gavrilescu, Sabina-Adriana Floria, Florin Leon, Silvia Curteanu, Costel Anton

Abstract:

Natural sciences provide a wide range of experimental data whose related problems require study and modeling beyond the capabilities of conventional methodologies. Such problems have solution spaces whose complexity and high dimensionality require correspondingly complex regression methods for proper characterization. In this context, we propose an optimization method consisting of a hybrid dual-optimizer setup: a global optimizer based on a modified variant of the popular Imperialist Competitive Algorithm (ICA), and a local optimizer based on gradient descent. The ICA is modified so that intermediate solution populations are more quickly and efficiently pruned of low-fitness individuals by appropriately altering the assimilation, revolution, and competition phases, which, combined with an initialization strategy based on low-discrepancy sampling, allows a more effective exploration of the solution space. Gradient-based optimization is then used locally to seek the optimal solution in the neighborhoods of the solutions found by the modified ICA. We use this combined approach to find the optimal configuration and weights of a fully-connected neural network, resulting in regression models that characterize the process of obtaining bricks from silicon-based materials. Installations in the raw ceramics industry, i.e., brick manufacturing, are characterized by significant energy consumption and large quantities of emissions. The purpose of our approach is therefore to determine by simulation the working conditions, including the manufacturing mix recipe with the addition of different materials, that minimize the emissions of CO and CH4. Our approach yields regression models that perform significantly better than those found using the traditional ICA for this problem, with better convergence and a substantially lower error.
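The global-then-local structure described above can be sketched in miniature: a crude population-based search (standing in for the modified ICA, with none of its assimilation/revolution/competition mechanics) picks a promising starting point, and numerical gradient descent refines it. The toy quadratic objective is an assumption for illustration, not the paper's brick-emissions model.

```python
import random

# Hedged sketch of a hybrid global + local optimizer on a toy objective.
def objective(x, y):
    return (x - 2.0) ** 2 + (y + 1.0) ** 2 + 0.5   # minimum 0.5 at (2, -1)

def global_search(n=200, span=10.0, seed=1):
    """Crude stand-in for the population-based global phase."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-span, span), rng.uniform(-span, span)) for _ in range(n)]
    return min(pop, key=lambda p: objective(*p))    # best candidate found

def local_refine(x, y, lr=0.1, steps=200, h=1e-6):
    """Local phase: gradient descent with central-difference gradients."""
    for _ in range(steps):
        gx = (objective(x + h, y) - objective(x - h, y)) / (2 * h)
        gy = (objective(x, y + h) - objective(x, y - h)) / (2 * h)
        x, y = x - lr * gx, y - lr * gy
    return x, y

x0, y0 = global_search()
x, y = local_refine(x0, y0)
print(round(x, 3), round(y, 3), round(objective(x, y), 3))  # converges near (2, -1, 0.5)
```

The division of labour mirrors the paper's design choice: the population phase only needs to land in the right basin, after which the cheap local phase does the fine convergence.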

Keywords: optimization, biologically inspired algorithm, regression models, bricks, emissions

Procedia PDF Downloads 82
2808 First-Principles Calculations of Hydrogen Adsorbed in Multi-Layer Graphene

Authors: Mohammad Shafiul Alam, Mineo Saito

Abstract:

Graphene-based materials have attracted much attention because they are candidates for post-silicon materials. Since the control of impurities is necessary to achieve nanodevices, we study hydrogen impurities in multi-layer graphene. We perform local spin density approximation (LSDA) calculations using a plane-wave basis set and pseudopotentials. Hydrogen monomers and dimers in single-layer graphene have previously been well studied theoretically; however, hydrogen on multi-layer graphene remains unclear. Using first-principles electronic structure calculations based on the LSDA within density functional theory, we studied hydrogen monomers and dimers in two-layer graphene. We found that the monomers are spin-polarized and carry a magnetic moment of 1 µB, and that the most stable dimer is much more stable than the monomer. In the most stable dimer structures in two-layer graphene, the two hydrogen atoms are bonded to host carbon atoms that are nearest neighbors, with the two hydrogen atoms located on opposite sides. In contrast, when the two hydrogen atoms are bonded to the same sublattice of the host material, a magnetic moment of 2 µB appears in two-layer graphene. We found that when the two hydrogen atoms are bonded to third-nearest-neighbor carbon atoms, the electronic structure is nonmagnetic. We also studied hydrogen monomers and dimers in three-layer graphene and obtained the same results as for two-layer graphene. These results are important for the field of carbon nanomaterials, as it is experimentally difficult to determine the magnetic state of such materials.

Keywords: first-principles calculations, LSDA, multi-layer graphene, nanomaterials

Procedia PDF Downloads 331
2807 Hybrid CNN-SAR and Lee Filtering for Enhanced InSAR Phase Unwrapping and Coherence Optimization

Authors: Hadj Sahraoui Omar, Kebir Lahcen Wahib, Bennia Ahmed

Abstract:

Interferometric Synthetic Aperture Radar (InSAR) coherence is a crucial parameter for accurately monitoring ground deformation and environmental changes. However, coherence can be degraded by factors such as temporal decorrelation, atmospheric disturbances, and geometric misalignments, limiting the reliability of InSAR measurements (Hadj-Sahraoui et al., 2019). To address this challenge, we propose an innovative hybrid approach that combines artificial intelligence (AI) with advanced filtering techniques to optimize interferometric coherence in InSAR data. Specifically, we introduce a Convolutional Neural Network (CNN) integrated with the Lee filter to enhance the performance of radar interferometry. This hybrid method leverages the strength of CNNs to automatically identify and mitigate the primary sources of decorrelation, while the Lee filter effectively reduces speckle noise, improving the overall quality of interferograms. We develop a deep-learning model trained on multi-temporal and multi-frequency SAR datasets, enabling it to predict coherence patterns and enhance low-coherence regions. The hybrid CNN-SAR approach with Lee filtering significantly reduces noise and phase unwrapping errors, leading to more precise deformation maps. Experimental results demonstrate that our approach improves coherence by up to 30% compared to traditional filtering techniques, making it a robust solution for challenging scenarios such as urban environments, vegetated areas, and rapidly changing landscapes. The method has potential applications in geohazard monitoring, urban planning, and environmental studies, offering a new avenue for enhancing InSAR data reliability through AI-powered optimization combined with robust filtering techniques.
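The Lee-filter half of the hybrid can be sketched in a few lines: estimate the local mean and variance in a sliding window, then blend the pixel toward the local mean in proportion to how much of the local variance exceeds an assumed noise level. This is a pure-Python additive-noise sketch on a toy grid; the CNN component, the multi-temporal SAR data, and the noise variance value are all outside its scope.

```python
# Minimal Lee speckle filter on a 2-D intensity grid (additive-noise form).
# Window size and noise_var are assumed illustrative values.
def lee_filter(img, win=3, noise_var=0.05):
    h, w, r = len(img), len(img[0]), win // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # Gather the local window, clamped at the image borders.
            vals = [img[a][b]
                    for a in range(max(0, i - r), min(h, i + r + 1))
                    for b in range(max(0, j - r), min(w, j + r + 1))]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            # Weight: fraction of local variance attributed to real signal.
            k = max(var - noise_var, 0.0) / var if var > 0 else 0.0
            out[i][j] = mean + k * (img[i][j] - mean)   # Lee estimator
    return out

speckled = [[1.0, 1.2, 0.8],
            [0.9, 3.0, 1.1],    # bright speckle spike at the centre
            [1.1, 0.95, 1.05]]
filtered = lee_filter(speckled)
print(round(filtered[1][1], 3))
```

In flat regions the weight k collapses to zero and the filter averages speckle away, while near strong features the high local variance keeps k close to one, preserving edges — the property that makes it a useful pre-filter before phase unwrapping.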

Keywords: CNN-SAR, Lee Filter, hybrid optimization, coherence, InSAR phase unwrapping, speckle noise reduction

Procedia PDF Downloads 12
2806 Revealing the Manufacturing Techniques of the Leather Scale Armour of Tutankhamun by the Assist of Conservation Procedures

Authors: Safwat Mohamed, Rasha Metawi, Hadeel Khalil, Hussein Kamal

Abstract:

This paper discusses and reveals the manufacturing techniques of the leather scale armour of Tutankhamun. The armour was in critical condition and underwent many conservation procedures, as it suffered from serious deterioration, including fragmentation. In addition, its original shape was lost: the leather scales were found scattered in their box and separated from the linen support, so the armour's outlines were blurred and incomprehensible. In view of this, the leather scale armour of Tutankhamun was in urgent need of conservation and reconstruction. Documentation measures were taken before conservation, and several re-treatable conservation procedures were applied to stabilize the armour and reach a sustainable condition. The conservation treatments included many investigations and analyses that helped reveal the materials and techniques used to make the armour. The armour consisted of leather scales attached to a linen support made of several layers; Howard Carter assumed the support consisted of six layers. The conservation treatments undertaken helped reveal the actual number of layers of the linen support as well as reach the most sustainable condition. This paper highlights the importance of the conservation procedures recently carried out on Tutankhamun's leather scale armour in identifying and revealing all the materials and techniques used in its manufacture. The collected data on manufacturing techniques were used to make a replica of the armour with the same methods and materials.

Keywords: leather scale armour, conservation, manufacturing techniques, Tutankhamun, producing a replica

Procedia PDF Downloads 100