Search results for: e2e reliability prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4116

1266 Approximation of Geodesics on Meshes with Implementation in Rhinoceros Software

Authors: Marian Sagat, Mariana Remesikova

Abstract:

In civil engineering, there is the problem of how to industrially produce tensile membrane structures, which are non-developable surfaces. Non-developable surfaces can only be developed with a certain error, and we want to minimize this error. To that goal, the non-developable surfaces are cut into plates along geodesic curves. We propose a numerical algorithm for finding approximations of open geodesics on meshes and surfaces based on geodesic curvature flow. For practical reasons, it is important to automate the choice of the time step. We propose a method for automatic setting of the time step based on the diagonal dominance criterion for the matrix of the linear system obtained by discretization of our partial differential equation model. Practical experiments show the reliability of this method. Because the model is approximated by a numerical method based on classical derivatives, obstacles that occur for meshes with sharp corners must be addressed. We solve this problem for a large family of meshes with sharp corners via special rotations, which can be seen as partial unfolding of the mesh. In practical applications, it is required that the approximation of the geodesic has its vertices only on the edges of the mesh. This problem is solved by a specially designed point-tracking algorithm. We also partially solve the problem of finding geodesics on meshes with holes. We implemented the whole algorithm in Rhinoceros (commercial 3D computer graphics and computer-aided design software) as a C# assembly library for Grasshopper, a plugin for Rhinoceros.
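
The abstract does not spell out the time-step rule beyond the diagonal dominance criterion; the sketch below illustrates that kind of check in Python, with the system-assembly routine left as a placeholder (the toy matrix at the end only exercises the helper and is not the authors' discretization).

```python
import numpy as np

def is_diagonally_dominant(A):
    """Check strict row diagonal dominance of a square matrix."""
    diag = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - diag
    return np.all(diag > off)

def largest_admissible_dt(assemble_system, dt_max=1.0, shrink=0.5, dt_min=1e-8):
    """Shrink the time step until the assembled system matrix is diagonally
    dominant. `assemble_system(dt)` is a placeholder for the matrix produced
    by discretizing the geodesic curvature flow PDE."""
    dt = dt_max
    while dt > dt_min:
        if is_diagonally_dominant(assemble_system(dt)):
            return dt
        dt *= shrink
    raise ValueError("no diagonally dominant time step found")

# Toy example: a semi-implicit smoothing step (I + dt*L) on a closed polyline,
# used only to exercise the helper; it is not the authors' actual discretization.
def demo_assemble(dt, n=10):
    L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, -1] = L[-1, 0] = -1.0
    return np.eye(n) + dt * L

print(largest_admissible_dt(demo_assemble))
```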

Keywords: geodesic, geodesic curvature flow, mesh, Rhinoceros software

Procedia PDF Downloads 153
1265 A Multi-Agent Smart E-Market Design at Work for Shariah Compliant Islamic Banking

Authors: Wafa Ghonaim

Abstract:

Though growing quickly, Islamic financing at large, and its diverse instruments, is a controversial matter among scholars. This is evident from the ongoing debates on its Shariah compliance. These arguments, however, are inciting doubts and concerns among clients about its credibility, which is harming this lucrative sector. The work here investigates, in particular, some issues related to the Tawarruq instrument. The work examines the issues of linking Murabaha and Wakala contracts, the reselling of commodities to the same traders, and the transfer of ownership. The work affirms that a multi-agent smart electronic market design would facilitate Shariah compliance. The smart market exploits the rational decision-making capabilities of autonomous proxy agents that enable the clients, traders, brokers, and the bank to buy and sell commodities and to manage transactions and cash flow. The smart electronic market design delivers desirable qualities that eliminate the need for Wakala contracts and the reselling of commodities to the same traders. It also resolves the ownership transfer issues by allowing stakeholders to trade independently. The bank administers the smart electronic market and assures the reliability of trades, transactions, and cash flow. A multi-agent simulation is presented to validate the concept and processes. We anticipate that the multi-agent smart electronic market design would deliver Shariah compliance of personal financing in line with the aspirations of scholars, banks, traders, and potential clients.

Keywords: Islamic finance, Shariah compliance, smart electronic market design, multi-agent systems

Procedia PDF Downloads 318
1264 Impacts of Sociological Dynamics on Entomophagy Practice and Food Security in Nigeria

Authors: O. B. Oriolowo, O. J. John

Abstract:

Empirical findings have shown insects to be nutritious and a good source of food for humans. However, human food preferences are not determined only by the nutritional value of the food consumed but, more importantly, by sociological and economic pressures. This study examined the interrelation between science and sociology in sustaining the acceptance of entomophagy among college students to combat food insecurity. A twenty-item, five-point Likert scale instrument, the College Students Entomophagy Questionnaire (CSEQ), was used to elicit information from the respondents. The reliability coefficient, obtained using the Spearman-Brown prophecy formula, was 0.91. Three research questions and three hypotheses were raised. Also, a quantitative nutritional analysis of a few insects and some established conventional protein sources was undertaken in order to compare their nutritional status. The data collected were analyzed using descriptive statistics (percentages) and inferential statistics (correlation and Analysis of Variance, ANOVA). The results showed that entomophagy has a cultural heritage among different tribes in Nigeria and is an acceptable practice; it cuts across every social stratum and is practiced among both major religions. Moreover, insects compared favourably in terms of nutrient content with the conventional animal protein sources analyzed. However, there is a gradual decline in the practice of entomophagy among students, which may be attributed to the influence of western civilization. This study, therefore, recommended intensified research and public enlightenment on the usefulness of entomophagy so as to preserve its cultural heritage as well as boost human food security.
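
For context on the reliability figure above, the Spearman-Brown prophecy formula predicts the reliability of a lengthened test from the correlation between its halves; a minimal sketch follows (the split-half value used here is illustrative, not the study's data).

```python
def spearman_brown(split_half_r, length_factor=2):
    """Predicted reliability of a test lengthened by `length_factor`
    from the correlation between its halves."""
    return (length_factor * split_half_r) / (1 + (length_factor - 1) * split_half_r)

# Illustrative value only: a split-half correlation of about 0.84 would yield
# a full-test reliability close to the 0.91 reported for the CSEQ.
print(round(spearman_brown(0.84), 2))  # 0.91
```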

Keywords: entomophagy, food security, malnutrition, poverty alleviation, sociology

Procedia PDF Downloads 121
1263 Feature Based Unsupervised Intrusion Detection

Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein

Abstract:

The goal of a network-based intrusion detection system is to classify network traffic activity into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps improve the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we used the NSL-KDD dataset, a revised version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) was implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, was used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time than using the full feature set of the dataset with the same algorithm.
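
The authors ran their experiments in Weka; as a rough Python analogue of the pipeline described (information-gain-style feature ranking, a 60/40 split, two-cluster K-means mapped to Normal/Attack labels), the sketch below uses synthetic placeholder data since the NSL-KDD files are not included here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the preprocessed NSL-KDD records.
X, y = make_classification(n_samples=2000, n_features=41, n_informative=8,
                           random_state=0)

# Information-gain-style ranking (mutual information) and feature reduction.
ig = mutual_info_classif(X, y, random_state=0)
top = np.argsort(ig)[::-1][:10]          # keep the 10 most informative features

X_tr, X_te, y_tr, y_te = train_test_split(X[:, top], y, train_size=0.6,
                                          random_state=0)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_tr)

# Map each cluster to the majority training label (Normal vs. Attack).
mapping = {c: int(np.round(y_tr[km.labels_ == c].mean())) for c in (0, 1)}
pred = np.array([mapping[c] for c in km.predict(X_te)])

tp = np.sum((pred == 1) & (y_te == 1)); fp = np.sum((pred == 1) & (y_te == 0))
fn = np.sum((pred == 0) & (y_te == 1)); tn = np.sum((pred == 0) & (y_te == 0))
print("TPR:", tp / (tp + fn), "FPR:", fp / (fp + tn))
```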

Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka

Procedia PDF Downloads 296
1262 Numerical Simulation of the Flowing of Ice Slurry in Seawater Pipe of Polar Ships

Authors: Li Xu, Huanbao Jiang, Zhenfei Huang, Lailai Zhang

Abstract:

In recent years, with global warming, the sea-ice extent of the Arctic has undergone an evident decrease, and the Arctic channel has attracted the attention of the shipping industry. Ice crystals present in the seawater of the Arctic channel, which enter the ship's seawater system together with the seawater, have been found to block the seawater pipe. In serious cases, cooler failure, auxiliary machine faults, and even paralysis of the ship's power system may occur. In order to reduce the effect of high temperature on auxiliary equipment, the seawater system uses external ice-water in the cooling cycle, so the distribution of ice crystals in the seawater pipe during this flow must be understood. As ice slurry is a solid-liquid two-phase system, the flow of the ice-water mixture is very complex and diverse. In this paper, the flow of ice slurry in a seawater pipe is simulated with computational fluid dynamics software based on the k-ε turbulence model. As the ice packing fraction is a key factor affecting the distribution of ice crystals, the influence of the ice packing fraction on the flow of the ice slurry is analyzed. The simulation results show that when the ice packing fraction is relatively large, the distribution of ice crystals during seawater flow is uneven, which increases the possibility of blockage; this provides a scientific forecasting method for the formation of ice blockages in the seawater piping system. It has important significance for the operating reliability of polar ships in the future.

Keywords: ice slurry, seawater pipe, ice packing fraction, numerical simulation

Procedia PDF Downloads 368
1261 A Framework Based on Dempster-Shafer Theory of Evidence Algorithm for the Analysis of the TV-Viewers’ Behaviors

Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi

Abstract:

In this paper, we propose an approach for detecting the behavior of the viewers of a TV program in a non-controlled environment. The experiment we propose is based on the use of three types of connected objects (a smartphone, a smart watch, and a connected remote control). Twenty-three participants were observed while watching their TV programs during three phases: before, during, and after watching a TV program. Their behaviors were detected using an approach based on the Dempster-Shafer Theory (DST) in two phases. The first phase is to dynamically approximate the mass functions using an approach based on the correlation coefficient; the second phase is to compute the approximated mass functions. To approximate the mass functions, two approaches have been tested. The first approach was to divide each feature's data space into cells, each with a specific probability distribution over the behaviors; the probability distributions were computed statistically (estimated by the empirical distribution). The second approach was to predict the TV-viewing behaviors through the use of classification algorithms and to add uncertainty to the prediction based on the uncertainty of the model. Results showed that combining the fusion rule with the computation of the initial approximate mass functions using a classifier led to overall success rates of 96%, 95%, and 96% for the first, second, and third TV-viewing phases, respectively. The results were also compared to those found in the literature. This study aims to anticipate certain actions in order to maintain the attention of TV viewers towards the proposed TV programs using common connected objects, taking into account the various uncertainties that can be generated.
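
As background on the fusion step mentioned above, Dempster's rule combines two mass functions over a frame of discernment and renormalizes away the conflicting mass; a minimal sketch follows (the frame of behaviors and the mass values are illustrative, not the study's).

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses to mass)
    with Dempster's rule, normalizing out the conflict."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Illustrative frame of TV-viewing behaviors and masses from two sensors.
watching, phoning, away = frozenset({"watching"}), frozenset({"phoning"}), frozenset({"away"})
theta = watching | phoning | away
m_watch = {watching: 0.6, phoning: 0.1, theta: 0.3}   # e.g. smart-watch evidence
m_remote = {watching: 0.5, away: 0.2, theta: 0.3}     # e.g. remote-control evidence
print(dempster_combine(m_watch, m_remote))
```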

Keywords: IoT, TV-viewing behaviors identification, automatic classification, unconstrained environment

Procedia PDF Downloads 229
1260 Cultural Self-Efficacy of Child Protection Social Workers in Norway: Barriers and Opportunities in Working with Migrant Families

Authors: Justyna Mroczkowska

Abstract:

Social workers' ability to provide culturally sensitive assistance in child protection is often taken for granted; given limited training opportunities and a lack of clear guidance, practitioners report that working with migrant families is more demanding than working with native families. In this study, the author developed and factor-analyzed the Norwegian Cultural Self-Efficacy Scale to describe the level of cultural capability among Norwegian child protection professionals. The study aimed to determine the main factors influencing cultural self-efficacy and to examine the relationship between self-efficacy and perceived difficulty in working with migrant families. The scale was administered to child protection workers in Norway (N=251), and the reliability of the scale, measured by Cronbach's alpha coefficient, was 0.904. The confirmatory factor analysis of social work cultural self-efficacy found support for four separate but correlated subscales: Assessment, Communication, Support Request, and Teamwork. Regression analyses found experience in working with migrant families, training and support from external agencies, and colleague support to be significant predictors of cultural self-efficacy. Self-efficacy in assessment skills and self-efficacy in communication skills were moderately related to the perceived difficulty of working with migrant families. The findings concur with previous research and highlight the need for both professional development programs and institutional resources to support practitioners' preparation for multicultural practice in child protection.
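
The internal-consistency figure quoted above is Cronbach's alpha; a minimal sketch of its computation follows, run on randomly generated Likert-style responses rather than the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative random Likert-style responses (251 respondents, 20 items),
# not the study's data.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(251, 1))
items = np.clip(base + rng.integers(-1, 2, size=(251, 20)), 1, 5)
print(round(cronbach_alpha(items), 3))
```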

Keywords: child protection, cultural self-efficacy, cultural competency, migration, resources

Procedia PDF Downloads 143
1259 Monocular Depth Estimation Benchmarking with Thermal Dataset

Authors: Ali Akyar, Osman Serdar Gedik

Abstract:

Depth estimation is a challenging computer vision task that involves estimating the distance between objects in a scene and the camera. It predicts how far each pixel in the 2D image is from the capturing point. Several important Monocular Depth Estimation (MDE) studies are based on Vision Transformers (ViT), and we benchmark three major ones. The first work aims to build a simple and powerful foundation model that deals with any image under any condition. The second work proposes a method of mixing multiple datasets during training together with a robust training objective. The third work combines generalization performance with state-of-the-art results on specific datasets. Although there are studies using thermal images too, we wanted to benchmark these three non-thermal, state-of-the-art studies on a hybrid image dataset captured with Multi-Spectral Dynamic Imaging (MSX) technology. MSX technology produces detailed thermal images by bringing together the thermal and visual spectrums. Thanks to this technology, our dataset images are not blurred and poorly detailed like normal thermal images; on the other hand, they are not taken under the ideal lighting conditions of RGB images. We compared the three methods under test on our thermal dataset, which had not been done before. Additionally, we propose an image enhancement deep learning model for thermal data. This model helps extract the features required for monocular depth estimation. The experimental results demonstrate that, after using our proposed model, the performance of the three methods under test increased significantly for thermal image depth prediction.
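
The abstract does not list its evaluation metrics; the sketch below computes the error and accuracy measures commonly used in monocular depth benchmarks (AbsRel, RMSE, delta < 1.25) and is offered as an assumption about how such a comparison is typically scored, on synthetic depth maps.

```python
import numpy as np

def depth_metrics(pred, gt, eps=1e-6):
    """Common monocular-depth benchmark metrics: AbsRel, RMSE, and the
    delta < 1.25 accuracy, computed over valid ground-truth pixels."""
    mask = gt > eps
    p, g = pred[mask], gt[mask]
    abs_rel = np.mean(np.abs(p - g) / g)
    rmse = np.sqrt(np.mean((p - g) ** 2))
    delta1 = np.mean(np.maximum(p / g, g / p) < 1.25)
    return {"AbsRel": abs_rel, "RMSE": rmse, "delta<1.25": delta1}

# Toy example with synthetic depth maps (metres), for illustration only.
rng = np.random.default_rng(1)
gt = rng.uniform(0.5, 10.0, size=(240, 320))
pred = gt * rng.normal(1.0, 0.1, size=gt.shape)
print(depth_metrics(pred, gt))
```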

Keywords: monocular depth estimation, thermal dataset, benchmarking, vision transformers

Procedia PDF Downloads 34
1258 Investigating the Effect of Convection on the Rating of Buried Cables Using the Finite Element Method

Authors: Sandy J. M. Balla, Jerry J. Walker, Isaac K. Kyere

Abstract:

The heat transfer coefficient at the soil–air interface is important in calculating underground cable ampacity when convection occurs. Calculating the heat transfer coefficient accurately is complex because of the temperature variations at the earth's surface. This paper presents the effect of convective heat flow across the ground surface on the rating of three single-core, 132 kV, XLPE cables buried underground. The finite element method (FEM) is a numerical analysis technique used to determine the rating of buried cables under installation conditions that are difficult to handle with the analytical method. This study demonstrates the use of FEM to investigate the effect of convection on the rating of buried cables in flat formation using the QuickField finite element simulation software. Developing a model to simulate this type of situation requires careful treatment of boundary conditions such as burial depth, soil thermal resistivity, and soil temperature, which play an important role in the simulation's accuracy and reliability. The results show that when the ground surface is treated as a convection interface, the conductor temperature rises and may exceed the maximum permissible temperature when rated current flows. This is because the ground surface acts as a convection interface between the soil and the air (fluid). This result is compared with the rating obtained using the IEC 60287 analytical method, which is based on the condition that the ground surface is an isotherm.

Keywords: finite element method, convection, buried cables, steady-state rating

Procedia PDF Downloads 131
1257 Factors Influencing Site Overhead Cost of Construction Projects in Egypt: A Comparative Analysis

Authors: Aya Effat, Ossama A. Hosny, Elkhayam M. Dorra

Abstract:

Estimating costs is a crucial step in construction management and should be completed at the beginning of every project to establish the project's budget. The precision of the cost estimate plays a significant role in the success of construction projects, as it allows project managers to manage the project's costs effectively. Site overhead costs constitute a significant portion of construction project budgets, necessitating accurate prediction and management. These costs are influenced by a multitude of factors, requiring a thorough examination and analysis to understand their relative importance and impact. Thus, the main aim of this research is to enhance the contractor's ability to predict and manage site overheads by identifying and analyzing the main factors influencing site overhead costs in the Egyptian construction industry. Through a comprehensive literature review, key factors were first identified and subsequently validated using a thorough comparative analysis of data from 55 real-life construction projects. Through this comparative analysis, the relationship between each factor and the site overheads percentage, as well as between each site overheads subcategory and each project construction phase, was identified and examined. Furthermore, correlation analysis was carried out to check for multicollinearity and to identify the factors with the highest impact. The findings of this research offer valuable insights into the key drivers of site overhead costs in the Egyptian construction industry. By understanding these factors, construction professionals can make informed decisions regarding the estimation and management of site overhead costs.

Keywords: comparative analysis, cost estimation, construction management, site overheads

Procedia PDF Downloads 22
1256 Factors that Predict Pre-Service Teachers' Decision to Integrate E-Learning: A Structural Equation Modeling (SEM) Approach

Authors: Mohd Khairezan Rahmat

Abstract:

Driven by the goal of becoming a developed country by the year 2020, the Malaysian government has been proactive in strengthening the integration of ICT into the national educational system. Teacher-education programs have the responsibility of preparing the nation's future teachers by instilling in them the desire, confidence, and ability to fully utilize the potential of ICT in their instructional process. In an effort to fulfill this responsibility, teacher-education programs are beginning to create alternative means for preparing cutting-edge teachers. One of these alternatives is the student learning portal. In line with this mission, this study investigates Faculty of Education, Universiti Teknologi MARA (UiTM) pre-service teachers' perception of usefulness, attitude, and ability toward the usage of the university learning portal, known as iLearn. The study also aimed to predict factors that might hinder the pre-service teachers' decision to use iLearn as their learning platform. Structural equation modeling (SEM) was employed to analyze the survey data. The findings indicate that pre-service teachers' successful integration of iLearn was highly influenced by their perception of the usefulness of the system. The findings also suggest that the more familiar the pre-service teachers are with iLearn, the more likely they are to use the system. In light of similar studies, the present findings highlight the importance of understanding users' perceptions of any proposed technology.

Keywords: e-learning, prediction factors, pre-service teacher, structural equation modeling (SEM)

Procedia PDF Downloads 340
1255 A Criterion to Minimize FE Mesh-Dependency in Concrete Plates Subjected to Impact Loading

Authors: Kwak, Hyo-Gyung, Gang, Han Gul

Abstract:

In the context of an increasing need for reliability and safety of concrete structures under blast and impact loading conditions, the behavior of concrete under high strain rate conditions has been an important issue. Since concrete subjected to impact loading at high strain rates shows material behavior quite different from that in the static state, several material models have been proposed and used to describe the high strain rate behavior under blast and impact loading. In the modelling process, mesh dependency of the finite element (FE) model is the key problem, because simulation results under high strain-rate conditions are quite sensitive to the applied FE mesh size. This means that the accuracy of the simulation results may depend strongly on the FE mesh size. This paper introduces an improved criterion that can minimize the mesh-dependency of simulation results on the basis of the fracture energy concept, and the HJC (Holmquist-Johnson-Cook), CSC (Continuous Surface Cap), and K&C (Karagozian & Case) models are examined to trace their relative sensitivity to the FE mesh size used. In line with the purpose of the penetration test of a concrete plate struck by a projectile (bullet), the residual velocities of the projectile after penetration are compared. The correlation studies between the analytical results and the associated parametric studies show that the variation of residual velocity with FE mesh size is greatly reduced by applying a unique failure strain value determined according to the proposed criterion.
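
The abstract does not give the closed form of the proposed criterion; for background, the fracture energy (crack band) concept it builds on ties the failure strain to the fracture energy and the element size so that the dissipated energy per unit crack area stays mesh-independent. A sketch under the assumption of linear tensile softening follows; this is the classical crack band relation, not necessarily the paper's exact criterion.

```python
def failure_strain(G_f, f_t, h):
    """Mesh-adjusted failure strain from the crack band concept,
    assuming linear tensile softening:
        eps_f = 2 * G_f / (f_t * h)
    G_f : fracture energy [N/mm], f_t : tensile strength [MPa],
    h   : characteristic FE element size [mm].
    Classical crack band relation; offered only as an illustration."""
    return 2.0 * G_f / (f_t * h)

# Illustrative concrete values: the failure strain shrinks as the mesh coarsens,
# which is what keeps the dissipated energy per unit area mesh-independent.
for h in (5.0, 10.0, 20.0):  # element sizes in mm
    print(h, failure_strain(G_f=0.1, f_t=3.0, h=h))
```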

Keywords: high strain rate concrete, penetration simulation, failure strain, mesh-dependency, fracture energy

Procedia PDF Downloads 522
1254 Developing an Instrument to Measure Teachers’ Self-Efficacy of Teaching Innovation Skills

Authors: Huda S. Al-Azmi

Abstract:

There is a growing consensus that the adoption of teachers' self-efficacy measurement tools helps to assess teachers' abilities in specific areas in order to improve their skills. As a result, different instruments to assess teachers' ability have been developed by academics and practitioners. However, many of these instruments focused either on general teaching skills or were very specific to one subject. As such, these instruments do not offer a tool to measure the ability of teachers in teaching 21st century skills such as innovation skills. Teaching innovation skills helps to prepare students for lives and careers in the 21st century. The purpose of this study is to develop an instrument measuring teachers' self-efficacy of teaching innovation skills related to the classroom context and evaluating teachers' beliefs regarding their ability to teach innovation skills. To reach this goal, the 16-item instrument measures four dimensions of innovation skills: creativity, critical thinking, communication, and collaboration. A total of 211 secondary-school teachers filled out the survey for a quantitative analysis of the instrument's quality. The instrument's reliability and item analysis were measured using jMetrik. The results showed that the mean self-efficacy ranged from 3 to 3.6, without extremely high or low self-efficacy scores. The discrimination analysis revealed that one item recorded a negative correlation with the total, and three items recorded a low correlation with the total. The reliabilities of the items ranged from 0.64 to 0.69, and the instrument needed a couple of revisions before practical use. The study concluded that one item should be discarded and five items revised to increase the quality of the instrument for future work.

Keywords: critical thinking, collaboration, innovation skills, self-efficacy

Procedia PDF Downloads 216
1253 Hybrid Adaptive Modeling to Enhance Robustness of Real-Time Optimization

Authors: Hussain Syed Asad, Richard Kwok Kit Yuen, Gongsheng Huang

Abstract:

Real-time optimization has been considered an effective approach for improving the energy-efficient operation of heating, ventilation, and air-conditioning (HVAC) systems. In model-based real-time optimization, model mismatches cannot be avoided. When model mismatches are significant, the performance of the real-time optimization will be impaired and hence the expected energy saving will be reduced. In this paper, the model mismatches of the chiller plant in real-time optimization are considered. In the real-time optimization of the chiller plant, a simplified semi-physical or grey-box model of the chiller is usually used, which must be identified using available operation data. To overcome the model mismatches associated with the chiller model, a hybrid Genetic Algorithm (HGA) method is used for online real-time training of the chiller model. The HGA combines the Genetic Algorithm (GA) method (for global search) with a traditional optimization method (faster and more efficient for local search) to avoid the conventional trial-and-error process of GAs. The identification of the model parameters is formulated as an optimization problem, and the objective function is the least-square error between the output of the model and the actual output of the chiller plant. A case study is used to illustrate the implementation of the proposed method. It has been shown that the proposed approach is able to provide reliable decision making, enhance the robustness of the real-time optimization strategy, and improve energy performance.
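
A minimal sketch of the hybrid idea (a plain real-coded GA for global search, followed by a gradient-free local refinement of the best individual) fitting a least-square error objective; the quadratic "chiller model" and the synthetic measurements below are placeholders, since the abstract does not specify the grey-box model.

```python
import numpy as np
from scipy.optimize import minimize

# Generic grey-box placeholder: chiller power as a quadratic function of load
# ratio. This stands in for the semi-physical chiller model, which is not
# specified in the abstract.
def chiller_model(theta, load):
    a, b, c = theta
    return a + b * load + c * load ** 2

def lse(theta, load, power_meas):
    """Least-square error between model output and measured plant output."""
    return np.sum((chiller_model(theta, load) - power_meas) ** 2)

def hybrid_ga(obj, bounds, pop=40, gens=60, seed=0):
    """Minimal hybrid GA: a real-coded GA for global search, then a
    gradient-free local refinement of the best individual."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    X = rng.uniform(lo, hi, size=(pop, len(bounds)))
    for _ in range(gens):
        f = np.array([obj(x) for x in X])
        parents = X[np.argsort(f)[: pop // 2]]           # truncation selection
        kids = []
        while len(kids) < pop - len(parents):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            w = rng.uniform(size=len(bounds))            # blend crossover
            child = w * p1 + (1 - w) * p2
            child += rng.normal(0, 0.05 * (hi - lo))     # Gaussian mutation
            kids.append(np.clip(child, lo, hi))
        X = np.vstack([parents, kids])
    best = min(X, key=obj)
    return minimize(obj, best, method="Nelder-Mead").x   # local search stage

# Synthetic "measured" data, for illustration only.
rng = np.random.default_rng(1)
load = rng.uniform(0.3, 1.0, 200)
power = chiller_model([50.0, 120.0, 80.0], load) + rng.normal(0, 2.0, 200)
theta_hat = hybrid_ga(lambda t: lse(t, load, power),
                      bounds=[(0, 100), (0, 300), (0, 200)])
print(theta_hat)
```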

Keywords: energy performance, hybrid adaptive modeling, hybrid genetic algorithms, real-time optimization, heating, ventilation, and air-conditioning

Procedia PDF Downloads 418
1252 Using Multiphysics Simulations and Resistive Pulse Sensing to Study the Effect of Metal and Non-Metal Nanoparticles in Different Salt Concentrations

Authors: Chun-Lin Chiang, Che-Yen Lee, Yu-Shan Yeh, Jiunn-Haur Shaw

Abstract:

Wafer fabrication is a critical part of the semiconductor process, as the finest linewidth continues to shrink with improving technology and structures develop from 2D towards 3D. The nanoparticles contained in the slurry, or in the ultrapure water used for cleaning, have a large influence on the manufacturing process. Therefore, the semiconductor industry is hoping to find a viable method for on-line detection of nanoparticle size and concentration. Resistive pulse sensing technology is one of the methods that may address this question. Since material properties at the nanoparticle scale differ significantly from those at larger length scales, we want to clarify the translocation dynamics of metal and non-metal nanoparticles when using resistive pulse sensing technology. In this study, we use the finite element method with three governing equations to perform coupled multiphysics simulations. The Navier-Stokes equation describes the laminar motion, the Nernst-Planck equation describes the ion transport, and the Poisson equation describes the potential distribution in the flow channel. We explore how metal and non-metal nanoparticles in electrolytes of different concentrations change the ionic current as they pass through the nanochannel. The reliability of the simulation results was then verified by resistive pulse sensing tests. The results show that the lower the ion concentration, the greater the effect of the nanoparticles on the ion concentration in the nanochannel. The conductive spikes are correlated with the nanoparticle surface charge. We can therefore conclude that, in the resistive pulse sensing technique, the ion concentration in the nanochannel and the nanoparticle properties are important for the translocation dynamics, and that they interact with each other.
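
For reference, the three coupled governing equations named above can be written, in one common steady-state form for electrokinetic nanochannel models (the notation here is an assumption, as the abstract gives none), as:

```latex
\begin{aligned}
\rho\,(\mathbf{u}\cdot\nabla)\mathbf{u} &= -\nabla p + \mu\nabla^{2}\mathbf{u} + \rho_{e}\mathbf{E}, \qquad \nabla\cdot\mathbf{u} = 0
&&\text{(laminar Navier-Stokes)}\\
\nabla\cdot\mathbf{J}_{i} &= 0, \qquad
\mathbf{J}_{i} = -D_{i}\nabla c_{i} - \frac{z_{i}D_{i}F}{RT}\,c_{i}\nabla\phi + c_{i}\mathbf{u}
&&\text{(Nernst-Planck, species } i\text{)}\\
\nabla^{2}\phi &= -\frac{\rho_{e}}{\varepsilon_{0}\varepsilon_{r}}, \qquad
\rho_{e} = F\sum_{i} z_{i}c_{i}
&&\text{(Poisson)}
\end{aligned}
```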

Keywords: multiphysics simulations, resistive pulse sensing, nanoparticles, nanochannel

Procedia PDF Downloads 350
1251 Multimedia Technologies Utilisation as Predictors of Lecturers’ Teaching Effectiveness in Colleges of Education in South-West, Nigeria

Authors: Abel Olusegun Egunjobi, Olusegun Oyeleye Adesanya

Abstract:

The teaching effectiveness of lecturers in a tertiary institution in Nigeria is one of the determinants of a lecturer's productivity. In this study, therefore, lecturers' teaching effectiveness was examined vis-à-vis their multimedia technologies utilisation in Colleges of Education (CoE) in South-West Nigeria, for the purpose of ascertaining the relationship and contribution of multimedia technologies utilisation to lecturers' teaching effectiveness in Nigerian colleges of education. The descriptive survey research design was adopted, and a multi-stage sampling procedure was used. A stratified sampling technique was used to select colleges of education, and a simple random sampling method was employed to select lecturers from the selected colleges of education. A total of 862 lecturers (627 males and 235 females) were selected from the colleges of education used for the study. The instrument used was a lecturers' questionnaire on multimedia technologies utilisation and teaching effectiveness, with a reliability coefficient of 0.85 at the 0.05 level of significance. The data collected were analysed using descriptive statistics, multiple regression, and the t-test. The findings showed that the level of multimedia technologies utilisation in colleges of education was low, whereas lecturers' teaching effectiveness was high. Findings also revealed that the lecturers used multimedia technologies mainly for personal and professional development, as well as for up-to-date news on economic and political matters. In addition, laptops, iPads, CD-ROMs, and computer instructional software were the multimedia technologies most frequently utilised by the lecturers. There was also a significant difference in teaching effectiveness between lecturers in Federal and State colleges of education. The government should, therefore, make adequate provision for multimedia technologies in the colleges of education in Nigeria for lecturers to utilise in their instruction, so as to boost their students' learning outcomes.

Keywords: colleges of education, lecturers’ teaching effectiveness, multimedia technologies utilisation, Southwest Nigeria

Procedia PDF Downloads 142
1250 Teacher's Gender and Primary School Pupils' Achievement in Social Studies and Its Educational Implications on Pupils

Authors: Elizabeth Oyenike Abegunrin

Abstract:

This study is born out of the dire need to improve the academic achievement of pupils in Social Studies. The paper attempts to fill the lacuna concerning teacher's gender and primary school pupils' achievement. With specific reference to the Social Studies classroom, the aim of this study was to detail how pupils' achievement is a function of the teacher's gender and to establish the link (if any) between teacher's gender and pupils' educational achievement. The significance of this was to create a gender-template standard for teachers, school owners, administrators, and policy makers to follow in the course of improving pupils' achievement in Social Studies. By adopting a quasi-experimental research design, a sample of two hundred pupils was selected across five primary schools in Education District I, Lagos State, and assigned to experimental and control groups. A 40-item Gender and Social Studies Achievement Test (GSSAT) was used to obtain data from the pupils. The data collected were analyzed using the Pearson Product-Moment Correlation (PPMC), and a reliability of 0.78 was obtained. Results revealed that teacher's gender (male/female) had no significant effect on pupils' achievement in Social Studies and that there was a significant interaction effect of teacher's commitment, independent of gender, on the general educational output of pupils in Social Studies. Taken together, the results revealed a high degree of correlation between teacher's commitment and pupils' academic achievement in Social Studies, rather than a gender-based one. The study recommended that Social Studies teachers should reassess their classroom instructional strategies and use more innovative instructional methods and techniques that give pupils equal opportunities to excel in Social Studies, regardless of their gender differences.

Keywords: gender, academic achievement, social studies, primary school

Procedia PDF Downloads 211
1249 An Improved Robust Algorithm Based on Cubature Kalman Filter for Single-Frequency Global Navigation Satellite System/Inertial Navigation Tightly Coupled System

Authors: Hao Wang, Shuguo Pan

Abstract:

The Global Navigation Satellite System (GNSS) signal received by a dynamic vehicle in a harsh environment is frequently interfered with and blocked, which generates gross errors affecting the positioning accuracy of GNSS/Inertial Navigation System (INS) integrated navigation. Therefore, this paper puts forward an improved robust Cubature Kalman filter (CKF) algorithm for ambiguity resolution in a single-frequency GNSS/INS tightly coupled system. First, the dynamic model and measurement model of a single-frequency GNSS/INS tightly coupled system were established, and the method for INS-aided GNSS integer ambiguity resolution was studied. Then, we analyzed the influence of pseudo-range observations containing gross errors on GNSS/INS integrated positioning accuracy. To reduce the influence of outliers, this paper improves the CKF algorithm and realizes an intelligent selection of robust strategies by detecting ill-conditioning of the relevant matrix. Finally, a field navigation test was performed to demonstrate the effectiveness of the proposed algorithm based on the double-differenced solution mode. The experiment proved that the improved robust algorithm can greatly weaken the influence of isolated, continuous, and hybrid observation anomalies, enhancing the reliability and accuracy of GNSS/INS tightly coupled navigation solutions.
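
As background on the filter named above, here is a minimal sketch of the spherical-radial cubature step at the core of the CKF (point generation and the time-update propagation of mean and covariance); the toy state-transition function is illustrative, and the paper's robust reweighting strategy is not reproduced here.

```python
import numpy as np

def cubature_points(x, P):
    """Generate the 2n spherical-radial cubature points of the CKF for a
    state mean x and covariance P; each point carries weight 1/(2n)."""
    n = x.size
    S = np.linalg.cholesky(P)                              # P = S S^T
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])   # n x 2n directions
    return x[:, None] + S @ xi                             # n x 2n points

def propagate(f, x, P):
    """Propagate mean and covariance through a nonlinear function f
    using the cubature points (time-update step of the CKF, no process noise)."""
    pts = cubature_points(x, P)
    Y = np.apply_along_axis(f, 0, pts)
    y = Y.mean(axis=1)
    Py = (Y - y[:, None]) @ (Y - y[:, None]).T / Y.shape[1]
    return y, Py

# Toy example: mildly nonlinear state transition, for illustration only.
f = lambda s: np.array([s[0] + 0.1 * s[1], 0.99 * s[1] + 0.01 * np.sin(s[0])])
x0, P0 = np.array([1.0, 0.5]), np.diag([0.04, 0.01])
print(propagate(f, x0, P0))
```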

Keywords: GNSS/INS integrated navigation, ambiguity resolution, Cubature Kalman filter, Robust algorithm

Procedia PDF Downloads 100
1248 The Rational Design of Original Anticancer Agents Using Computational Approach

Authors: Majid Farsadrooh, Mehran Feizi-Dehnayebi

Abstract:

Serum albumin is the most abundant protein present in the circulatory system of a wide variety of organisms. It is a significant macromolecule that contributes to osmotic blood pressure and also plays a major role in drug disposition and efficacy. Molecular docking simulation can improve in silico drug design and discovery procedures by proposing a lead compound and developing it from the discovery step to the clinic. In this study, molecular docking simulation was applied to select a lead molecule through an investigation of the interaction of two anticancer drugs (Alitretinoin and Abemaciclib) with Human Serum Albumin (HSA). Then, a series of new compounds (a-e) was suggested through modification of the lead molecule. Density functional theory (DFT) calculations, including MEP maps and HOMO-LUMO analysis, were used for the newly proposed compounds to predict the reactive zones on the molecules, their stability, and their chemical reactivity. The DFT calculations showed that these new compounds were stable. The estimated binding free energy (ΔG) values for compounds a-e were -5.78, -5.81, -5.95, -5.98, and -6.11 kcal/mol, respectively. Finally, the pharmaceutical properties and toxicity of these new compounds were estimated with the OSIRIS DataWarrior software. The results indicated no risk of tumorigenic, irritant, or reproductive effects or mutagenicity for compounds d and e. As a result, compounds d and e could be selected for further study as potential therapeutic candidates. Moreover, employing molecular docking simulation together with the prediction of pharmaceutical properties helps to discover new potential drug compounds.

Keywords: drug design, anticancer, computational studies, DFT analysis

Procedia PDF Downloads 78
1247 Design and Radio Frequency Characterization of Radial Reentrant Narrow Gap Cavity for the Inductive Output Tube

Authors: Meenu Kaushik, Ayon K. Bandhoyadhayay, Lalit M. Joshi

Abstract:

Inductive output tubes (IOTs) are widely used as microwave power amplifiers for broadcast and scientific applications. They are capable of amplifying radio frequency (RF) power with very good efficiency. Their compactness, reliability, high efficiency, high linearity, and low operating cost make the device suitable for various applications. The device consists of an integrated electron gun and RF cavity structure, a collector, and a focusing structure. The working principle of the IOT combines those of the triode and the klystron. The cathode in the electron gun produces a stream of electrons, and a control grid is placed in close proximity to the cathode. Basically, the input part of the IOT is the integrated gridded electron gun structure, which acts as an input cavity, thereby providing the interaction gap where the input RF signal is applied so that it interacts with the produced electron beam to support the amplification phenomenon. The paper presents the design, fabrication, and testing of a radial re-entrant cavity for implementation in the input structure of an IOT at a 350 MHz operating frequency. The model's suitability is discussed, and a generalized mathematical relation is introduced for obtaining the proper transverse magnetic (TM) resonant mode in radial narrow-gap RF cavities. The structural modeling was carried out in the CST and SUPERFISH codes. The cavity was fabricated from aluminum, its RF characterization was done using a vector network analyzer (VNA), and the results are presented for the resonant frequency peaks obtained with the VNA.

Keywords: inductive output tubes, IOT, radial cavity, coaxial cavity, particle accelerators

Procedia PDF Downloads 125
1246 Comparative Assessment of Finite Element Methodologies for Predicting Post-Buckling Collapse in Stiffened Carbon Fiber-Reinforced Plastic (CFRP) Panels

Authors: Naresh Reddy Kolanu

Abstract:

The stability and collapse behavior of thin-walled composite structures, particularly carbon fiber-reinforced plastic (CFRP) panels, are paramount concerns for structural designers. Accurate prediction of collapse loads necessitates precise modeling of damage evolution in the post-buckling regime. This study conducts a comparative assessment of various finite element (FE) methodologies employed in predicting post-buckling collapse in stiffened CFRP panels. A systematic approach is adopted, wherein FE models with various damage capabilities are constructed and analyzed. The study investigates the influence of interacting intra- and interlaminar damage modes on the post-buckling response and failure behavior of the stiffened CFRP structure. Additionally, the capabilities of shell and brick FE-based models are evaluated and compared to determine their effectiveness in capturing the complex collapse behavior. Conclusions are drawn through quantitative comparison with experimental results, focusing on post-buckling response and collapse load. This comprehensive evaluation provides insights into the most effective FE methodologies for accurately predicting the collapse behavior of stiffened CFRP panels, thereby aiding structural designers in enhancing the stability and safety of composite structures.

Keywords: CFRP stiffened panels, delamination, Hashin’s failure, post-buckling, progressive damage model

Procedia PDF Downloads 44
1245 Use of Galileo Advanced Features in Maritime Domain

Authors: Olivier Chaigneau, Damianos Oikonomidis, Marie-Cecile Delmas

Abstract:

GAMBAS (Galileo Advanced features for the Maritime domain: Breakthrough Applications for Safety and security) is a project funded by the European Union Agency for the Space Programme (EUSPA) aiming at identifying the search-and-rescue and ship security alert system needs of maritime users (including operators and fishing stakeholders) and developing operational concepts to answer these needs. The general objective of the GAMBAS project is to support the deployment of Galileo exclusive features in the maritime domain in order to improve safety and security at sea, detection of illegal activities and the associated surveillance means, and resilience to natural and human-induced emergency situations, and to develop, integrate, demonstrate, standardize, and disseminate these new associated capabilities. The project aims to demonstrate: improvement of SAR (Search And Rescue) and SSAS (Ship Security Alert System) detection and response to maritime distress through the integration of new features into the SSAS beacon, in terms of cost optimization, user-friendly aspects, integration of Galileo and OS NMA (Open Service Navigation Message Authentication) reception for improved authenticated localization performance and reliability, and at-sea triggering capabilities; optimization of the responsiveness of RCCs (Rescue Co-ordination Centres) towards distress situations affecting vessels; and the adaptation of the MCCs (Mission Control Centres) and the MEOLUT (Medium Earth Orbit Local User Terminal) to the data distribution of SSAS alerts.

Keywords: Galileo new advanced features, maritime, safety, security

Procedia PDF Downloads 93
1244 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and abnormal thermal distribution. Therefore, many studies and applications focus on the suitability of optical monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is usually not quantified. In this study, a quantification model for evaluating the capability of monitoring setups for the LPBF machine, based on monitoring data acquired from a designed test artifact, is presented, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the lighting conditions. The data-processing methodology used to quantify the capability for each aspect is discussed. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, which makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process, and provides direction for improving the setups.

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 197
1243 The Factors Affecting the Promotion of Productivity from Nurses' View

Authors: Mahnaz Sanjari, Sedigheh Salemi, Mohammad Mirzabeigi

Abstract:

Nowadays, the world is facing a workforce crisis, and one of the most striking examples is the shortage of nurses. Nursing workforce productivity is related to various factors such as absenteeism, professional effectiveness, and quality of care. This cross-sectional study was conducted with 700 nurses working in 35 government hospitals across 9 provinces in Iran. The study was approved by the Nursing Council and was carried out with the authorization of the Research Ethics Committee. The questionnaire included 33 questions in 4 subcategories, including human resources, education, and management. The reliability was evaluated by Cronbach's alpha (α = 0.85). Statistical analyses were performed using SPSS version 16. The results showed that nurses placed the greatest emphasis on "respecting the nurse-to-bed ratio", while the least important item was "using less experienced nurses". In addition, other important factors in clinical productivity were "proper physical structure and amenities", "good communication with colleagues", and "having good facilities". Also, "human resources at all levels of the standard", "promotion on merit", and "well-defined relationships in the health system" were other important factors in productivity from the nurses' view. The main managerial factor was "justice between employees", and the main educational component of productivity was "updating nursing knowledge". The results show that more than half of the participants emphasized the management and educational factors. Productivity, as one of the main components of health care quality, leads to appropriate use of human and organizational resources, reduced service costs, and organizational development.

Keywords: productivity, nursing services, workforce, cost services

Procedia PDF Downloads 344
1242 MiRNA Regulation of CXCL12β during Inflammation

Authors: Raju Ranjha, Surbhi Aggarwal

Abstract:

Background: Inflammation plays an important role in infectious and non-infectious diseases. MiRNAs are also reported to play a role in inflammation and associated cancers. The chemokine CXCL12 is likewise known to play a role in inflammation and various cancers. The CXCL12/CXCR4 chemokine axis is involved in the pathogenesis of IBD, especially UC. Supplementation with CXCL12 induces homing of dendritic cells to the spleen and enhances control of the Plasmodium parasite in BALB/c mice. We examined the regulation of CXCL12β by miRNA in UC. Prolonged inflammation of the colon in UC patients increases the risk of developing colorectal cancer, so we also examined the expression differences of CXCL12β and its targeting miRNA in the cancer-susceptible area of the colon of UC patients. Aim: The aim of this study was to determine the regulation of CXCL12β expression by miRNA in inflammation. Materials and Methods: Biopsy samples and blood samples were collected from UC patients and non-IBD controls. mRNA expression was analyzed using microarray and real-time PCR. miRNAs targeting CXCL12β were identified using online target prediction tools. Expression of CXCL12β in blood samples and cell line supernatant was analyzed using ELISA. The miRNA target was validated using a dual luciferase assay. Results and conclusion: We found that miR-200a regulates the expression of CXCL12β in UC. Expression of CXCL12β was increased in the cancer-susceptible part of the colon, and expression of its targeting miRNA was decreased in the same part of the colon. miR-200a regulates CXCL12β expression in inflammation and may be an important therapeutic target in inflammation-associated cancer.

Keywords: inflammation, miRNA, regulation, CXCL12

Procedia PDF Downloads 278
1241 Prediction of Distillation Curve and Reid Vapor Pressure of Dual-Alcohol Gasoline Blends Using Artificial Neural Network for the Determination of Fuel Performance

Authors: Leonard D. Agana, Wendell Ace Dela Cruz, Arjan C. Lingaya, Bonifacio T. Doma Jr.

Abstract:

The purpose of this paper is to predict fuel performance parameters, which include the drivability index (DI), vapor lock index (VLI), and vapor lock potential, using the distillation curve and Reid vapor pressure (RVP) of dual alcohol-gasoline fuel blends. The distillation curve and Reid vapor pressure were predicted using artificial neural networks (ANN) with macroscopic properties such as boiling points, RVP, and molecular weights as the input layer. The ANN consists of 5 hidden layers and was trained using Bayesian regularization. The training mean square error (MSE) and R-value for the ANN of RVP are 91.4113 and 0.9151, respectively, while the training MSE and R-value for the distillation curve are 33.4867 and 0.9927. Fuel performance analysis of the dual alcohol-gasoline blends indicated that highly volatile gasoline blended with dual alcohols results in fuel blends that do not comply with the D4814 standard. Mixtures of low-volatility gasoline and 10% methanol or 10% ethanol can still be blended with up to 10% C3 and C4 alcohols. Intermediate-volatility gasoline containing 10% methanol or 10% ethanol can still be blended with C3 and C4 alcohols that have low RVPs, such as 1-propanol, 1-butanol, 2-butanol, and i-butanol.
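
The network described above was trained with Bayesian regularization (a MATLAB-style setup); a rough Python analogue with five hidden layers and an L2 penalty standing in for that regularization is sketched below on synthetic placeholder features, since the blend data are not available here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder features standing in for the blend descriptors (boiling points,
# component RVPs, molecular weights); the target stands in for the blend RVP.
rng = np.random.default_rng(0)
X = rng.uniform([300, 20, 30], [450, 90, 120], size=(500, 3))
y = 0.8 * X[:, 1] - 0.05 * (X[:, 0] - 300) + 0.02 * X[:, 2] + rng.normal(0, 1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32, 16, 16, 8),  # five hidden layers
                 alpha=1e-3,   # L2 penalty as a stand-in for Bayesian regularization
                 max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```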

Keywords: dual alcohol-gasoline blends, distillation curve, machine learning, reid vapor pressure

Procedia PDF Downloads 103
1240 Estimation of Maize Yield by Using a Process-Based Model and Remote Sensing Data in the Northeast China Plain

Authors: Jia Zhang, Fengmei Yao, Yanjing Tan

Abstract:

The accurate estimation of crop yield is of great importance for food security. In this study, a process-based mechanistic model was adapted to estimate the yield of a C4 crop by modifying the carbon metabolic pathway in the photosynthesis sub-module of the RS-P-YEC (Remote-Sensing-Photosynthesis-Yield Estimation for Crops) model. The yield was calculated by multiplying net primary productivity (NPP) by the harvest index (HI) derived from the ratio of grain to stalk yield. The modified RS-P-YEC model was used to simulate maize yield in the Northeast China Plain during the period 2002-2011. Statistical data on maize yield from the study area were used to validate the simulated results at the county level. The results showed that the Pearson correlation coefficient (R) between the simulated yield and the statistical data was 0.827 (P < 0.01), and the root mean square error (RMSE) was 712 kg/ha with a relative error (RE) of 9.3%. From 2002 to 2011, the yield of the maize planting zone in the Northeast China Plain increased, with a smaller coefficient of variation (CV). The spatial pattern of the simulated maize yield was consistent with the actual distribution in the Northeast China Plain, with an increasing trend from the northeast to the southwest. Hence, the results demonstrated that the modified process-based model coupled with remote sensing data is suitable for spatial yield prediction of maize in the Northeast China Plain.
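
A small sketch of the yield relation and the validation statistics referenced above (yield = NPP x HI, with HI taken from the grain-to-stalk ratio, plus Pearson R, RMSE, and RE); the HI interpretation, the definition of RE as RMSE relative to the observed mean, and the numbers used are assumptions for illustration, not the study's data.

```python
import numpy as np

def simulated_yield(npp, grain_to_stalk_ratio):
    """Yield = NPP x HI, assuming HI = grain / (grain + stalk),
    i.e. derived from the grain-to-stalk ratio."""
    hi = grain_to_stalk_ratio / (1.0 + grain_to_stalk_ratio)
    return npp * hi

def validation_stats(simulated, observed):
    """County-level validation metrics: Pearson R, RMSE, and relative error
    (here assumed to be RMSE divided by the observed mean)."""
    r = np.corrcoef(simulated, observed)[0, 1]
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    re = rmse / np.mean(observed)
    return r, rmse, re

# Synthetic example values (kg/ha), for illustration only.
rng = np.random.default_rng(0)
obs = rng.normal(7500, 900, 60)
sim = obs + rng.normal(0, 700, 60)
print(validation_stats(sim, obs))
```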

Keywords: process-based model, C4 crop, maize yield, remote sensing, Northeast China Plain

Procedia PDF Downloads 378
1239 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, to the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach explicitly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. The method provides a systematic, statistically grounded validation technique that improves the credibility of results. It offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this innovative method, pharmaceutical manufacturers can substantially advance their validation processes, subsequently improving the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. It also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 81
1238 Thermochemical Modelling for Extraction of Lithium from Spodumene and Prediction of Promising Reagents for the Roasting Process

Authors: Allen Yushark Fosu, Ndue Kanari, James Vaughan, Alexandre Changes

Abstract:

Spodumene is a lithium-bearing mineral of great interest due to the increasing demand for lithium in emerging electric and hybrid vehicles. The conventional method of processing the mineral for the metal requires an unavoidable thermal transformation of the α-phase to the β-phase, followed by roasting with suitable reagents to produce lithium salts for downstream processes. The selection of an appropriate reagent for roasting is key to the success of the process and the overall lithium recovery. Several studies have been conducted to identify good reagents for process efficiency, leading to sulfation, alkaline roasting, chlorination, fluorination, and carbonizing as the methods of lithium recovery from the mineral. HSC Chemistry is thermochemical software that can be used to model metallurgical process feasibility and predict possible reaction products prior to experimental investigation. The software was employed to investigate and explain the characteristics of the various reagents employed in the literature for spodumene roasting up to 1200°C. The simulation indicated that all reagents used for sulfation and alkaline roasting were feasible in the direction of lithium salt production. Chlorination was only feasible when Cl2 and CaCl2 were used as chlorination agents, but not NaCl or KCl. Depending on the kind of lithium salt formed during carbonizing and fluorination, the process was either spontaneous or non-spontaneous throughout the temperature range investigated. The HSC software was further used to simulate and predict some promising reagents which may be equally good for roasting the mineral for efficient lithium extraction but have not yet been considered by researchers.
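
The feasibility screening described above essentially comes down to checking the sign of the reaction Gibbs energy over the roasting temperature range; a minimal sketch of that check follows, with placeholder ΔH and ΔS values that do not correspond to any specific spodumene roasting reaction.

```python
import numpy as np

def gibbs_energy(dH_kJ, dS_J_per_K, T_K):
    """Reaction Gibbs energy Delta_G = Delta_H - T * Delta_S (in kJ/mol),
    assuming Delta_H and Delta_S stay roughly constant over the range."""
    return dH_kJ - T_K * dS_J_per_K / 1000.0

# Placeholder reaction data (NOT values for an actual spodumene roasting reaction).
dH, dS = -120.0, 35.0                 # kJ/mol and J/(mol K)
T = np.arange(298.0, 1473.0, 100.0)   # roughly 25 C up to ~1200 C
for t, g in zip(T, gibbs_energy(dH, dS, T)):
    status = "spontaneous" if g < 0 else "non-spontaneous"
    print(f"T = {t:6.0f} K   dG = {g:7.1f} kJ/mol   {status}")
```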

Keywords: thermochemical modelling, HSC chemistry software, lithium, spodumene, roasting

Procedia PDF Downloads 161
1237 Solving the Economic Load Dispatch Problem Using Differential Evolution

Authors: Alaa Sheta

Abstract:

Economic Load Dispatch (ELD) is one of the vital optimization problems in power system planning. Solving the ELD problem means finding the best mixture of power unit outputs for all members of the power system network such that the total fuel cost is minimized while the operating requirement limits are satisfied across the entire dispatch period. Many optimization techniques have been proposed to solve this problem. A famous one is Quadratic Programming (QP). QP is a very simple and fast method, but, like other gradient-based methods, it may become trapped at local minimum solutions and cannot handle complex nonlinear functions. A number of metaheuristic algorithms have been used to solve this problem, such as Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). In this paper, another metaheuristic search algorithm, named Differential Evolution (DE), is used to solve the ELD problem in power system planning. The practicality of the proposed DE-based algorithm is verified for three- and six-generator system test cases. The obtained results are compared to existing results based on QP, GAs, and PSO. The results show that differential evolution is superior in obtaining a combination of power loads that fulfills the problem constraints and minimizes the total fuel cost. DE was found to converge quickly to the optimal power generation loads and to be capable of handling the nonlinearity of the ELD problem. The proposed DE solution is able to minimize the cost of generated power, minimize the total power loss in transmission, and maximize the reliability of the power provided to the customers.
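
A compact sketch of the idea, using SciPy's differential evolution on a three-unit example with quadratic fuel-cost curves and a penalty enforcing the power balance; the coefficients, limits, and demand are illustrative and transmission losses are neglected, so this is not the paper's test case.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Illustrative quadratic fuel-cost coefficients a + b*P + c*P^2 ($/h) and
# generator limits (MW) for a 3-unit system; not the paper's test data.
a = np.array([500.0, 400.0, 200.0])
b = np.array([5.3, 5.5, 5.8])
c = np.array([0.004, 0.006, 0.009])
p_min = np.array([100.0, 100.0, 50.0])
p_max = np.array([450.0, 350.0, 225.0])
demand = 800.0  # MW, losses neglected in this sketch

def total_cost(p):
    fuel = np.sum(a + b * p + c * p ** 2)
    balance_violation = abs(np.sum(p) - demand)
    return fuel + 1e4 * balance_violation   # penalty enforces the power balance

result = differential_evolution(total_cost, bounds=list(zip(p_min, p_max)),
                                seed=0, tol=1e-8, maxiter=2000)
print("dispatch (MW):", np.round(result.x, 1), " cost ($/h):", round(result.fun, 1))
```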

Keywords: economic load dispatch, power systems, optimization, differential evolution

Procedia PDF Downloads 283