Search results for: robust estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3181

1201 The Effect of Technology-Facilitated Lesson Study toward Teachers’ Computer-Assisted Language Learning Competencies

Authors: Yi-Ning Chang

Abstract:

With the rapid advancement of technology, it has become crucial for educators to adeptly integrate technology into their teaching and develop a robust Computer-Assisted Language Learning (CALL) competency. Addressing this need, the present study adopted a technology-based Lesson Study approach to assess its impact on the CALL competency and professional capabilities of EFL teachers. Additionally, the study delved into teachers' perceptions of the benefits derived from participating in the creation of technologically integrated lesson plans. The iterative process of technology-based Lesson Study facilitated ample peer discussion, enabling teachers to flexibly design and implement lesson plans that incorporate various technological tools. This 15-week study included 10 in-service teachers from a university of science and technology in central Taiwan. The collected data included pre- and post-lesson planning scores, pre- and post-TPACK survey scores, classroom observation forms, designed lesson plans, and reflective essays. The pre- and post-lesson planning and TPACK survey scores were analyzed using paired-sample t-tests; the reflective essays were analyzed using content analysis. The findings revealed that the teachers' lesson planning ability and CALL competencies improved. Teachers perceived a better understanding of integrating technology with teaching subjects, more effective teaching skills, and a deeper understanding of technology. Pedagogical implications and future studies are also discussed.

Keywords: CALL, language learning, lesson study, lesson plan

Procedia PDF Downloads 28
1200 Carbon Stock Estimation of Urban Forests in Selected Public Parks in Addis Ababa

Authors: Meseret Habtamu, Mekuria Argaw

Abstract:

Urban forests can help to improve the microclimate and air quality. Urban forests in Addis Ababa are important sinks for GHGs as the number of vehicles and traffic congestion are steadily increasing. The objective of this study was to characterize the vegetation types in selected public parks and to estimate the carbon stock potential of urban forests by assessing carbon in the above- and below-ground biomass, the litter, and the soil. Vegetation samples were taken using systematic transect sampling, and species with DBH ≥ 5 cm were recorded to measure the above- and below-ground biomass and the amount of carbon stored. An allometric model (Y = 34.4703 − 8.0671(DBH) + 0.6589(DBH²)) was used to calculate the above-ground biomass (AGB), below-ground biomass was estimated as BGB = AGB × 0.2, and sampling of soil and litter was based on quadrats. There were 5038 trees with DBH ≥ 5 cm recorded from the selected study sites. Most of the parks had a large number of indigenous species, but the number of exotic trees is much larger than that of the indigenous trees. The mean above-ground and below-ground biomass is 305.7 ± 168.3 and 61.1 ± 33.7, respectively, and the mean carbon in the above-ground and below-ground biomass is 143.3 ± 74.2 and 28.1 ± 14.4, respectively. The mean CO2 in the above-ground and below-ground biomass is 525.9 ± 272.2 and 103.1 ± 52.9, respectively. The mean carbon in the dead litter and the soil carbon were 10.5 ± 2.4 and 69.2 t ha-1, respectively. Urban trees reduce atmospheric carbon dioxide (CO2) through sequestration, which is important for climate change mitigation; they are also important for recreation, medicinal value, aesthetics, and biodiversity conservation.
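
A minimal sketch of how the per-tree figures above can be computed from the stated allometric relations is given below. The carbon fraction (0.47) and the CO2/C conversion (44/12) are common defaults assumed here rather than values stated in the abstract, although they are consistent with the reported means (143.3/305.7 ≈ 0.47 and 525.9/143.3 ≈ 3.67).

```python
# Sketch: per-tree biomass and carbon from the allometric relations in the abstract.
# Carbon fraction (0.47) and CO2/C ratio (44/12) are assumed defaults, not values
# stated in the abstract; biomass units are those of the source model.

def tree_carbon(dbh_cm: float) -> dict:
    agb = 34.4703 - 8.0671 * dbh_cm + 0.6589 * dbh_cm ** 2  # above-ground biomass
    bgb = 0.2 * agb                                          # below-ground biomass
    c_agb = 0.47 * agb                                       # assumed carbon fraction
    c_bgb = 0.47 * bgb
    co2_agb = c_agb * 44.0 / 12.0                            # CO2 equivalent
    co2_bgb = c_bgb * 44.0 / 12.0
    return {"AGB": agb, "BGB": bgb, "C_AGB": c_agb, "C_BGB": c_bgb,
            "CO2_AGB": co2_agb, "CO2_BGB": co2_bgb}

print(tree_carbon(25.0))  # example tree with DBH = 25 cm
```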

Keywords: biodiversity, carbon sequestration, climate change, urban forests

Procedia PDF Downloads 219
1199 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles

Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang

Abstract:

With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many aspects. The automatic lane lines extraction and modeling are the most essential steps for the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, the multi-region Otsu thresholding method is applied, which calculates the intensity value of laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected to the raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangle. To ensure the storage efficiency of the map, the lane lines are approximated to cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested on urban and expressway conditions in Hefei, China. The experimental results on the datasets show that our method can achieve excellent extraction and clustering effect, and the fitted lines can reach a high position accuracy with an error of less than 10 cm.
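
A simplified stand-in for two of the steps described above is sketched below: intensity thresholding of laser points to isolate road markings (the paper uses a multi-region Otsu variant, here replaced by a single global Otsu threshold) and fitting a cubic polynomial to a clustered lane line (the paper uses Bayesian estimation, here replaced by ordinary least squares for illustration).

```python
# Simplified sketch: Otsu intensity thresholding + cubic polynomial lane fitting.
import numpy as np
from skimage.filters import threshold_otsu

def extract_marking_points(points_xyz_i):
    """points_xyz_i: (N, 4) array of x, y, z, intensity from the laser scan."""
    intensity = points_xyz_i[:, 3]
    t = threshold_otsu(intensity)            # maximizes between-class variance
    return points_xyz_i[intensity > t, :2]   # keep bright (marking) points, x-y only

def fit_lane_curve(xy):
    """Fit y = c3*x^3 + c2*x^2 + c1*x + c0 to one clustered lane line."""
    coeffs = np.polyfit(xy[:, 0], xy[:, 1], deg=3)
    return np.poly1d(coeffs)
```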

Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering

Procedia PDF Downloads 125
1198 Estimation of Pressure Profile and Boundary Layer Characteristics over NACA 4412 Airfoil

Authors: Anwar Ul Haque, Waqar Asrar, Erwin Sulaeman, Jaffar S. M. Ali

Abstract:

Pressure distribution data of standard airfoils are usually used for calibration purposes in subsonic wind tunnels. Results of such experiments are quite old and were obtained by placing the model in the spanwise direction. In this manuscript, the pressure distribution over a NACA 4412 airfoil model is presented with the 3D model placed in the lateral direction. The model is made of metal with pressure ports distributed longitudinally as well as in the lateral direction. The pressure model was attached to the floor of the tunnel with the help of a base plate to set the specified angle of attack of the model. Before the start of the experiments, the pressure tubes of the respective ports of the 128-port pressure scanner were checked for leakage, and the losses due to the length of the pipes were also incorporated in the results for the specified pressure range. Growth rate maps of the boundary layer thickness were also plotted. It was found that the dynamic pressure distribution increased with velocity across the alpha sweep. The pressure distribution plots so obtained were overlaid with those obtained using the XFLR® software, a low-fidelity tool. It was found that at moderate and high angles of attack, the pressure coefficients obtained from the experiments are high when compared with the XFLR® results obtained along the span of the wing. This under-prediction by XFLR® is more obvious on the windward than on the leeward side.

Keywords: subsonic flow, boundary layer, wind tunnel, pressure testing

Procedia PDF Downloads 315
1197 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis

Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha

Abstract:

Face recognition systems find many applications in surveillance and human-computer interaction systems. As the applications using face recognition systems are of much importance and demand more accuracy, more robustness is expected of the face recognition system with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelet and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead for the Gabor filters. This image is convolved with a bank of Gabor filters with varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class scatter and maximize the inter-class scatter. The techniques used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)2LDA), and Weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt(2D)2LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust against illumination conditions as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using fewer features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database, and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
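
A hedged sketch of the main stages of such a pipeline (Gabor filter bank, discriminant projection, k-NN classification) is given below, using OpenCV and scikit-learn. It uses standard 1D LDA rather than the 2D-LDA and (2D)2LDA variants described in the abstract, and the filter-bank parameters and image size are illustrative placeholders.

```python
# Illustrative pipeline: Gabor filter bank -> LDA projection -> k-NN classification.
# Standard LDA stands in for the 2D-LDA variants of the paper; parameters are placeholders.
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def gabor_features(img, scales=(7, 11, 15), orientations=8):
    img = cv2.resize(img, (32, 32)).astype(np.float32)   # reduce dimension first
    feats = []
    for ksize in scales:
        for k in range(orientations):
            theta = k * np.pi / orientations
            kern = cv2.getGaborKernel((ksize, ksize), sigma=4.0, theta=theta,
                                      lambd=10.0, gamma=0.5)
            feats.append(cv2.filter2D(img, cv2.CV_32F, kern).ravel())
    return np.concatenate(feats)

def train(face_images, labels):
    X = np.array([gabor_features(im) for im in face_images])
    lda = LinearDiscriminantAnalysis().fit(X, labels)          # maximize class separation
    knn = KNeighborsClassifier(n_neighbors=1).fit(lda.transform(X), labels)
    return lda, knn
```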

Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier

Procedia PDF Downloads 464
1196 Efficient Implementation of Finite Volume Multi-Resolution WENO Scheme on Adaptive Cartesian Grids

Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao

Abstract:

An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. The multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike the k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated in a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds only a very small overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Also, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers on the condition that their sum is one. This bypasses the calculation of the optimal linear weights, and the multi-resolution WENO scheme avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented on an adaptive Cartesian grid with slight modification for high Reynolds number problems.
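
The "any positive linear weights summing to one" property refers to the nonlinear weight construction typical of multi-resolution WENO schemes. A schematic form (following the usual multi-resolution WENO construction, not necessarily the exact expressions of this paper) is:

```latex
% Schematic multi-resolution WENO nonlinear weights: the linear weights \gamma_k
% only need to be positive and sum to one (e.g. \gamma_1 = 0.1, \gamma_2 = 0.9).
\tilde{\omega}_k = \gamma_k \left( 1 + \frac{\tau}{\beta_k + \epsilon} \right), \qquad
\omega_k = \frac{\tilde{\omega}_k}{\sum_{l} \tilde{\omega}_l}, \qquad
\sum_k \gamma_k = 1, \; \gamma_k > 0
```

where the β_k are smoothness indicators of the candidate polynomials, τ measures the difference between them, and ε is a small constant to avoid division by zero.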

Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique

Procedia PDF Downloads 143
1195 Exploration of a Blockchain Assisted Framework for Through Baggage Interlining: Blocklining

Authors: Mary Rose Everan, Michael McCann, Gary Cullen

Abstract:

International travel journeys, by their nature, incorporate elements provided by multiple service providers such as airlines, rail carriers, airports, and ground handlers. Data needs to be stored by and exchanged between these parties in the process of managing the journey. The fragmented nature of this shared management of mutual clients is a limiting factor in the development of a seamless, hassle-free, end-to-end travel experience. Traditional interlining agreements attempt to facilitate many separate aspects of co-operation between service providers, typically between airlines and, to some extent, intermodal travel operators, including schedules, fares, ticketing, through check-in, and baggage handling. These arrangements rely on pre-agreement. The development of Virtual Interlining - that is, interlining facilitated by a third party (often but not always an airport) without formal pre-agreement by the airlines or rail carriers - demonstrates an underlying demand for a better quality end-to-end travel experience. Blockchain solutions are being explored in a number of industries and offer, at first sight, an immutable, single source of truth for this data, avoiding data conflicts and misinterpretation. Combined with Smart Contracts, they seemingly offer a more robust and dynamic platform for multi-stakeholder ventures, and even perhaps the ability to join and leave consortia dynamically. Applying blockchain to the intermodal interlining space – termed Blocklining in this paper - is complex and multi-faceted because of the many aspects of cooperation outlined above. To explore its potential, this paper concentrates on one particular dimension, that of through baggage interlining.

Keywords: aviation, baggage, blocklining, intermodal, interlining

Procedia PDF Downloads 142
1194 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphic Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to develop efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core; its characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 359
1193 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can be easily copied and distributed in an illegal way. Copyright protection for authors and owners is necessary. Therefore, digital watermarking techniques play an important role as a valid solution for authority problems. Digital image watermarking techniques are used to hide watermarks in images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks and maintain data quality. Therefore, we discuss in this paper two approaches for image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The Discrete Wavelet Transform (DWT) is used with the two approaches separately for the embedding process, transforming the cover image. Each of PSO and GA uses the correlation coefficient to detect the high-energy coefficients of the original image in which to hide the watermark bits. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach achieved better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach achieved a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. According to the results, we note that a small block size can affect the quality of PSO/GA-based image watermarking, because a small block size increases the search area of the watermarking image. Better PSO results were obtained when using a swarm size of 100.
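
A hedged sketch of the DWT-domain embedding step and the PSNR/MSE quality metrics quoted above is shown below. The sub-band choice, embedding strength, and coefficient selection are illustrative, and the PSO/GA search over embedding locations is omitted entirely.

```python
# Sketch: DWT-domain watermark embedding and PSNR/MSE quality metrics.
# Sub-band, strength, and coefficient selection are illustrative; the PSO/GA
# optimization described in the paper is not reproduced here.
import numpy as np
import pywt

def embed_watermark(cover, watermark_bits, alpha=0.05):
    LL, (LH, HL, HH) = pywt.dwt2(cover.astype(float), "haar")
    flat = HH.ravel()
    idx = np.argsort(np.abs(flat))[::-1][:watermark_bits.size]   # high-energy coefficients
    flat[idx] += alpha * np.max(np.abs(flat)) * (2 * watermark_bits - 1)
    HH = flat.reshape(HH.shape)
    return pywt.idwt2((LL, (LH, HL, HH)), "haar")

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```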

Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform

Procedia PDF Downloads 220
1192 Estimation of Residual Stresses in Thick Walled Cylinder by Radial Basis Artificial Neural Network

Authors: Mohammad Heidari

Abstract:

In this paper, a method for estimating residual stresses in autofrettaged high-strength steel tubes using artificial neural networks is presented. Many different thick walled cylinders that were subjected to different conditions were studied. At first, the residual stress is calculated by an analytical solution. Then, by changing the parameters that influence the residual stresses, such as the percentage of autofrettage, internal pressure, wall ratio of the cylinder, material property of the cylinder, and Bauschinger and hardening effect factors, a neural network is created. These parameters are the inputs of the network. The output of the network is the residual stress. Numerical data were employed for training the network, and the capability of the model in predicting the residual stress was verified. The output obtained from the neural network model is compared with numerical results, and the relative error has been calculated. Based on this verification error, it is shown that the radial basis function neural network has an average error of 2.75% in predicting the residual stress of a thick wall cylinder. Further analysis of the residual stress of the thick wall cylinder under different input conditions has been investigated, and comparison of the modeling results with numerical considerations shows good agreement, which also proves the feasibility and effectiveness of the adopted approach.
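
A minimal sketch of a radial-basis-function regressor mapping the listed input parameters to residual stress is given below. SciPy's RBFInterpolator stands in for a full RBF neural network, and the training arrays are placeholders for the analytical solutions used in the paper.

```python
# Sketch: radial-basis-function regression from the abstract's inputs (autofrettage %,
# internal pressure, wall ratio, material property, Bauschinger/hardening factor) to
# residual stress. RBFInterpolator stands in for a full RBF network; data are placeholders.
import numpy as np
from scipy.interpolate import RBFInterpolator

X_train = np.random.rand(200, 5)     # placeholder (normalized) input parameters
y_train = np.random.rand(200)        # placeholder residual stresses from analytical solution

model = RBFInterpolator(X_train, y_train, kernel="gaussian", epsilon=1.0)

x_new = np.array([[0.6, 0.4, 0.5, 0.7, 0.3]])    # one new parameter combination
print(model(x_new))                               # predicted residual stress
```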

Keywords: thick walled cylinder, residual stress, radial basis, artificial neural network

Procedia PDF Downloads 411
1191 Development of a Consult Liaison Psychology Service: A Systematic Review

Authors: Ben J. Lippe

Abstract:

Consult Liaison Psychology services are growing rapidly, given the robust empirical support for the utility of this service in hospital settings. These psychological services, including clinical assessment, applied psychotherapy, and consultation with other healthcare providers, have been shown to improve health outcomes for patients and bolster important areas of administrative interest such as decreased length of patient admission. However, there is little descriptive literature outlining the process and mechanisms of building or developing a Consult Liaison Psychology service. The main findings of this conceptual work are intended to elucidate the essential methods involved in developing consult liaison psychology programs, including thorough reviews of the relevant behavioral health literature and inclusion of experiential outcomes. The diverse range of hospital settings and healthcare systems makes a “blueprint” method of program development challenging to define, yet the structural frameworks presented here, based on the relevant literature and applied practice, can help lay critical groundwork for program development in this growing area of psychological service. This conceptual approach addresses the prominent processes, as well as common programmatic and clinical pitfalls, involved in the development of a Consult Liaison Psychology service. This paper, including a systematic review of relevant literature, is intended to serve as a key reference for the development of Consult Liaison Psychology services and other related behavioral health programs, and to help inform further research efforts.

Keywords: behavioral health, consult liaison, health psychology, psychology program development

Procedia PDF Downloads 145
1190 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater

Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj

Abstract:

This study introduces and validates a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved through the use of an Acclaim Polar Advantage column (5 µm, 250 x 4.6 mm). A meticulous multi-step gradient mobile phase was employed, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, to ensure optimal separation. Diode-array detection was conducted within the UV–VIS spectrum, spanning a range of 200–800 nm, which facilitated precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability, linearity, the limits of detection and quantification, and measurement uncertainty. The generated linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the focused analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.

Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation

Procedia PDF Downloads 63
1189 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method

Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga

Abstract:

Machining or metal cutting is one of the most widely used production processes in industry. The quality of the process and the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamics (SPH) methodology was performed on the orthogonal metal cutting process to analyze the three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong (Z-A) model. The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative for simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16% when compared with empirical values. The simulated values of flow stress versus strain at various temperatures were also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle, and tool tip radius on the simulation.
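
One of the constitutive models compared above, the Johnson-Cook flow stress law, has the standard form σ = (A + Bεⁿ)(1 + C ln ε̇*)(1 − T*ᵐ). A small sketch evaluating it is shown below; the parameter values are placeholders, not the calibration used in the paper.

```python
# Johnson-Cook flow stress in its standard form:
#   sigma = (A + B*eps**n) * (1 + C*ln(eps_dot/eps_dot0)) * (1 - T_star**m)
# Parameter values are placeholders for AISI 1045, not the paper's calibration.
import math

def johnson_cook(eps, eps_dot, T, A=553.0, B=601.0, n=0.234, C=0.0134, m=1.0,
                 eps_dot0=1.0, T_room=25.0, T_melt=1460.0):
    T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
    return (A + B * eps ** n) * (1.0 + C * math.log(eps_dot / eps_dot0)) \
        * (1.0 - T_star ** m)

print(johnson_cook(eps=0.5, eps_dot=1000.0, T=300.0))  # flow stress (assumed MPa)
```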

Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses

Procedia PDF Downloads 253
1188 A Bayesian Hierarchical Poisson Model with an Underlying Cluster Structure for the Analysis of Measles in Colombia

Authors: Ana Corberan-Vallet, Karen C. Florez, Ingrid C. Marino, Jose D. Bermudez

Abstract:

In 2016, the Region of the Americas was declared free of measles, a viral disease that can cause severe health problems. However, since 2017, measles has reemerged in Venezuela and has subsequently reached neighboring countries. In 2018, twelve American countries reported confirmed cases of measles. Governmental and health authorities in Colombia, a country that shares the longest land boundary with Venezuela, are aware of the need for a strong response to restrict the spread of the epidemic. In this work, we apply a Bayesian hierarchical Poisson model with an underlying cluster structure to describe disease incidence in Colombia. Concretely, the proposed methodology provides relative risk estimates at the department level and identifies clusters of disease, which facilitates the implementation of targeted public health interventions. Socio-demographic factors, such as the percentage of migrants, gross domestic product, and entry routes, are included in the model to better describe the incidence of disease. Since the model does not impose any spatial correlation at any level of the model hierarchy, it avoids the spatial confounding problem and provides a suitable framework to estimate the fixed-effect coefficients associated with spatially-structured covariates.
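
A schematic of the kind of hierarchical Poisson structure described above is given below; the priors and the cluster-allocation mechanism are assumptions for illustration, not the paper's exact specification.

```latex
% Illustrative hierarchical Poisson structure with a cluster-level random effect.
y_i \mid \theta_i \sim \mathrm{Poisson}(E_i \, \theta_i), \qquad
\log \theta_i = \mathbf{x}_i^{\top} \boldsymbol{\beta} + u_{c(i)}, \qquad
u_c \sim \mathrm{N}(0, \sigma_u^2)
```

Here y_i and E_i are the observed and expected counts in department i, x_i holds covariates such as the percentage of migrants and gross domestic product, and c(i) assigns each department to one of the underlying clusters.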

Keywords: Bayesian analysis, cluster identification, disease mapping, risk estimation

Procedia PDF Downloads 148
1187 Artificial Intelligence in Bioscience: The Next Frontier

Authors: Parthiban Srinivasan

Abstract:

With recent advances in computational power and access to enough data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES), a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule. Redundant and irrelevant descriptors are eliminated iteratively. Our prediction engine is based on a portfolio of machine learning algorithms. We found the Random Forest algorithm to be the better choice for this analysis. We captured non-linear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties. We expect the DNN algorithm to give better results and improve the accuracy of the prediction. This presentation will review these prominent machine learning and deep learning methods and our implementation protocols, and discuss these techniques for their usefulness in biomedical and health informatics.
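
A minimal sketch of the descriptor-generation and Random Forest steps is shown below, using RDKit and scikit-learn; the descriptor set, example compounds, endpoint values, and hyperparameters are illustrative only.

```python
# Sketch: SMILES -> numerical descriptors (RDKit) -> Random Forest regression.
# Descriptor set, compounds, endpoints, and hyperparameters are placeholders.
import numpy as np
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestRegressor

def featurize(smiles: str):
    mol = Chem.MolFromSmiles(smiles)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumHDonors(mol),
            Descriptors.NumHAcceptors(mol)]

smiles_list = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]   # placeholder compounds
activities = [0.3, 0.7, 0.5]                                  # placeholder endpoints

X = np.array([featurize(s) for s in smiles_list])
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, activities)
print(model.predict(X[:1]))
```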

Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction

Procedia PDF Downloads 354
1186 Statistical Data Analysis of Migration Impact on the Spread of HIV Epidemic Model Using Markov Chain Monte Carlo Method

Authors: Ofosuhene O. Apenteng, Noor Azina Ismail

Abstract:

Over the last several years, concern has developed over how to minimize the spread of the HIV/AIDS epidemic in many countries. The AIDS epidemic has tremendously stimulated the development of mathematical models of infectious diseases. The transmission dynamics of HIV infection that eventually develops into AIDS has played a pivotal role in building mathematical models. From the initial HIV and AIDS models introduced in the 80s, various improvements have been made in how HIV/AIDS frameworks are modeled. In this paper, we present the impact of migration on the spread of HIV/AIDS. The epidemic model is described by a system of nonlinear differential equations to supplement the statistical approach. The model is calibrated using HIV incidence data from Malaysia between 1986 and 2011. Bayesian inference based on Markov Chain Monte Carlo is used to validate the model by fitting it to the data and to estimate the unknown parameters of the model. The results suggest that migrants staying for a long time contribute to the spread of HIV. The model also indicates that susceptible individuals become infected and move to the HIV compartment at a rate greater than the removal rate from the HIV compartment to the AIDS compartment. The disease-free steady state is unstable since the basic reproduction number is 1.627309. This is a big concern and not a good indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium.
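
A minimal sketch of the kind of compartmental system described above (susceptible → HIV → AIDS with a constant inflow that includes migrants), integrated with SciPy, is given below. The structure and every parameter value are assumptions for illustration, not the calibrated Malaysian model.

```python
# Minimal S -> HIV -> AIDS compartmental sketch with a migrant inflow term.
# Structure and all parameter values are illustrative, not the paper's calibration.
import numpy as np
from scipy.integrate import solve_ivp

def shiv_aids(t, y, Lambda, beta, gamma, mu):
    S, H, A = y
    dS = Lambda - beta * S * H - mu * S       # recruitment (incl. migrants) minus infection
    dH = beta * S * H - (gamma + mu) * H      # new infections minus progression/removal
    dA = gamma * H - mu * A                   # progression to AIDS
    return [dS, dH, dA]

sol = solve_ivp(shiv_aids, (0, 25), [0.99, 0.01, 0.0],
                args=(0.02, 0.6, 0.1, 0.02))
print(sol.y[:, -1])   # compartment sizes at the final time
```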

Keywords: epidemic model, HIV, MCMC, parameter estimation

Procedia PDF Downloads 592
1185 Experimental Investigation on Freeze-Concentration Process Desalting for Highly Saline Brines

Authors: H. Al-Jabli

Abstract:

The aim of this paper was to evaluate the use of the freeze-melting process for disposing of highly saline brines by confirming the performance of the treatment system. A laboratory bench-scale freezing technique test unit was designed, constructed, and tested at Doha Research Plant (DRP) in Kuwait. The principal unit operations considered for the laboratory study are: ice crystallization, separation, washing, and melting. The applied process is characterized as “secondary-refrigerant indirect freezing”, which utilizes the normal freezing concept. Highly saline brine from Kuwait desalination plants, with an average TDS of 250,000 ppm, was used as the feed water in the experimental study to measure the performance of the proposed treatment system. The experimental analysis shows that the freeze-melting process is capable of reducing the TDS of the feed water from 249,482 ppm to 56,880 ppm over the two phases of the process, with an overall recovery of 31.11%, a salt passage of 19.05%, and a salt rejection of 80.95%. Therefore, the freeze-melting process is encouraging for the proposed application, as the results confirm the capability of the process to remove a major amount of the dissolved salts of the highly saline brine with a reasonable recovery. This process may be competitive with other brine disposal processes.

Keywords: high saline brine, freeze-melting process, ice crystallization, brine disposal process

Procedia PDF Downloads 263
1184 Fermentable Bio-Ethanol Using Bakers and Palmwine Yeasts: Indices of Bioavailability of Carbohydrate and Sugar from Fungal Treated Rice Husk

Authors: Ezeonu, Chukwuma Stephen, Onwurah, Ikechukwu Noel Emmanuel

Abstract:

Pure strains of Aspergillus fumigatus (AF), Aspergillus niger (AN), Aspergillus oryzae (AO), Trichophyton mentagrophytes (TM), Trichophyton rubrum (TR), and Trichophyton soudanense (TS) were isolated from decomposing rice husk. Freshly processed rice husks in Mandels’ medium were heat pre-treated using an autoclave at 121°C for 20 minutes. The isolated fungi, as monocultures and di-culture combinations, were inoculated into each of the pre-treated rice husks, with the exception of two controls. Seven days of hydrolysis were followed by estimation of carbohydrate, reducing sugar, and non-reducing sugar. The fungal-treated rice husks were left to ferment for 7 days with the introduction of both baker’s and palm wine yeasts. The highest carbohydrate (20.53 ± 2.73 %) was obtained from rice husks treated with the TS + TR di-culture. The highest soluble reducing sugar (2.66 ± 0.14 %) was obtained from rice husk treated with TM. The highest soluble non-reducing sugar (18.08 ± 2.61 %) was from AF. The introduction of yeasts from palm wine gave the highest bio-ethanol (12.82 ± 0.39 %) from AO. The highest bio-ethanol (6.60 ± 0.10 %) from baker's yeast fermentation was in AO + TS treated rice husk. There was increased availability of sugar and a moderate yield of bio-ethanol, especially from palm wine yeast.

Keywords: fungi, rice husk, carbohydrate, reducing sugar, non-reducing sugar, ethanol, fermentation

Procedia PDF Downloads 434
1183 Estimating Anthropometric Dimensions for Saudi Males Using Artificial Neural Networks

Authors: Waleed Basuliman

Abstract:

Anthropometric dimensions are considered one of the important factors when designing human-machine systems. In this study, the estimation of anthropometric dimensions has been improved by using an Artificial Neural Network (ANN) model that is able to predict the anthropometric measurements of Saudi males in Riyadh City. A total of 1427 Saudi males aged 6 to 60 years participated in measuring 20 anthropometric dimensions. These anthropometric measurements are considered important for designing work and life applications in Saudi Arabia. The data were collected over eight months from different locations in Riyadh City. Five of these dimensions were used as predictor variables (inputs) of the model, and the remaining 15 dimensions were set to be the measured variables (the model’s outcomes). The hidden layers were varied during the structuring stage, and the best performance was achieved with the network structure 6-25-15. The results showed that the developed neural network model was able to estimate the body dimensions of the Saudi male population in Riyadh City. The network's mean absolute percentage error (MAPE) and root mean squared error (RMSE) were found to be 0.0348 and 3.225, respectively. These errors are lower, and therefore better, than those found in the literature. Finally, the accuracy of the developed neural network was evaluated by comparing the predicted outcomes with those of a regression model. The ANN model showed a higher coefficient of determination (R²) between the predicted and actual dimensions than the regression model.
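
A hedged sketch of a feed-forward network with the 6-25-15 structure reported above is shown below, using scikit-learn; the data arrays are placeholders, and the training details (solver, iterations, scaling) are assumptions rather than the study's configuration.

```python
# Sketch of a 6-25-15 feed-forward regressor: 6 predictor inputs, one hidden layer of
# 25 neurons, 15 predicted dimensions. Data and training settings are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

X = np.random.rand(1427, 6)      # placeholder predictor measurements
Y = np.random.rand(1427, 15)     # placeholder target anthropometric dimensions

model = MLPRegressor(hidden_layer_sizes=(25,), max_iter=2000, random_state=0).fit(X, Y)
pred = model.predict(X)

rmse = np.sqrt(mean_squared_error(Y, pred))
mape = np.mean(np.abs((Y - pred) / np.clip(np.abs(Y), 1e-9, None)))
print(rmse, mape)
```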

Keywords: artificial neural network, anthropometric measurements, back-propagation

Procedia PDF Downloads 482
1182 Modeling Breathable Particulate Matter Concentrations over Mexico City Retrieved from Landsat 8 Satellite Imagery

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Magnolia G. Martinez-Rivera, Pablo de J. Angeles-Salto, Carlos Herrera-Ventosa

Abstract:

In order to diminish health risks, it is of major importance to monitor air quality. However, this process is accompanied by high costs in physical and human resources. In this context, this research is carried out with the main objective of developing a predictive model for concentrations of inhalable particles (PM10-2.5) using remote sensing. To develop the model, satellite images, mainly from Landsat 8, of Mexico City’s Metropolitan Area were used. Using historical PM10 and PM2.5 measurements from RAMA (Automatic Environmental Monitoring Network of Mexico City) and the processing of the available satellite images, a preliminary model was generated in which it was possible to observe critical opportunity areas that will allow the generation of a robust model. Through the preliminary model applied to the scenes of Mexico City, three areas of great interest were identified due to the presumed high concentration of PM; these zones are those that present high plant density, bodies of water, and soil without constructions or vegetation. To date, work continues on this line to improve the preliminary model that has been proposed. In addition, a brief analysis was made of six models, presented in articles developed in different parts of the world, in order to identify the optimal bands for the generation of a suitable model for Mexico City. It was found that infrared bands have helped with modeling in other cities, but the effectiveness that these bands could provide for the geographic and climatic conditions of Mexico City is still being evaluated.

Keywords: air quality, modeling pollution, particulate matter, remote sensing

Procedia PDF Downloads 149
1181 Entrepreneurship Education and Student Entrepreneurial Intention: A Comprehensive Review, Synthesis of Empirical Findings, and Strategic Insights for Future Research Advancements

Authors: Abdul Waris Jalili, Yanqing Wang, Som Suor

Abstract:

This research paper explores the relationship between entrepreneurship education and students' entrepreneurial intentions. It aims to determine if entrepreneurship education reliably predicts students' intention to become entrepreneurs and how and when this relationship occurs. This study aims to investigate the predictive relationship between entrepreneurship education and student entrepreneurial intentions. The goal is to understand the factors that influence this relationship and to identify any mediating or moderating factors. A thorough and systematic search and review of empirical articles published between 2013 and 2023 were conducted. Three databases, Google Scholar, Science Direct, and PubMed, were explored to gather relevant studies. Criteria such as reporting empirical results, publication in English, and addressing the research questions were used to select 35 papers for analysis. The collective findings of the reviewed studies suggest a generally positive relationship between entrepreneurship education and student entrepreneurial intentions. However, recent findings indicate that this relationship may be more complex than previously thought. Mediators and moderators have been identified, highlighting instances where entrepreneurship education indirectly influences student entrepreneurial intentions. The review also emphasizes the need for more robust research designs to establish causality in this field. This research adds to the existing literature by providing a comprehensive review of the relationship between entrepreneurship education and student entrepreneurial intentions. It highlights the complexity of this relationship and the importance of considering mediators and moderators. The study also calls for future research to explore different facets of entrepreneurship education independently and examine complex relationships more comprehensively.

Keywords: entrepreneurship, entrepreneurship education, entrepreneurial intention, entrepreneurial self-efficacy

Procedia PDF Downloads 56
1180 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curves of the synthetic soils are constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils, and the final profile is compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion process proves to be a robust procedure that is able to provide good solutions for complex soil profiles even with scarce prior information.
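
The core of such an inversion is the standard PSO velocity/position update; the sketch below shows that rule with a toy misfit standing in for the dispersion-curve mismatch (the vertical flexibility coefficient forward model is not reproduced here), and the inertia and acceleration coefficients are typical defaults rather than the paper's settings.

```python
# Standard PSO velocity/position update with a placeholder misfit function.
import numpy as np

def pso(misfit, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    x = lo + np.random.rand(n_particles, lo.size) * (hi - lo)   # initial positions
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([misfit(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest

# Toy example: recover two layer shear-wave velocities with a quadratic misfit.
target = np.array([200.0, 400.0])
best = pso(lambda p: np.sum((p - target) ** 2),
           (np.array([100.0, 100.0]), np.array([600.0, 600.0])))
print(best)
```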

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 178
1179 Robot Operating System-Based SLAM for a Gazebo-Simulated Turtlebot2 in 2D Indoor Environment with Cartographer Algorithm

Authors: Wilayat Ali, Li Sheng, Waleed Ahmed

Abstract:

The ability of a robot to simultaneously map the environment and localize itself within it is one of the most important capabilities of mobile robots. Many algorithms can be used to build up the SLAM process, and SLAM remains a developing area in robotics research. Robot Operating System (ROS) is one of the frameworks which provide multiple algorithm nodes to work with and provide a transmission layer to robots. Among the algorithms in extensive use are Hector SLAM, Gmapping, and Cartographer SLAM. This paper describes Google Cartographer, an open-source ROS-based simultaneous localization and mapping (SLAM) library. The algorithm was applied to create a map using laser and pose data from a 2D LIDAR that was placed on a mobile robot. The robot model is simulated in Gazebo and visualized in RViz. Our research work's primary goal is to obtain mapping through the Cartographer SLAM algorithm in a static indoor environment. From our research, it is shown that for indoor environments Cartographer is an applicable algorithm for generating 2D maps with a LIDAR mounted on a mobile robot because it uses both odometry and pose estimation. The algorithm was evaluated, and the constructed maps were compared against those of the SLAM algorithms provided with Turtlebot2 in the static indoor environment.

Keywords: SLAM, ROS, navigation, localization and mapping, Gazebo, RViz, Turtlebot2, SLAM algorithms, 2D indoor environment, Cartographer

Procedia PDF Downloads 142
1178 Development of a Web-Based Application for Intelligent Fertilizer Management in Rice Cultivation

Authors: Hao-Wei Fu, Chung-Feng Kao

Abstract:

In the era of rapid technological advancement, information technology (IT) has become integral to modern life, exerting significant influence across diverse sectors and serving as a catalyst for development in various industries. Within agriculture, the integration of IT offers substantial benefits, notably enhancing operational efficiency. Real-time monitoring systems, for instance, have been widely embraced in agriculture, effectively improving crop management practices. This study specifically addresses the management of rice panicle fertilizer, presenting the development of a web application tailored to handle data associated with rice panicle fertilizer management. Leveraging the normalized difference red edge index, this application optimizes the quantity of rice panicle fertilizer used, providing recommendations to agricultural stakeholders and service providers in the agricultural information sector. The overarching objective is to minimize costs while maximizing yields. Furthermore, a robust database system has been established to store and manage relevant data for future reference in rice cultivation management. Additionally, the study utilizes the Representational State Transfer software architectural style to construct an application programming interface (API), facilitating data creation, retrieval, updating, and deletion for users via the HyperText Transfer Protocol methods. Future plans involve integrating this API with third-party services to incorporate it into larger frameworks, thus catering to the diverse requirements of various third-party services.
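
A minimal sketch of the two technical pieces mentioned above is shown below: computing the normalized difference red edge (NDRE) index from NIR and red-edge reflectance, and exposing it through a small REST-style endpoint. Flask, the route, and the field names are assumptions for illustration, not the application's actual API.

```python
# Minimal sketch: NDRE = (NIR - RedEdge) / (NIR + RedEdge), served via a REST-style
# endpoint. Framework, route, and field names are illustrative assumptions.
from flask import Flask, request, jsonify

app = Flask(__name__)

def ndre(nir: float, red_edge: float) -> float:
    return (nir - red_edge) / (nir + red_edge)

@app.route("/ndre", methods=["POST"])
def ndre_endpoint():
    data = request.get_json()
    value = ndre(float(data["nir"]), float(data["red_edge"]))
    return jsonify({"ndre": value})   # downstream logic would map NDRE to a fertilizer rate

if __name__ == "__main__":
    app.run()
```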

Keywords: application programming interface, HyperText Transfer Protocol, nitrogen fertilizer intelligent management, web-based application

Procedia PDF Downloads 53
1177 Open-Loop Vector Control of Induction Motor with Space Vector Pulse Width Modulation Technique

Authors: Karchung, S. Ruangsinchaiwanich

Abstract:

This paper presents an open-loop vector control method for an induction motor with the space vector pulse width modulation (SVPWM) technique. Normally, closed-loop speed control is preferred and is believed to be more accurate. However, it requires a position sensor to track the rotor position, which is not desirable for certain workspace applications. This paper exhibits the performance of a three-phase induction motor with the simplest control algorithm, without the use of a position sensor or an estimation block to estimate the rotor position for sensorless control. The motor stator currents are measured and transformed to the synchronously rotating (d-q-axis) frame by use of the Clarke and Park transformations. The actual control happens in this frame, where the measured currents are compared with the reference currents. The error signal is fed to a conventional PI controller, and the corrected d-q voltage is generated. The controller outputs are transformed back to three-phase voltages and fed to the SVPWM block, which generates the PWM signals for the voltage source inverter. The open-loop vector control model along with the SVPWM algorithm is modeled in MATLAB/Simulink software and is experimentally validated on a TMS320F28335 DSP board.
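
The Clarke and Park transformations mentioned above map the measured three-phase stator currents into the rotating d-q frame; a small sketch of the standard amplitude-invariant forms is shown below (the rotor angle theta is an input here, however the control scheme obtains it).

```python
# Standard amplitude-invariant Clarke (abc -> alpha-beta) and Park (alpha-beta -> d-q)
# transformations used to bring measured stator currents into the rotating frame.
import math

def clarke(ia, ib, ic):
    i_alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (1.0 / math.sqrt(3)) * (ib - ic)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

# Example: balanced currents evaluated at a frame angle of 0.5 rad.
print(park(*clarke(1.0, -0.5, -0.5), theta=0.5))
```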

Keywords: electric drive, induction motor, open-loop vector control, space vector pulse width modulation technique

Procedia PDF Downloads 143
1176 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review

Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie

Abstract:

The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), mega-nucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective, and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 in sorghum improvement is particularly vital in the context of ecological, environmental, and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and disease, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and create new genetic variations. However, existing research is targeted at improving even further the effectiveness of CRISPR/Cas9 genome editing techniques to fruitfully edit endogenous sorghum genes. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The fruitful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, creating new and improved traits, regulating gene expression, and advancing sorghum functional genomics, but also in enabling site-specific integration events.

Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield

Procedia PDF Downloads 50
1175 Application of the EU Commission Waste Management Methodology Level(s) to a Construction and a Demolition in North-West Romania

Authors: Valean Maria

Abstract:

Construction and demolition waste management is a timely topic due to the urgency of its transition to sustainability. This sector is responsible for over a third of the waste generated in the E.U., while the legislation requires a proportion of at least 70% preparation for reuse and recycling, excluding backfilling. To this end, the E.U. Commission has provided the Level(s) methodology, allowing for the standardized planning and reporting of waste quantities across all levels of the construction process, from the architecture to the demolition, and from the estimation stage to the actual measurements at the end of the operations. We applied Level(s) for the first time to the Romanian context, a developing E.U. country in which illegal dumping of construction waste in nature and in landfills is still common practice. We performed a desk study of the buildings’ documents, followed by field studies of the sites, and finally the insertion and calculation of statistical data on the construction and demolition waste. We learned that Romania is far from the E.U. average in terms of the initial estimations of waste, with some numbers being higher and others lower, and that the price of disposal to landfills is significantly lower in the developing country, a possible barrier to adopting the new regulations. Finally, we found that concrete is the predominant waste type, in terms of quantity as well as cost of disposal. Further directions of research are provided, such as mapping out all of the alternative facilities in the region and calculating the financial costs and the CO2 footprint of preparing and delivering waste sustainably, for a more sound and locally adapted model of waste management.

Keywords: construction, waste, management, levels, EU

Procedia PDF Downloads 74
1174 Agricultural Water Consumption Estimation in the Helmand Basin

Authors: Mahdi Akbari, Ali Torabi Haghighi

Abstract:

The Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the greatest (>8500 km2) freshwater bodies on the Iranian plateau but have almost entirely desiccated over the last 20 years. The desiccation of the lakes caused dust storms in the region, which have huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased from 4 to 1.9 km3 downstream due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in-situ data in the basin, this research utilizes remote-sensing data to show how croplands, and consequently the water consumed in the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. Cropland in this basin has doubled since 1990, especially downstream of the Kajaki Dam (the biggest dam of the basin). Using PML V2 Actual Evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimated the consumed water (CW) for farming. We found that CW increased from 2.5 to over 7.5 km3 between 2002 and 2017 in this basin. Also, the annual average Potential Evapotranspiration (PET) of the basin has had a negative trend in recent years, although the AET over croplands has an increasing trend. In this research, using remote sensing data, we covered the lack of data in the studied area and highlighted the anthropogenic activities upstream which led to the lakes’ desiccation downstream.
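
A short sketch of the NDVI-threshold cropland masking and the consumed-water estimate CW = AET / irrigation efficiency described above is shown below; the raster arrays and pixel area are placeholders for the actual Landsat and PML V2 products.

```python
# Sketch: mask croplands with an NDVI threshold (~0.35-0.4) and estimate consumed
# water as CW = AET / irrigation efficiency (~0.3). Arrays and pixel area are placeholders.
import numpy as np

def consumed_water_km3(ndvi, aet_mm, pixel_area_km2, ndvi_threshold=0.38, efficiency=0.3):
    cropland = ndvi >= ndvi_threshold                # boolean cropland mask
    aet_km = aet_mm[cropland] / 1e6                  # mm of water depth -> km
    volume_km3 = np.sum(aet_km * pixel_area_km2)     # depth x area per cropland pixel
    return volume_km3 / efficiency                   # account for irrigation losses

ndvi = np.random.rand(100, 100)                      # placeholder NDVI raster
aet = np.random.rand(100, 100) * 1200.0              # placeholder annual AET (mm)
print(consumed_water_km3(ndvi, aet, pixel_area_km2=0.25))
```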

Keywords: Afghanistan-Iran transboundary Basin, Iran-Afghanistan water treaty, water use, lake desiccation

Procedia PDF Downloads 124
1173 Application of Bim Model Data to Estimate ROI for Robots and Automation in Construction Projects

Authors: Brian Romansky

Abstract:

There are many practical, commercially available robots and semi-autonomous systems that are currently available for use in a wide variety of construction tasks. Adoption of these technologies has the potential to reduce the time and cost to deliver a project, reduce variability and risk in delivery time, increase quality, and improve safety on the job site. These benefits come with a cost for equipment rental or contract fees, access to specialists to configure the system, and time needed for set-up and support of the machines while in use. Calculation of the net ROI (Return on Investment) requires detailed information about the geometry of the site, the volume of work to be done, the overall project schedule, as well as data on the capabilities and past performance of available robotic systems. Assembling the required data and comparing the ROI for several options is complex and tedious. Many project managers will only consider the use of a robot in targeted applications where the benefits are obvious, resulting in low levels of adoption of automation in the construction industry. This work demonstrates how data already resident in many BIM (Building Information Model) projects can be used to automate ROI estimation for a sample set of commercially available construction robots. Calculations account for set-up and operating time along with scheduling support tasks required while the automated technology is in use. Configuration parameters allow for prioritization of time, cost, or safety as the primary benefit of the technology. A path toward integration and use of automatic ROI calculation with a database of available robots in a BIM platform is described.
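
A hedged sketch of the ROI comparison described above is given below: a work quantity taken off the BIM model is combined with robot rate and cost data and the equivalent manual cost. Every field name and number is a placeholder, not a value from the paper or any specific robot vendor.

```python
# Sketch: ROI of a construction robot from a BIM quantity take-off.
# All field names and numbers are placeholders for illustration only.
from dataclasses import dataclass

@dataclass
class RobotOption:
    name: str
    setup_hours: float
    rate_per_hour: float      # units of work (e.g. m2 of layout) completed per hour
    cost_per_hour: float      # rental plus operator/specialist support
    mobilization_cost: float

def roi(quantity: float, manual_cost: float, r: RobotOption) -> float:
    hours = r.setup_hours + quantity / r.rate_per_hour
    robot_cost = r.mobilization_cost + hours * r.cost_per_hour
    return (manual_cost - robot_cost) / robot_cost

layout_bot = RobotOption("layout-robot", setup_hours=4, rate_per_hour=120,
                         cost_per_hour=95, mobilization_cost=1500)
print(roi(quantity=8000, manual_cost=14000, r=layout_bot))   # positive = net benefit
```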

Keywords: automation, BIM, robot, ROI

Procedia PDF Downloads 80
1172 Role of Kerala’s Diaspora Philanthropy Engagement During Economic Crises

Authors: Shibinu S, Mohamed Haseeb N

Abstract:

In times of crisis, the diaspora's role and the help it offers are seen to be vital in determining how many countries, particularly low- and middle-income nations that significantly rely on remittances, recover. Twenty-one lakh twenty thousand (2.12 million) Keralites have emigrated abroad, with 81.2 percent of these outflows going to the Gulf Cooperation Council (GCC) countries. Most of them are semi-skilled or low-skilled laborers employed in GCC nations. Additionally, a sizeable portion of migrants are employed in industrialized nations like the UK and the US. These nations have seen the development of a highly robust Indian diaspora. India's development is largely dependent on the generosity of its diaspora, and the nation has benefited greatly from the substantial contributions made by several emigrant generations. Its strength was noticeable during COVID-19 and the Kerala floods. Millions of people were displaced, millions of properties were damaged, and many people died as a result of the 2018 Kerala floods. The Malayalee diaspora played a crucial role in the reconstruction of Kerala by supporting the rescue efforts underway on the ground through their extensive worldwide network. An analogous outreach was also noted during COVID-19, when the diaspora assisted stranded migrants across the globe. Together with the work the diaspora has done for the state's development and recovery, there has also been a recent outpouring of assistance during the COVID-19 pandemic. The study focuses on the subtleties of diaspora philanthropy scholarship and how Kerala was able to recover from the COVID-19 pandemic and the floods thanks to it. Semi-structured in-depth interviews were conducted with migrants, migrant organizations, and beneficiaries of diaspora support, recruited through snowball sampling, to better understand the role that diaspora philanthropy plays in times of crisis.

Keywords: crises, diaspora, remittances, COVID-19, flood, economic development of Kerala

Procedia PDF Downloads 29