Search results for: using an Anisotropic Analytical Algorithm (AAA)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5874

1074 Bioethanol Production from Marine Algae Ulva Lactuca and Sargassum Swartzii: Saccharification and Process Optimization

Authors: M. Jerold, V. Sivasubramanian, A. George, B.S. Ashik, S. S. Kumar

Abstract:

Bioethanol is a sustainable biofuel that can be used as an alternative to fossil fuels. Today, third-generation (3G) biofuels are gaining more attention than first- and second-generation biofuels. The high lignin content of lignocellulosic biomass is the major drawback of second-generation biofuels. Algae are the renewable feedstock used in third-generation biofuel production. Algae contain large amounts of carbohydrates and can therefore be fermented after hydrolysis. There are two groups of algae: microalgae and macroalgae. In the present investigation, macroalgae were chosen as the raw material for the production of bioethanol. Two marine algae, Ulva lactuca and Sargassum swartzii, were used for the experimental studies. The algal biomass was characterized using various analytical techniques, such as elemental analysis, scanning electron microscopy, and Fourier transform infrared spectroscopy, to understand its physico-chemical characteristics. Batch experiments were carried out to study hydrolysis and operating parameters such as pH, agitation, fermentation time, and inoculum size. Saccharification was done with acid and alkali treatment. The experimental results showed that NaOH treatment enhanced bioethanol production. From the hydrolysis study, it was found that 0.5 M alkali treatment served as the optimum concentration for the saccharification of polysaccharide sugars to monomeric sugars. The maximum yield of bioethanol was attained at a fermentation time of 9 days. An inoculum volume of 1 mL was found to be the lowest effective volume for the ethanol fermentation. The agitation studies showed that fermentation was enhanced when the process was agitated. The percentage yield of bioethanol was found to be 22.752% and 14.23%. The elemental analysis showed that S. swartzii contains a higher carbon content. The results confirmed that hydrolysis was incomplete, so not all of the sugar was recovered from the biomass.
The specific gravity of the ethanol was found to be 0.8047 and 0.808 for Ulva lactuca and Sargassum swartzii, respectively. The purity of the bioethanol was also studied and found to be 92.55%. Therefore, marine algae can be regarded as a most promising renewable feedstock for the production of bioethanol.

Keywords: algae, biomass, bioethanol, biofuel, pretreatment

Procedia PDF Downloads 136
1073 An Analytical Study of the Quality of Educational Administration and Management At Secondary School Level in Punjab, Pakistan

Authors: Shamim Akhtar

Abstract:

The purpose of the present research was to analyse the performance of district educational administrators and school head teachers at the secondary school level. The sample of the study comprised head teachers and teachers of secondary schools. Three scales were used in the survey: two were for head teachers, a five-point scale for analysing the working efficiency of educational administrators and a seven-point scale for head teachers to rate their own performance, and a similar seven-point rating scale was for teachers to rate the working performance of their head teachers. The head teachers' responses revealed that the performance of their district educational administrators was average. For the performance efficiency of head teachers, the researcher constructed rating scales on seven parameters of management, namely academic management, personnel management, financial management, infrastructure management, linkage and interface, student services, and managerial excellence. Results of percentages, means, and graphical presentation on the different parameters of management showed an obvious difference between head teachers' and teachers' responses: head teachers were probably overestimating their efficiency, whereas teachers rated them as performing averagely on the majority of statements. Results of t-tests showed no significant difference between the responses of rural and urban teachers, but a significant difference between male and female teachers' responses indicated that female head teachers were performing their responsibilities better than male head teachers in public sector schools.
When the efficiency of head teachers on the different parameters of management was analysed, it was concluded that their efficiency in academic and personnel management was average, their efficiency in financial management and managerial excellence was well above average, and on the other parameters, namely infrastructure management, linkage and interface, and student services, it was above average on most statements and well above average on some. Hence there is a need to improve working efficiency in academic management and personnel management.

Keywords: educational administration, educational management, parameters of management, education

Procedia PDF Downloads 312
1072 Fault Detection and Isolation in Sensors and Actuators of Wind Turbines

Authors: Shahrokh Barati, Reza Ramezani

Abstract:

Owing to countries' growing attention to renewable energy production, the demand for energy from renewable sources has gone up; among renewable energy sources, wind energy has shown the fastest growth in recent years. In this regard, in order to increase the availability of wind turbines, the use of a Fault Detection and Isolation (FDI) system is necessary. Wind turbines are subject to various faults, such as sensor faults, actuator faults, network connection faults, mechanical faults, and faults in the generator subsystem. Although sensors and actuators account for a large number of faults in wind turbines, they have been discussed less in the literature. Therefore, in this work, we focus our attention on designing a sensor and actuator fault detection and isolation algorithm and a fault-tolerant control system (FTCS) for wind turbines. The aim of this research is to propose a comprehensive fault detection and isolation system for the sensors and actuators of wind turbines based on data-driven approaches. To achieve this goal, features of the measurable signals in a real wind turbine are extracted under all operating conditions. The next step is feature selection among the extracted features. Features that lead to maximum class separation are selected; classifiers are implemented in parallel, and their results are fused together. In order to maximize the reliability of the decision on a fault, the property of fault repeatability is used.
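The pipeline described above (separation-based feature selection followed by parallel classifiers with decision fusion) can be sketched roughly as follows; the Fisher-score criterion, the majority-vote fusion, and all names are illustrative assumptions, not details taken from the paper.

```python
import statistics

def fisher_score(healthy, faulty):
    """Class separation of one feature between healthy and faulty samples."""
    mh, mf = statistics.mean(healthy), statistics.mean(faulty)
    vh, vf = statistics.pvariance(healthy), statistics.pvariance(faulty)
    return (mh - mf) ** 2 / (vh + vf + 1e-12)

def select_features(x_healthy, x_faulty, k):
    """Keep the k feature indices with the largest class separation."""
    n_features = len(x_healthy[0])
    scores = [fisher_score([row[j] for row in x_healthy],
                           [row[j] for row in x_faulty])
              for j in range(n_features)]
    return sorted(range(n_features), key=lambda j: -scores[j])[:k]

def fuse(votes):
    """Majority-vote fusion of parallel classifier decisions (True = fault)."""
    return sum(votes) > len(votes) / 2

# Toy example: feature 0 separates the classes, feature 1 does not.
healthy = [[0.1, 5.0], [0.0, 5.1], [0.2, 4.9]]
faulty  = [[1.1, 5.0], [0.9, 5.1], [1.0, 4.9]]
print(select_features(healthy, faulty, 1))  # feature 0 is selected
print(fuse([True, True, False]))            # majority of classifiers says: fault
```

In a real system the fused decision would additionally be checked for repeatability over consecutive windows, as the abstract notes.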

Keywords: FDI, wind turbines, sensors and actuators faults, renewable energy

Procedia PDF Downloads 377
1071 Corruption, a Prelude to Problems of Governance in Pakistan

Authors: Umbreen Javaid

Abstract:

Pakistan’s experience with the nascent, yet-to-evolve democratic institutions inherited from the British Empire has not been a pleasant one when evaluated in terms of good governance, development, and the success of anti-corruption mechanisms. The country has remained entangled in a vicious circle of accumulating large budget deficits, a dwindling economy, low foreign direct investment, political instability, and rising terrorism. It is thus not surprising that accounts of the state's six-decade journey since her inception are replete with negative connotations like dysfunctional, failed, fragile, or weak state. The limited pool of experience in handling democratic institutions and the lack of political will on the part of the country's political elite to transform the society on democratic footings have left Pakistan a “limited access order” state. Widespread illiteracy becomes a double-edged sword when a largely illiterate electorate elects representatives who mostly come from a semi-educated background, with a limited understanding of democratic minutiae and little or no proclivity to resist monetary allures. The prevalent culture of patronage and widespread poverty, coupled with the absence of a comprehensive system for investigating, prosecuting, and adjudicating cases of corruption, encourage a practice that has been eroding the state's foundations since her inception, owing to the unwillingness of the traditional elites, who have strongly resisted any attempt aimed at dispersing power. An analytical study of the historical, political, cultural, economic, and administrative hurdles that have impeded Pakistan's transition to a democratic, accountable society would be instrumental in understanding the widespread plague of corruption and the state's inefficiency in coping with it effectively.
The issue of corruption in Pakistan becomes more important when seen in the context of her vulnerability to terrorism and religious extremism. In this regard, Pakistan needs to learn a lot from developed countries in order to evolve a comprehensive strategy for combating and preventing this pressing issue.

Keywords: Pakistan, corruption, anti-corruption, limited access order

Procedia PDF Downloads 281
1070 Advancing Environmental Remediation Through the Production of Functional Porous Materials from Phosphorite Residue Tailings

Authors: Ali Mohammed Yimer, Ayalew Assen, Youssef Belmabkhout

Abstract:

Environmental remediation is a pressing global concern, necessitating innovative strategies to address the challenges posed by industrial waste and pollution. This study aims to advance environmental remediation by developing cutting-edge functional porous materials from phosphorite residue tailings. Phosphorite mining activities generate vast amounts of waste, which poses significant environmental risks due to its contaminants. The proposed approach involved transforming these phosphorite residue tailings into valuable porous materials through a series of physico-chemical processes, including milling, acid-base leaching, designing or templating, and forming. The key components of the tailings were extracted and processed to produce porous arrays with high surface area and porosity. These materials were engineered to possess specific properties suitable for environmental remediation applications, such as enhanced adsorption capacity and selectivity for target contaminants. The synthesized porous materials were thoroughly characterized using advanced analytical techniques (XRD, SEM-EDX, N2 sorption, TGA, FTIR) to assess their structural, morphological, and chemical properties. The performance of the materials in removing various pollutants, including heavy metals and organic compounds, was evaluated through batch adsorption experiments. Additionally, the potential for material regeneration and reusability was investigated to enhance the sustainability of the proposed remediation approach. The outcomes of this research hold significant promise for addressing the environmental challenges associated with phosphorite residue tailings. By valorizing these waste materials into porous materials with exceptional remediation capabilities, this study contributes to the development of sustainable and cost-effective solutions for environmental cleanup.
Furthermore, the utilization of phosphorite residue tailings in this manner offers a potential avenue for the remediation of other contaminated sites, thereby fostering a circular economy approach to waste management.
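Batch adsorption results like those described above are conventionally reduced to an equilibrium uptake and a removal percentage; the following is a minimal sketch of that standard arithmetic (the variable names and sample numbers are illustrative, not data from the study).

```python
def adsorption_uptake(c0, ce, volume_l, mass_g):
    """Equilibrium uptake q_e in mg/g from initial and final
    concentrations (mg/L), solution volume (L), and adsorbent mass (g)."""
    return (c0 - ce) * volume_l / mass_g

def removal_percent(c0, ce):
    """Percentage of the pollutant removed from solution."""
    return 100.0 * (c0 - ce) / c0

# Illustrative numbers: 100 mg/L reduced to 20 mg/L in 0.1 L over 0.5 g.
print(adsorption_uptake(100.0, 20.0, 0.1, 0.5))  # 16.0 mg/g
print(removal_percent(100.0, 20.0))              # 80.0 %
```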

Keywords: functional porous materials, phosphorite residue tailings, adsorption, environmental remediation, sustainable solutions

Procedia PDF Downloads 34
1069 Deep Reinforcement Learning Model for Autonomous Driving

Authors: Boumaraf Malak

Abstract:

The development of intelligent transportation systems (ITS) and artificial intelligence (AI) is paving the way for the widespread adoption of autonomous vehicles (AVs). This opens up new opportunities for smart roads, smart traffic safety, and mobility comfort. A highly intelligent decision-making system is essential for autonomous driving around dense, dynamic objects. It must be able to handle complex road geometry and topology, as well as complex multi-agent interactions, and closely follow higher-level commands such as routing information. Autonomous vehicles have become a very hot research topic in recent years due to their significant potential to reduce traffic accidents and personal injuries. New artificial-intelligence-based technologies handle important functions in scene understanding, motion planning, decision making, vehicle control, social behavior, and communication for AVs. This paper focuses only on deep-reinforcement-learning-based methods; it does not include traditional (flat) planning techniques, which have been the subject of extensive research in the past, because reinforcement learning (RL) has become a powerful learning framework that is now capable of learning complex policies in high-dimensional environments. The DRL algorithms used so far have found solutions to the four main problems of autonomous driving; in our paper, we highlight the challenges and point to possible future research directions.
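As a minimal illustration of the RL framework the survey builds on (tabular Q-learning, not any of the deep methods it discusses), a toy lane-keeping task might look like this; the state/action encoding and reward are entirely invented for the example.

```python
import random

# Toy lane-keeping task: states are lane offsets -2..2, actions steer
# left (-1), straight (0), or right (+1); staying at offset 0 is rewarded.
STATES = [-2, -1, 0, 1, 2]
ACTIONS = [-1, 0, 1]

def step(state, action):
    nxt = max(-2, min(2, state + action))
    reward = 1.0 if nxt == 0 else -abs(nxt)
    return nxt, reward

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)
        for _ in range(10):
            a = rng.choice(ACTIONS) if rng.random() < eps else \
                max(ACTIONS, key=lambda x: q[(s, x)])
            nxt, r = step(s, a)
            best_next = max(q[(nxt, x)] for x in ACTIONS)
            # Standard Q-learning update rule.
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = nxt
    return q

q = train()
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
print(policy)  # learned policy steers back toward the lane centre
```

Deep RL replaces the table `q` with a neural network so the same update generalizes to the high-dimensional sensor inputs a real AV sees.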

Keywords: deep reinforcement learning, autonomous driving, deep deterministic policy gradient, deep Q-learning

Procedia PDF Downloads 55
1068 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure-time process and time-variant predictors. A common assumption of conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are only indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that deals with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of the latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.
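Schematically, the three submodels described above can be written as follows; the notation is a generic reconstruction of such a model, with symbols assumed rather than taken from the paper:

```latex
% Part 1 -- measurement model: observed indicators load on latent factors over time
y_{ij}(t) = \mu_j + \boldsymbol{\lambda}_j^{\top} \boldsymbol{\xi}_i(t) + \epsilon_{ij}(t)

% Part 2 -- trajectory model: random-coefficient growth curves for the latent factors
\boldsymbol{\xi}_i(t) = \mathbf{b}_{i0} + \mathbf{b}_{i1}\, t, \qquad
\mathbf{b}_i \sim N(\boldsymbol{\beta}, \Sigma_b)

% Part 3 -- proportional hazards model: hazard driven by covariates and latent trajectories
h_i(t) = h_0(t) \exp\!\left\{ \boldsymbol{\gamma}^{\top} \mathbf{x}_i
  + \boldsymbol{\alpha}^{\top} \boldsymbol{\xi}_i(t) \right\}
```

Here subject $i$'s latent factors $\boldsymbol{\xi}_i(t)$ enter both the measurement model and the hazard, which is what ties the longitudinal and time-to-event parts together.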

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 117
1067 Documenting the 15th Century Prints with RTI

Authors: Peter Fornaro, Lothar Schmitt

Abstract:

The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. 
At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.
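The simplicity the abstract refers to can be made concrete with the common Polynomial Texture Mapping (PTM) variant of RTI, which models per-pixel luminance as a biquadratic function of the light direction; this is the standard PTM formulation, not a model specific to the project described here:

```latex
L(u, v;\, l_u, l_v) = a_0(u,v)\, l_u^2 + a_1(u,v)\, l_v^2
  + a_2(u,v)\, l_u l_v + a_3(u,v)\, l_u + a_4(u,v)\, l_v + a_5(u,v)
```

where $(l_u, l_v)$ are the projected light-direction components and the six coefficients per pixel are fitted by least squares from the captured image stack. Because this low-order polynomial varies smoothly with light direction, it tends to average away exactly the sharp specular highlights that characterize the glossy paste prints, which motivates the enhanced reflectance model mentioned above.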

Keywords: art history, computational photography, paste prints, reflectance transformation imaging

Procedia PDF Downloads 260
1066 Detecting Natural Fractures and Modeling Them to Optimize Field Development Plan in Libyan Deep Sandstone Reservoir (Case Study)

Authors: Tarek Duzan

Abstract:

Fractures are a fundamental property of most reservoirs. Despite their abundance, they remain difficult to detect and quantify. The most effective characterization of fractured reservoirs is accomplished by integrating geological, geophysical, and engineering data. Detecting fractures and defining their relative contribution is crucial in the early stages of exploration and later in the production of any field, because fractures can completely change our thinking, efforts, and planning for producing a specific field properly. From the structural point of view, all reservoirs are fractured to some extent. The North Gialo field is thought to be a naturally fractured reservoir to some extent. Historically, naturally fractured reservoirs have been more complicated in terms of their exploration and production efforts, and most geologists tend to deny the presence of fractures as an effective variable. Our aim in this paper is to determine the degree of fracturing so that our evaluation and planning can be done properly and efficiently from day one. The challenging part of this field is that there are not enough data and no straightforward well tests to make us completely comfortable with the idea of fracturing; however, we cannot ignore the fractures completely. Logging images, available well tests, and limited core studies are our tools at this stage to evaluate, model, and predict possible fracture effects in this reservoir. The aims of this study are both fundamental and practical: to improve the prediction and diagnosis of natural-fracture attributes in the North Gialo hydrocarbon reservoirs and to accurately simulate their influence on production. Moreover, the production of this field follows a two-phase plan: self-depletion of oil, followed by a gas injection period for pressure maintenance and an increased ultimate recovery factor. Therefore, a good understanding of the fracture network is essential before proceeding with the targeted plan.
New analytical methods will lead to a more realistic characterization of fractured and faulted reservoir rocks. These methods will produce data that can enhance well-test and seismic interpretations and that can readily be used in reservoir simulators.

Keywords: natural fracture, sandstone reservoir, geological, geophysical, and engineering data

Procedia PDF Downloads 75
1065 Teaching Tools for Web Processing Services

Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr

Abstract:

Web Processing Services (WPS) are of growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, a Training Tools Collection was therefore initiated at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms can be used, such as IDW, nearest neighbor, etc. The Tools Collection aims to support understanding of the scope, definition, and deployment of Web Processing Services. For example, it is necessary to characterize the input to the interpolation by the data set, the parameters for the algorithm, and the interpolation results (here a grid of interpolated values is assumed). This paper reports on first experiences using a pilot installation. This was intended to find suitable software interfaces for later full implementations and to draw conclusions about potential user interface characteristics. Experiences were made with the Deegree software, one of several service suites (collections). Being written entirely in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit to the project. The mentioned parameters for a WPS were formalized following the paradigm that any meaningful component will be defined in terms of suitable standards; e.g., the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free but is partially determined by the selected WPS processing suite.
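The IDW (inverse distance weighting) interpolation mentioned above can be sketched in a few lines; this is the textbook formulation, not code from the Tools Collection:

```python
import math

def idw(samples, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from
    `samples`, a list of ((x, y), value) pairs."""
    num = den = 0.0
    for (x, y), z in samples:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0.0:            # query coincides with a sample point
            return z
        w = 1.0 / d ** power    # closer samples get larger weights
        num += w * z
        den += w
    return num / den

pts = [((0.0, 0.0), 10.0), ((1.0, 0.0), 20.0)]
print(idw(pts, (0.5, 0.0)))  # midpoint between equal-weight samples: 15.0
print(idw(pts, (0.0, 0.0)))  # exact sample location: 10.0
```

A WPS would expose such a function behind a standardized Execute request, with the sample data, the `power` parameter, and the output grid (e.g., as GML) each described by a suitable standard, which is precisely the characterization exercise the Tools Collection targets.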

Keywords: deegree, interpolation, IDW, web processing service (WPS)

Procedia PDF Downloads 332
1064 Multi-Objective Evolutionary Computation Based Feature Selection Applied to Behaviour Assessment of Children

Authors: F. Jiménez, R. Jódar, M. Martín, G. Sánchez, G. Sciavicco

Abstract:

Attribute or feature selection is one of the basic strategies for improving the performance of data classification tasks and, at the same time, reducing the complexity of classifiers, and it is particularly fundamental when the number of attributes is relatively high. Its application to unsupervised classification is restricted to a limited number of experiments in the literature. Evolutionary computation has already proven to be a very effective choice for consistently reducing the number of attributes towards a better classification rate and a simpler semantic interpretation of the inferred classifiers. We present a feature selection wrapper model composed of a multi-objective evolutionary algorithm, the clustering method Expectation-Maximization (EM), and the classifier C4.5, for the unsupervised classification of data extracted from a psychological test named BASC-II (Behavior Assessment System for Children, 2nd ed.), with two objectives: maximizing the likelihood of the clustering model and maximizing the accuracy of the obtained classifier. We present a methodology that integrates feature selection for unsupervised classification, model evaluation, decision making (choosing the most satisfactory model according to an a posteriori process in a multi-objective context), and testing. We compare the performance of the classifiers obtained by the multi-objective evolutionary algorithms ENORA and NSGA-II, and the best solution is then validated by the psychologists who collected the data.
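In the bi-objective setting described (clustering likelihood vs. classifier accuracy, both maximized), algorithms like ENORA and NSGA-II rank candidate feature subsets by Pareto dominance; a minimal sketch of that core notion (illustrative only, not the authors' implementation):

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (maximization)."""
    return (all(x >= y for x, y in zip(a, b)) and
            any(x > y for x, y in zip(a, b)))

def pareto_front(points):
    """Non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Each tuple: (clustering log-likelihood, classifier accuracy) scored for
# one candidate feature subset; the numbers are made up for the example.
cands = [(-120.0, 0.81), (-110.0, 0.78), (-130.0, 0.85), (-110.0, 0.70)]
print(pareto_front(cands))  # the mutually non-dominated trade-offs survive
```

The a posteriori decision-making step then picks one solution from this front, as the methodology above describes.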

Keywords: evolutionary computation, feature selection, classification, clustering

Procedia PDF Downloads 342
1063 Using Deep Learning Real-Time Object Detection Convolution Neural Networks for Fast Fruit Recognition in the Tree

Authors: K. Bresilla, L. Manfrini, B. Morandi, A. Boini, G. Perulli, L. C. Grappadelli

Abstract:

Image and video processing for fruit in the tree using hard-coded feature extraction algorithms has shown high accuracy in recent years. While accurate, these approaches, even with high-end hardware, are computationally intensive and too slow for real-time systems. This paper details the use of deep convolutional neural networks (CNNs), specifically the YOLO (You Only Look Once) algorithm with 24+2 convolution layers. Using deep-learning techniques eliminated the need to hard-code specific features for specific fruit shapes, colors, and/or other attributes. The CNN was trained on more than 5000 images of apple and pear fruits on a 960-core GPU (graphics processing unit). The test set showed an accuracy of 90%. After this, the trained model was transferred to an embedded device (Raspberry Pi 3) with a camera for more portability. Based on the correlation between the number of visible or detected fruits in one frame and the real number of fruits on one tree, a model was created to accommodate this error rate. The processing and detection speed of the whole platform was higher than 40 frames per second, which is fast enough for any grasping/harvesting robotic arm or other real-time applications.
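The frame-count-to-tree-count correction described above amounts to fitting a simple regression from detected fruits per frame to actual fruits per tree; a least-squares sketch, where both the linear form of the model and the calibration numbers are assumptions for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y ≈ a * x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Detected fruits per frame vs. hand-counted fruits on the tree
# (made-up calibration data: detectors miss occluded fruit, so a > 1).
detected = [30.0, 45.0, 60.0, 80.0]
actual   = [50.0, 72.5, 95.0, 125.0]
a, b = fit_line(detected, actual)
print(a, b)  # slope and intercept of the correction model
```

At run time the robot would apply `a * detected + b` to each frame's count to estimate the true per-tree yield.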

Keywords: artificial intelligence, computer vision, deep learning, fruit recognition, harvesting robot, precision agriculture

Procedia PDF Downloads 389
1062 Using Machine Learning to Classify Human Fetal Health and Analyze Feature Importance

Authors: Yash Bingi, Yiqiao Yin

Abstract:

The reduction of child mortality is an ongoing struggle and a commonly used measure of progress in the medical field. The number of under-5 deaths is around 5 million worldwide, with many of the deaths being preventable. In light of this issue, cardiotocograms (CTGs) have emerged as a leading tool for determining fetal health. By using ultrasound pulses and reading the responses, CTGs help healthcare professionals assess the overall health of the fetus and determine the risk of child mortality. However, interpreting the results of CTGs is time-consuming and inefficient, especially in underdeveloped areas where an expert obstetrician is hard to come by. Using a support vector machine (SVM) and oversampling, this paper proposes a model that classifies fetal health with an accuracy of 99.59%. To further explain the CTG measurements, an algorithm based on Randomized Input Sampling for Explanation (RISE) of black-box models was created, called Feature Alteration for explanation of Black Box Models (FAB), and its findings were compared to Shapley Additive Explanations (SHAP) and Local Interpretable Model-Agnostic Explanations (LIME). This allows doctors and medical professionals to classify fetal health with high accuracy and determine which features were most influential in the process.
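The general idea behind perturbation-based explanations such as RISE, and, going by the abstract's description, FAB, is to alter input features and observe the change in the model's output; a toy sketch of that idea, using an invented stand-in model rather than the paper's SVM:

```python
def feature_importance(predict, x, baseline):
    """Importance of each feature: |output change| when that feature
    is replaced by its baseline value (a crude perturbation score)."""
    ref = predict(x)
    scores = []
    for i in range(len(x)):
        altered = list(x)
        altered[i] = baseline[i]
        scores.append(abs(predict(altered) - ref))
    return scores

# Invented stand-in model: only the first two features matter.
predict = lambda v: 3.0 * v[0] - 2.0 * v[1] + 0.0 * v[2]
x = [1.0, 1.0, 1.0]
baseline = [0.0, 0.0, 0.0]
print(feature_importance(predict, x, baseline))  # [3.0, 2.0, 0.0]
```

The real methods are more careful (random masks in RISE, local surrogates in LIME, game-theoretic attribution in SHAP), but all score features by how predictions change when inputs are perturbed.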

Keywords: machine learning, fetal health, gradient boosting, support vector machine, Shapley values, local interpretable model agnostic explanations

Procedia PDF Downloads 121
1061 Simulation of the FDA Centrifugal Blood Pump Using High Performance Computing

Authors: Mehdi Behbahani, Sebastian Rible, Charles Moulinec, Yvan Fournier, Mike Nicolai, Paolo Crosetto

Abstract:

Computational fluid dynamics (CFD) blood-flow simulations are increasingly used to develop and validate blood-contacting medical devices. This study shows that numerical simulations can provide additional and accurate estimates of relevant hemodynamic indicators (e.g., recirculation zones or wall shear stresses), which may be difficult and expensive to obtain from in-vivo or in-vitro experiments. The most recent FDA (Food and Drug Administration) benchmark consists of a simplified centrifugal blood pump model that contains fluid flow features as they are commonly found in these devices, with a clear focus on highly turbulent phenomena. The FDA centrifugal blood pump study is composed of six test cases with different volumetric flow rates, ranging from 2.5 to 7.0 liters per minute, pump speeds, and Reynolds numbers ranging from 210,000 to 293,000. Within the frame of this study, different turbulence models were tested, including RANS models (e.g., k-omega, k-epsilon, and a Reynolds stress model (RSM)) and LES. The partitioners Hilbert, METIS, ParMETIS, and SCOTCH were used to create an unstructured mesh of 76 million elements and were compared in their efficiency. Computations were performed on the JUQUEEN BG/Q architecture applying the highly parallel flow solver Code_Saturne, typically using 32768 or more processors in parallel. Visualisations were performed by means of ParaView. Different turbulence models, including all six flow situations, could be successfully analysed and validated against analytical considerations and by comparison to other databases. It was shown that an RSM represents an appropriate choice for modeling high-Reynolds-number flow cases. In particular, the Rij-SSG (Speziale, Sarkar, Gatski) variant turned out to be a good approach. Visualisation of complex flow features could be obtained, and the flow situation inside the pump could be characterized.
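For a centrifugal pump, Reynolds numbers of the order quoted above are conventionally built from the impeller's rotation; a small sketch of one standard definition, with assumed blood-analog properties and an assumed rotor radius (none of the exact benchmark values are reproduced here):

```python
import math

def rotational_reynolds(rho, mu, rpm, radius_m):
    """Rotational Reynolds number Re = rho * omega * R^2 / mu,
    a standard choice for rotating machinery."""
    omega = rpm * 2.0 * math.pi / 60.0   # shaft speed in rad/s
    return rho * omega * radius_m ** 2 / mu

# Assumed values: a blood analog (rho ~ 1056 kg/m^3, mu ~ 3.5 mPa*s)
# and an illustrative 3 cm rotor radius at 3500 rpm.
print(rotational_reynolds(1056.0, 3.5e-3, 3500.0, 0.03))
```

Such back-of-the-envelope estimates are what place the benchmark firmly in the turbulent regime and motivate the RANS/LES comparison above.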

Keywords: blood flow, centrifugal blood pump, high performance computing, scalability, turbulence

Procedia PDF Downloads 365
1060 Air Pollutants Exposure and Blood High Sensitivity C-Reactive Protein Concentrations in Healthy Pregnant Women

Authors: Gwo-Hwa Wan, Tai-Ho Hung, Fen-Fang Chung, Wan-Ying Lee, Hui-Ching Yang

Abstract:

Air pollutant exposure results in elevated concentrations of oxidative stress and inflammatory biomarkers in the general population. Increased concentrations of inflammatory biomarkers in pregnant women are associated with preterm labor and low birth weight. To the best of our knowledge, the associations between air pollutant exposure and inflammation in pregnant women and fetuses are unknown, as are their effects on fetal growth. This study aimed to evaluate the influence of outdoor air pollutants in northern Taiwan on the concentration of an inflammatory biomarker (high-sensitivity C-reactive protein, hs-CRP) in the blood of healthy pregnant women, and how the biomarker impacts fetal growth. In this study, 38 healthy pregnant women who were in their first trimester and live in northern Taiwan were recruited from the Taipei Chang Gung Memorial Hospital. Personal characteristics and prenatal examination data (e.g., blood pressure) were obtained from the recruited subjects. The concentration of the inflammatory mediator hs-CRP in the blood of the healthy pregnant women was analyzed. Additionally, hourly air pollutant (PM10, SO2, NO2, O3, CO) concentration data were obtained from air quality monitoring stations in the Taipei area established by the Taiwan Environmental Protection Administration. Lag 0 and lag 0-1 are defined as the exposure to air pollutants on the day of blood withdrawal and the average exposure to air pollutants one day before and on the day of blood withdrawal, respectively. The statistical analyses were conducted using SPSS software version 22.0 (SPSS, Inc., Chicago, IL, USA). The results indicate that the healthy pregnant women were aged between 28 and 42 years. The body mass index before pregnancy averaged 21.51 (sd = 2.51) kg/m2. Around 90% of the pregnant women had never smoked, and 28.95% of them had allergic diseases.
Approximately 84% and 5.26% of the pregnant women worked in indoor and outdoor environments, respectively. The mean hematocrit level of the pregnant women was 37.10%, and their hemoglobin levels ranged between 10.1 and 14.7 g/dL, with a mean value of 12.47 g/dL. The blood hs-CRP concentrations of the healthy pregnant women in the first trimester ranged between 0.32 and 32.5 mg/L, with a mean value of 2.83 (sd = 5.69) mg/L. The blood hs-CRP concentrations were positively associated with ozone concentrations at lag 0-14 (r = 0.481, p = 0.017) in the healthy pregnant women. Significant lag effects were identified for ozone at lag 0-14, with a positive excess concentration of blood hs-CRP.
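The lag-averaged exposure and the correlation reported above rest on two simple computations, sketched here with made-up numbers (the study's actual values are only those quoted in the abstract):

```python
import math
import statistics

def lag_average(daily, lag_days):
    """Mean exposure over the sampling day plus `lag_days` prior days,
    given `daily` ordered oldest to newest (last entry = sampling day)."""
    return statistics.mean(daily[-(lag_days + 1):])

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

ozone = [28.0, 35.0, 31.0]          # made-up daily mean ozone, ppb
print(lag_average(ozone, 1))        # lag 0-1 average: 33.0
print(pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # exactly linear: 1.0
```

Pairing each woman's lag-averaged pollutant exposure with her hs-CRP value and correlating the two series is the form of analysis behind the reported r = 0.481.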

Keywords: air pollutant, hs-CRP, pregnant woman, ozone, first trimester

Procedia PDF Downloads 236
1059 Promoting Effective Institutional Governance in Cameroon Higher Education: A Governance Equalizer Perspective

Authors: Jean Patrick Mve

Abstract:

The increasing quest for efficiency, accountability, and transparency has led to the implementation of massive governance reforms among higher education systems worldwide. This is causing many changes in the governance of higher education institutions. Governments around the world are trying to adopt business-like organizational strategies to enhance the performance of higher education institutions. This study explores the changes that have taken place in the Cameroonian higher education sector. It also attempts to draw a picture of the likely future of higher education governance and the actions to be taken to promote institutional effectiveness among higher education institutions. The “governance equalizer” is used as an analytical tool to this end. It covers the five dimensions of New Public Management (NPM): state regulation, stakeholder guidance, academic self-governance, managerial self-governance, and competition. Qualitative data are used, including semi-structured interviews with key informants at the organizational level and other academic stakeholders, as well as documents and archival data from the university and from the ministry of higher education. 
It has been found that state regulation of higher education institutions in Cameroon is excessively high, leaving institutional autonomy very low, especially in financial management, staffing and promotion, and other internal administrative affairs. At the level of stakeholder guidance, there is a high degree of stakeholder consideration in the academic and research activities of universities, though the government remains keen to keep its hands in most management activities. Academic self-governance is also very weak, as the assignment of academics is based more on political considerations than on competence. There is no real managerial self-governance among higher education institutions, owing to the lack of institutional capacity and insufficient autonomy in decision making. There is a plan to promote competition among universities, but a real competitive environment has not yet been put in place. The study concludes that government policy should relax state control and concentrate on steering and supervision. In addition, real institutional autonomy, professional competence building for top management, and stakeholder participation should be considered to guarantee competition and institutional effectiveness.

Keywords: Cameroon higher education, effective institutional governance, governance equalizer, institutional autonomy, institutional effectiveness

Procedia PDF Downloads 125
1058 Use of Coconut Shell as a Replacement of Normal Aggregates in Rigid Pavements

Authors: Prakash Parasivamurthy, Vivek Rama Das, Ravikant Talluri, Veena Jawali

Abstract:

India ranks third in coconut production, after Indonesia and the Philippines. About 92% of the country's total production comes from four southern states: Kerala (45.22%), Tamil Nadu (26.56%), Karnataka (10.85%), and Andhra Pradesh (8.93%). Other states, such as Goa, Maharashtra, Odisha, West Bengal, and those in the northeast (Tripura and Assam), account for the remaining 8.44%. The use of coconut shell as coarse aggregate in concrete has never been a usual practice in the industry, particularly in areas where lightweight concrete is required for non-load-bearing walls, non-structural floors, and strip footings. The high cost of conventional building materials is a major factor affecting construction delivery in India. In India, where abundant agricultural and industrial wastes are discharged, these wastes can be used as potential replacement materials in the construction industry. This offers a double advantage: a reduction in the cost of construction materials and a means of waste disposal. Therefore, an attempt has been made in this study to utilize coconut shell (CS) as coarse aggregate in rigid pavement. The study began with the characterization of materials through basic material testing. The cast moulds were cured, and tests were conducted on the hardened concrete. The procedure continued with the determination of fck (characteristic strength), E (modulus of elasticity), and µ (Poisson's ratio) from the test results obtained. For the analytical studies, the rigid pavement was modeled in the KENPAVE software, finite element software developed specifically for road pavements, and the design of the rigid pavement was simultaneously carried out to Indian standards. Results show that the physical properties of CSAC (coconut shell aggregate concrete) with 10% replacement give the best results. The flexural strength of CSAC is found to increase by 4.25% as compared to the control concrete. 
About 13% reduction in pavement thickness is observed at the optimum coconut shell replacement.

Keywords: coconut shell, rigid pavement, modulus of elasticity, Poisson ratio

Procedia PDF Downloads 213
1057 Geospatial Analysis for Predicting Sinkhole Susceptibility in Greene County, Missouri

Authors: Shishay Kidanu, Abdullah Alhaj

Abstract:

Sinkholes in the karst terrain of Greene County, Missouri, pose significant geohazards, imposing challenges on construction and infrastructure development, with potential threats to lives and property. To address these issues, understanding the influencing factors and modeling sinkhole susceptibility is crucial for effective mitigation through strategic changes in land use planning and practices. This study utilizes geographic information system (GIS) software to collect and process diverse data, including topographic, geologic, hydrogeologic, and anthropogenic information. Nine key sinkhole influencing factors, ranging from slope characteristics to proximity to geological structures, were carefully analyzed. The Frequency Ratio method establishes relationships between the attribute classes of these factors and sinkhole events, deriving class weights that indicate their relative importance. Weighted integration of these factors is accomplished using the Analytic Hierarchy Process (AHP) and the Weighted Linear Combination (WLC) method in a GIS environment, resulting in a comprehensive sinkhole susceptibility index (SSI) model for the study area. Employing the Jenks natural breaks classification method, the SSI values are categorized into five distinct sinkhole susceptibility zones: very low, low, moderate, high, and very high. Validation of the model, conducted through the Area Under Curve (AUC) and Sinkhole Density Index (SDI) methods, demonstrates a robust correlation with the sinkhole inventory data. The prediction rate curve yields an AUC value of 74%, indicating 74% prediction accuracy. The SDI result further supports the success of the sinkhole susceptibility model. This model offers reliable predictions for the future distribution of sinkholes, providing valuable insights for planners and engineers in the formulation of development plans and land-use strategies. 
Its application extends to enhancing preparedness and minimizing the impact of sinkhole-related geohazards on both infrastructure and the community.
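The frequency ratio calculation and the weighted combination described above can be sketched as follows; the raster layers, class counts, and AHP weights below are invented for illustration (the study integrates nine factors, only one of which is shown in the final combination):

```python
import numpy as np

# Hypothetical raster layers for illustration: a factor map (e.g. slope
# classes 0-2) and a binary sinkhole inventory on the same grid.
rng = np.random.default_rng(1)
factor = rng.integers(0, 3, size=(200, 200))   # attribute classes
sinkholes = rng.random((200, 200)) < 0.01      # inventory cells

def frequency_ratio(factor_map, event_map):
    """FR per class = (% of events in class) / (% of area in class)."""
    ratios = {}
    total_events = event_map.sum()
    total_cells = factor_map.size
    for cls in np.unique(factor_map):
        in_class = factor_map == cls
        pct_events = event_map[in_class].sum() / total_events
        pct_area = in_class.sum() / total_cells
        ratios[int(cls)] = pct_events / pct_area
    return ratios

fr = frequency_ratio(factor, sinkholes)

# Weighted linear combination: sum over factors of AHP weight x FR-coded
# factor map; only one assumed factor ("slope") is shown here.
ahp_weights = {"slope": 0.4, "depth_to_bedrock": 0.6}
fr_map = np.vectorize(fr.get)(factor)
ssi = ahp_weights["slope"] * fr_map
print({k: round(v, 2) for k, v in fr.items()})
```

An FR above 1 marks a class in which sinkholes are over-represented relative to its share of the area.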

Keywords: sinkhole, GIS, analytical hierarchy process, frequency ratio, susceptibility, Missouri

Procedia PDF Downloads 50
1056 Multimodal Content: Fostering Students’ Language and Communication Competences

Authors: Victoria L. Malakhova

Abstract:

The research is devoted to multimodal content and its effectiveness in developing students’ linguistic and intercultural communicative competences as an indispensable constituent of their future professional activity. The study’s relevance lies in its description of multimodal content as both a linguistic and a didactic phenomenon. The objective of the article is to analyze creolized texts and the effect they have on fostering higher education students’ skills and productivity. The main methods used are linguistic text analysis, qualitative and quantitative methods, deduction, and generalization. The author studies texts with full and partial creolization, their features, and their role in composing multimodal textual space. The main verbal and non-verbal markers and paralinguistic means that enhance the linguo-pragmatic potential of creolized texts are covered. To reveal the efficiency of applying multimodal content in English teaching, the author conducts an experiment among both undergraduate students and teachers. This makes it possible to specify the main functions of creolized texts in the process of language learning, detect ways of enhancing students’ competences, and increase their motivation. The described stages of using creolized texts can serve as an algorithm for working with multimodal content in teaching English as a foreign language. The findings contribute to improving the efficiency of the academic process.

Keywords: creolized text, English language learning, higher education, language and communication competences, multimodal content

Procedia PDF Downloads 93
1055 Exploring Public Opinions Toward the Use of Generative Artificial Intelligence Chatbot in Higher Education: An Insight from Topic Modelling and Sentiment Analysis

Authors: Samer Muthana Sarsam, Abdul Samad Shibghatullah, Chit Su Mon, Abd Aziz Alias, Hosam Al-Samarraie

Abstract:

Generative Artificial Intelligence chatbots (GAI chatbots) have emerged as promising tools in various domains, including higher education. However, their specific role within the educational context and the level of legal support for their implementation remain unclear. Therefore, this study aims to investigate the role of Bard, a newly developed GAI chatbot, in higher education. To achieve this objective, English tweets were collected from Twitter's free streaming Application Programming Interface (API). The Latent Dirichlet Allocation (LDA) algorithm was applied to extract latent topics from the collected tweets. User sentiments, including disgust, surprise, sadness, anger, fear, joy, anticipation, and trust, as well as positive and negative sentiments, were extracted using the NRC Affect Intensity Lexicon and SentiStrength tools. This study explored the benefits, challenges, and future implications of integrating GAI chatbots in higher education. The findings shed light on the potential power of such tools, exemplified by Bard, in enhancing the learning process and providing support to students throughout their educational journey.

Keywords: generative artificial intelligence chatbots, bard, higher education, topic modelling, sentiment analysis

Procedia PDF Downloads 53
1054 Extraction and Quantification of Triclosan in Wastewater Samples Using Molecularly Imprinted Membrane Adsorbent

Authors: Siyabonga Aubrey Mhlongo, Linda Lunga Sibali, Phumlane Selby Mdluli, Peter Papoh Ndibewu, Kholofelo Clifford Malematja

Abstract:

This paper reports on the successful extraction and quantification of triclosan (C₁₂H₇Cl₃O₂), an antibacterial and antifungal agent present in some consumer products and generally found in wastewater or effluents, using a molecularly imprinted membrane adsorbent (MIM), followed by quantification and removal monitoring on a high-performance liquid chromatography (HPLC) system. Triclosan occurs in consumer products such as toothpaste, soaps, detergents, toys, and surgical cleaning treatments. The MIM was fabricated from polyvinylidene fluoride (PVDF) polymer with selective micro-composite particles known as molecularly imprinted polymers (MIPs) via a phase inversion by immersion precipitation technique. This resulted in improved hydrophilicity and mechanical behaviour of the membranes. Wastewater samples were collected from the central effluent treatment plant of the Umbogintwini Industrial Complex (UIC), on the south coast of Durban, KwaZulu-Natal, South Africa, and pre-treated before analysis. Experimental parameters such as sample size, contact time, and stirring speed were optimised. The resultant MIM had an adsorption efficiency of 97% for TCS, compared with 92% and 88% for the non-imprinted membrane (NIM) and the bare membrane, respectively. The analytical method utilised in this study had limits of detection (LoD) and quantification (LoQ) of 0.22 and 0.71 µg/L in wastewater effluent, respectively. The percentage recovery for the effluent samples was 68%. TCS was monitored for 10 consecutive days; the highest concentration detected in the treated wastewater was 55.0 µg/L, on day 9 of the monitoring period, while the lowest was 6.0 µg/L. As the concentrations of analyte found in the effluent samples were not very diverse, this study suggests that MIMs could be a strong candidate adsorbent for continued progress in membrane technology and environmental science, and may also lend themselves to desalination applications.

Keywords: molecularly imprinted membrane, triclosan, phase inversion, wastewater

Procedia PDF Downloads 96
1053 Development and Validation of a Liquid Chromatographic Method for the Quantification of Related Substance in Gentamicin Drug Substances

Authors: Sofiqul Islam, V. Murugan, Prema Kumari, Hari

Abstract:

Gentamicin is a broad-spectrum, water-soluble aminoglycoside antibiotic produced by fermentation of the microorganism Micromonospora purpurea. It is widely used for the treatment of infections caused by both gram-positive and gram-negative bacteria. Gentamicin consists of a mixture of aminoglycoside components, namely C1, C1a, C2a, and C2. The molecular structures of Gentamicin and its related substances lack a chromophore group, which makes the detection of these components critical and challenging. In this study, a simple reversed-phase high-performance liquid chromatographic (RP-HPLC) method using an ultraviolet (UV) detector was developed and validated for quantification of the related substances present in Gentamicin drug substances. Separation was achieved on a Thermo Scientific Hypersil Gold analytical column (150 x 4.6 mm, 5 µm particle size) with isocratic elution using methanol:water:glacial acetic acid:sodium hexane sulfonate in the ratio 70:25:5:3 % v/v/v/w as the mobile phase, at a flow rate of 0.5 mL/min, a column temperature of 30 °C, and a detection wavelength of 330 nm. The four components of Gentamicin, namely C1, C1a, C2a, and C2, were well separated along with the related substances present in Gentamicin. The Limit of Quantification (LOQ) was found to be 0.0075 mg/mL. The accuracy of the method was satisfactory, with recoveries between 95% and 105% for the related substances. The correlation coefficient (≥ 0.995) demonstrates a linear response against concentration over the range from the Limit of Quantification (LOQ). Precision studies showed % Relative Standard Deviation (RSD) values of less than 5% for the related substances. 
The method was validated in accordance with the International Conference on Harmonisation (ICH) guideline for various parameters, including system suitability, specificity, precision, linearity, accuracy, limit of quantification, and robustness. The proposed method is simple and suitable for the quantification of related substances in the routine analysis of Gentamicin formulations.
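A common way to derive LOD and LOQ from a calibration line follows the ICH Q2(R1) residual-standard-deviation approach (LOD = 3.3σ/S, LOQ = 10σ/S); the abstract does not state which estimation route was used, and the calibration points below are invented for illustration:

```python
import numpy as np

# Hypothetical calibration data: concentration (mg/mL) vs peak area.
conc = np.array([0.005, 0.01, 0.02, 0.04, 0.08])
area = np.array([51.0, 102.0, 199.0, 405.0, 810.0])

# Least-squares calibration line: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual standard deviation

# ICH Q2(R1)-style estimates from the calibration curve.
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
r = np.corrcoef(conc, area)[0, 1]    # linearity check (>= 0.995 expected)
print(f"slope={slope:.1f}, r={r:.4f}, LOD={lod:.5f}, LOQ={loq:.5f} mg/mL")
```

Signal-to-noise estimation (S/N of 3 for LOD, 10 for LOQ) is an equally valid ICH route when baseline noise can be measured directly.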

Keywords: reversed phase-high performance liquid chromatographic (RP-HPLC), high performance liquid chromatography, gentamicin, isocratic, ultraviolet

Procedia PDF Downloads 141
1052 Interval Bilevel Linear Fractional Programming

Authors: F. Hamidi, N. Amiri, H. Mishmast Nehi

Abstract:

The Bilevel Programming (BP) model has been presented for a decision-making process that involves two decision makers in a hierarchical structure. In fact, BP is a model for a static two-person game (the leader in the upper level and the follower in the lower level) wherein each player tries to optimize his/her own objective function under interdependent constraints; the game is sequential and non-cooperative. The decision variables are divided between the two players, and each one’s choice affects the other’s benefit and choices. In other words, BP consists of two nested optimization problems with two objective functions (upper and lower), where the constraint region of the upper-level problem is implicitly determined by the lower-level problem. In real cases, the coefficients of an optimization problem may not be precise, i.e., they may be intervals. In this paper, we develop an algorithm for solving interval bilevel linear fractional programming problems, that is, bilevel problems in which both objective functions are linear fractional, the coefficients are intervals, and the common constraint region is a polyhedron. From the original problem, the best and the worst bilevel linear fractional problems are derived; then, using the extended Charnes-Cooper transformation, each fractional problem can be reduced to a linear problem. The best and the worst optimal values of the leader’s objective function can then be found by two algorithms.
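For a single-level linear fractional program, the classical (non-extended) Charnes-Cooper substitution mentioned above reduces the fractional objective to a linear one: with y = t·x and t = 1/(dᵀx + β), maximizing (cᵀx + α)/(dᵀx + β) over Ax ≤ b, x ≥ 0 becomes an LP in (y, t). A minimal SciPy sketch on an invented toy problem, not the paper's bilevel formulation:

```python
import numpy as np
from scipy.optimize import linprog

def charnes_cooper(c, alpha, d, beta, A, b):
    """Solve max (c.x + alpha)/(d.x + beta) s.t. A x <= b, x >= 0
    via the substitution y = t*x, t = 1/(d.x + beta)."""
    n = len(c)
    # Variables z = [y_1..y_n, t]; linprog minimizes, so negate.
    obj = np.concatenate([-np.asarray(c, float), [-alpha]])
    # A y - b t <= 0
    A_ub = np.hstack([A, -np.asarray(b, float).reshape(-1, 1)])
    b_ub = np.zeros(A_ub.shape[0])
    # d.y + beta t = 1 normalises the denominator.
    A_eq = np.concatenate([np.asarray(d, float), [beta]]).reshape(1, -1)
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    y, t = res.x[:n], res.x[n]
    return y / t, -res.fun   # original x and the optimal ratio

# max (x1 + x2) / (x1 + 2*x2 + 1) with x1 + x2 <= 4, x >= 0
x_opt, value = charnes_cooper([1, 1], 0.0, [1, 2], 1.0,
                              np.array([[1.0, 1.0]]), [4.0])
print(x_opt, value)   # x = [4, 0] gives ratio 4/5 = 0.8
```

The extended transformation used in the paper handles the additional bilevel structure, which this single-level sketch does not capture.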

Keywords: best and worst optimal solutions, bilevel programming, fractional, interval coefficients

Procedia PDF Downloads 420
1051 The Relationship between Functional Movement Screening Test and Prevalence of Musculoskeletal Disorders in Emergency Nurse and Emergency Medical Services Staff Shiraz, Iran, 2017

Authors: Akram Sadat Jafari Roodbandi, Alireza Choobineh, Nazanin Hosseini, Vafa Feyzi

Abstract:

Introduction: Physical fitness and optimal functional movement are essential for performing job tasks efficiently, without fatigue and injury. Functional Movement Screening (FMS) tests are used in the screening of athletes and military forces. Nurses and emergency medical staff are obliged to perform many physical activities, such as transporting patients and CPR, due to the nature of their jobs. This study aimed to assess the relationship between the FMS test score and the prevalence of musculoskeletal disorders (MSDs) in emergency nurses and emergency medical services (EMS) staff. Methods: 134 male and female emergency nurses and EMS technicians participated in this cross-sectional, descriptive-analytical study. After a video tutorial and practical training on how to perform the FMS test, the participants carried out the test wearing comfortable clothes. The final score of the FMS test ranges from 0 to 21. A score of 14 or below is considered weak functional movement according to the FMS test protocol. In addition to a demographic data questionnaire, the Nordic musculoskeletal questionnaire was completed for each participant. SPSS software was used for statistical analysis with a significance level of 0.05. Results: In total, 49.3% (n=66) of the subjects were female. The mean age and work experience of the subjects were 35.3 ± 8.7 and 11.4 ± 7.7 years, respectively. The highest prevalence of MSDs was observed at the knee and lower back, at 32.8% (n=44) and 23.1% (n=31), respectively. 26 (19.4%) of the health workers had an FMS test score of 14 or below. The results of the Spearman correlation test showed that the FMS test score was significantly associated with MSDs (r=-0.419, p < 0.0001); that is, MSDs increased as the FMS test score decreased. Age, sex, and MSDs were the remaining significant factors in the linear regression model with the FMS test score as the dependent variable. 
Conclusion: FMS test seems to be a usable screening tool in pre-employment and periodic medical tests for occupations that require physical fitness and optimum functional movements.
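The core statistical step, a Spearman rank correlation between FMS score and MSD burden, can be sketched as follows; the subject data are simulated, and `msd_count` is a hypothetical numeric coding of the Nordic questionnaire responses:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical data for 134 staff: FMS score (0-21) and number of body
# regions with musculoskeletal symptoms (Nordic questionnaire coding).
fms = rng.integers(8, 21, size=134)
msd_count = np.clip(np.round(6 - 0.3 * fms + rng.normal(0, 1, 134)), 0, 9)

r, p = stats.spearmanr(fms, msd_count)
print(f"rho = {r:.3f}, p = {p:.4f}")

# Screening cut-off used in the FMS protocol: 14 or below is weak.
weak = (fms <= 14).mean()
print(f"{weak:.1%} of the sample scored 14 or below")
```

A negative rho, as in the study (r = -0.419), indicates that symptom burden rises as the FMS score falls.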

Keywords: functional movement, musculoskeletal disorders, health care worker, screening test

Procedia PDF Downloads 109
1050 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Landslides can seriously harm both the environment and society, and susceptibility maps are commonly developed from past landslide failure points using methods such as the frequency ratio (FR) and the analytical hierarchy process (AHP). However, it is still difficult to select the most efficient method and correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, namely Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial resolution. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on inventory landslide points. The findings also showed that around 35% of the study region consists of areas with high and very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western and northern parts of Amhara Saint Town and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the leading factors for most villages, though the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. 
It also suggests that various places should take different safeguards to reduce or prevent serious damage from landslide events.
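The supervised-classification workflow can be sketched for the best-performing model, Random Forest; the three conditioning factors and the susceptibility relation below are synthetic assumptions for illustration, not the Amhara Saint data (the study used fourteen factors):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic stand-ins for three of the fourteen conditioning factors.
n = 2000
slope = rng.uniform(0, 45, n)            # degrees
rainfall = rng.uniform(800, 1600, n)     # mm/year
road_dist = rng.uniform(0, 2000, n)      # metres
X = np.column_stack([slope, rainfall, road_dist])

# Assumed relation: risk rises with slope and rainfall, falls with
# distance to road (for illustration only).
logit = 0.08 * slope + 0.003 * rainfall - 0.002 * road_dist - 4
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
prob = rf.predict_proba(X_te)[:, 1]      # susceptibility index in [0, 1]
print("F1 :", round(f1_score(y_te, rf.predict(X_te)), 2))
print("AUC:", round(roc_auc_score(y_te, prob), 2))
```

Thresholding `prob` at 0.5 reproduces the high / very-high split the abstract reports for around 35% of the region.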

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 44
1049 Deep Reinforcement Learning Approach for Optimal Control of Industrial Smart Grids

Authors: Niklas Panten, Eberhard Abele

Abstract:

This paper presents a novel approach for real-time and near-optimal control of industrial smart grids by deep reinforcement learning (DRL). To achieve highly energy-efficient factory systems, the energetic linkage of machines, technical building equipment, and the building itself is desirable. However, the increased complexity of the interacting sub-systems, multiple time-variant target values, and stochastic influences from the production environment, weather, and energy markets make it difficult to efficiently control energy production, storage, and consumption in hybrid industrial smart grids. The studied deep reinforcement learning approach makes it possible to explore the solution space for control policies that minimize a cost function. The deep neural network of the DRL agent is based on a multilayer perceptron (MLP), Long Short-Term Memory (LSTM), and convolutional layers. The agent is trained within multiple Modelica-based factory simulation environments using the Advantage Actor-Critic (A2C) algorithm. The DRL controller is evaluated by means of the simulation and then compared to a conventional, rule-based approach. Finally, the results indicate that the DRL approach is able to improve the control performance and significantly reduce the energy and operating costs of industrial smart grids.
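The quantities at the heart of an A2C update (discounted returns, the advantage estimate, and the actor and critic losses) can be sketched on a toy rollout; the reward and value numbers below are invented, and the study's MLP/LSTM/convolutional networks and Modelica environments are omitted:

```python
import numpy as np

gamma = 0.99
rewards = np.array([1.0, 0.5, -0.2, 2.0])   # e.g. negative energy cost
values = np.array([1.2, 0.9, 0.4, 1.5])     # critic estimates V(s_t)
log_probs = np.log([0.6, 0.3, 0.7, 0.5])    # log pi(a_t | s_t)

# Discounted returns, computed backwards through the rollout.
returns = np.zeros_like(rewards)
running = 0.0
for t in reversed(range(len(rewards))):
    running = rewards[t] + gamma * running
    returns[t] = running

advantage = returns - values                 # A(s_t, a_t) estimate
actor_loss = -(log_probs * advantage).mean() # policy-gradient surrogate
critic_loss = (advantage ** 2).mean()        # value regression target
print(returns.round(3), actor_loss.round(3), critic_loss.round(3))
```

In a full A2C implementation both losses are backpropagated through the shared network, typically with an added entropy bonus for exploration.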

Keywords: industrial smart grids, energy efficiency, deep reinforcement learning, optimal control

Procedia PDF Downloads 169
1048 Production of Pre-Reduction of Iron Ore Nuggets with Lesser Sulphur Intake by Devolatisation of Boiler Grade Coal

Authors: Chanchal Biswas, Anrin Bhattacharyya, Gopes Chandra Das, Mahua Ghosh Chaudhuri, Rajib Dey

Abstract:

Boiler coals with low fixed carbon and high ash content have always challenged metallurgists to develop a suitable method for their utilization. In the present study, an attempt is made to establish an energy-effective method for the reduction of iron ore fines in the form of nuggets using syngas. By devolatisation (the expulsion of volatile matter by applying heat) of boiler coal, a gaseous product enriched with reducing agents such as CO, CO2, H2, and CH4 is generated. The iron ore nuggets are reduced by this syngas. Because there is no direct contact between the iron ore nuggets and the coal ash, the sulphur intake of the reduced nuggets can be kept to a minimum. A laboratory-scale devolatisation furnace with a reduction facility was designed and evaluated after in-depth studies and exhaustive experimentation, including thermo-gravimetric (TG-DTA) analysis to determine the volatile fraction present in boiler grade coal, gas chromatography (GC) to determine the syngas composition at different temperatures, and furnace temperature gradient measurements to minimize the furnace cost by applying a single heating coil. The nuggets were reduced in the devolatisation furnace at three different temperatures and three different durations. The pre-reduced nuggets were subjected to analytical weight-loss calculations to evaluate the extent of reduction. The phase and surface morphology of the pre-reduced samples were characterized using X-ray diffractometry (XRD), energy dispersive x-ray spectrometry (EDX), scanning electron microscopy (SEM), a carbon-sulphur analyzer, and chemical analysis. The degree of metallization of the reduced nuggets is 78.9% using boiler grade coal. The pre-reduced nuggets, with their lower sulphur content, could be used in the blast furnace as raw material or coolant, which would reduce the furnace’s consumption of high-quality coke owing to their pre-reduced character. They can also be used as a coolant in the Basic Oxygen Furnace (BOF).

Keywords: alternative ironmaking, coal gasification, extent of reduction, nugget making, syngas based DRI, solid state reduction

Procedia PDF Downloads 244
1047 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) has been used for the classification of a diabetes disease dataset. The aims of this study are to reduce the variance within the attributes of the diabetes dataset and to improve the classification accuracy of the classifier algorithms by transforming non-linearly separable datasets into linearly separable ones. The Pima Indians Diabetes dataset has two classes: normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and is one of the most used clustering methods in data mining and machine learning applications. In this study, as the first stage, fuzzy C-means clustering was used to find the centers of the attributes in the Pima Indians diabetes dataset, and the dataset was then weighted according to the ratios of the attribute means to their centers. Secondly, after the weighting process, classifier algorithms including the support vector machine (SVM) and k-NN (k-nearest neighbor) classifiers were used to classify the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians diabetes dataset.
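The FCMAW idea is to cluster each attribute with fuzzy C-means and rescale it by the ratio of its mean to the cluster centers. Below is a compact re-implementation under stated assumptions (two clusters per attribute, centers averaged, synthetic columns standing in for the Pima attributes), not the author's exact procedure:

```python
import numpy as np

def fuzzy_cmeans_centers(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means on a 1-D attribute; returns cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships
    for _ in range(n_iter):
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        dist = np.abs(x[:, None] - centers) + 1e-12
        u = 1.0 / (dist ** (2 / (m - 1)))        # standard FCM update
        u /= u.sum(axis=1, keepdims=True)
    return centers

def fcm_attribute_weighting(X):
    """Scale each attribute by mean(attribute) / mean(FCM centers)."""
    Xw = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        col = X[:, j].astype(float)
        center = fuzzy_cmeans_centers(col).mean()
        Xw[:, j] = col * (col.mean() / center)
    return Xw

# Toy stand-ins for two Pima attributes (e.g. glucose, BMI columns).
rng = np.random.default_rng(42)
X = np.column_stack([rng.normal(120, 30, 768), rng.normal(32, 7, 768)])
Xw = fcm_attribute_weighting(X)
print(Xw.std(axis=0) / X.std(axis=0))   # per-attribute scale change
```

The weighted matrix `Xw` would then be fed to the SVM or k-NN classifier in place of the raw attributes.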

Keywords: fuzzy C-means clustering, fuzzy C-means clustering based attribute weighting, Pima Indians diabetes, SVM

Procedia PDF Downloads 387
1046 The Role of Institutional Quality and Institutional Quality Distance on Trade: The Case of Agricultural Trade within the Southern African Development Community Region

Authors: Kgolagano Mpejane

Abstract:

The study applies a New Institutional Economics (NIE) analytical framework to trade in developing economies by assessing the impacts of institutional quality and institutional quality distance on agricultural trade, using panel data for 15 Southern African Development Community (SADC) countries over the years 1991-2010. The influence of institutions on agricultural trade has not been accorded the necessary attention in the literature, particularly in developing economies. Therefore, the paper empirically tests the gravity model of international trade by measuring the impact of political, economic, and legal institutions on intra-SADC agricultural trade. The gravity model is noted for its explanatory power and strong theoretical foundation. However, the model has statistical shortcomings in dealing with zero trade values and heteroscedastic residuals, leading to biased results. Therefore, this study employs a two-stage Heckman selection model with a Probit equation to estimate the influence of institutions on agricultural trade. The selection stage includes the inverse Mills ratio to account for the selection bias of the gravity model. The Heckman model accounts for zero trade values and is robust in the presence of heteroscedasticity. The empirical results of the study support the NIE theory premise that institutions matter in trade. The results demonstrate that institutions determine bilateral agricultural trade on different margins, with political institutions having a positive and significant influence on bilateral agricultural trade flows within the SADC region. Legal and economic institutions have significant and negative effects on SADC trade. Furthermore, the results of this study confirm that institutional quality distance influences agricultural trade. Legal and political institutional distance have a positive and significant influence on bilateral agricultural trade, while the influence of economic institutional quality distance is negative and insignificant. 
The results imply that non-tariff barriers, in the form of institutional quality and institutional quality distance, are significant factors limiting intra-SADC agricultural trade. Therefore, gains from intra-SADC agricultural trade can be attained through the improvement of institutions within the region.

Keywords: agricultural trade, institutions, gravity model, SADC

Procedia PDF Downloads 132
1045 Real-Time Multi-Vehicle Tracking Application at Intersections Based on Feature Selection in Combination with Color Attribution

Authors: Qiang Zhang, Xiaojian Hu

Abstract:

In multi-vehicle tracking based on feature selection in combination with color attribution, the tracking system efficiently tracks vehicles in a video with minimal error. The focus is on presenting a simple and fast, yet accurate and robust, solution to problems such as the inaccurate and untimely responses of statistics-based adaptive traffic control systems in the intersection scenario. In this study, a real-time tracking system is proposed for multi-vehicle tracking in the intersection scene. Considering the complexity and application feasibility of the algorithm, in the object detection step, the detection results provided by virtual loops were post-processed and then used as the input for the tracker. For the tracker, lightweight methods were designed to extract and select features and incorporate them into the adaptive color tracking (ACT) framework. The proven online feature selection algorithms are integrated into the mature ACT system with good compatibility. The proposed feature selection methods and multi-vehicle tracking method are evaluated on the KITTI datasets and show efficient vehicle tracking performance when compared to other state-of-the-art approaches in the same category. The system also performs excellently on the video sequences recorded at the intersection. Furthermore, the presented vehicle tracking system is suitable for surveillance applications.

Keywords: real-time, multi-vehicle tracking, feature selection, color attribution

Procedia PDF Downloads 132