Search results for: bandpass filtering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 353

113 Modeling of UAV Longitudinal Dynamics through System Identification Technique

Authors: Asadullah I. Qazi, Mansoor Ahsan, Zahir Ashraf, Uzair Ahmad

Abstract:

System identification of an Unmanned Aerial Vehicle (UAV), to acquire its mathematical model, is a significant step in the process of aircraft flight automation. A reliable mathematical model is an established requirement for autopilot design, flight simulator development, aircraft performance appraisal, analysis of aircraft modifications, preflight testing of prototype aircraft, and investigation of fatigue life and stress distribution. This research is aimed at system identification of a fixed-wing UAV by means of a specifically designed flight experiment. Purpose-designed flight maneuvers were performed on the UAV, and the aircraft states were recorded during these flights. The acquired data were preprocessed for noise filtering and bias removal, followed by parameter estimation of the longitudinal-dynamics transfer functions using the MATLAB System Identification Toolbox. Black-box transfer function models, in response to elevator and throttle inputs, were estimated using the least-squares error technique. The identification results show a high confidence level and good fit between the estimated model and the actual aircraft response.
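
As a rough illustration of this black-box estimation step, the sketch below fits a discrete-time ARX model to logged input/output pairs (for instance, elevator deflection against pitch rate) by ordinary least squares; the function name, model orders, and signal choices are assumptions for illustration, not the authors' exact MATLAB toolbox workflow.

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares fit of y[k] = -a1*y[k-1] - ... - a_na*y[k-na]
                                 +  b1*u[k-1] + ... + b_nb*u[k-nb]."""
    n = max(na, nb)
    # Regressor rows: past outputs (negated) followed by past inputs
    Phi = np.array([np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]])
                    for k in range(n, len(y))])
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta[:na], theta[na:]   # denominator (a) and numerator (b) coefficients
```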

Keywords: fixed wing UAV, system identification, black box modeling, longitudinal dynamics, least square error

Procedia PDF Downloads 292
112 Video Text Information Detection and Localization in Lecture Videos Using Moments

Authors: Belkacem Soundes, Guezouli Larbi

Abstract:

This paper presents a robust and accurate method for text detection and localization in lecture videos. Frame regions are classified into text or background based on visual feature analysis. Lecture video, however, often shows significant degradation related to acquisition conditions, camera motion, and environmental changes, resulting in low-quality video that hampers feature extraction and description. Traditional text detection methods therefore cannot be applied directly, and robust feature extraction methods dedicated to this specific video genre are required for accurate text detection and extraction. The method consists of a three-step process: slide region detection and segmentation, feature extraction, and non-text filtering. Moment functions are used for robust and effective feature extraction, of two distinct types: orthogonal and non-orthogonal. For the orthogonal case, both Zernike and pseudo-Zernike moments are used, whereas Hu moments are used for the non-orthogonal case. Expressivity and description efficiency are reported and discussed. The results show that, in general, the orthogonal moments achieve higher accuracy than the non-orthogonal ones, and pseudo-Zernike moments are more effective than Zernike moments with better computation time.
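
For the non-orthogonal case, OpenCV exposes Hu moments directly; a minimal sketch follows (the file name and the Otsu binarization are illustrative assumptions, not the paper's exact pipeline).

```python
import cv2
import numpy as np

frame = cv2.imread("slide_region.png", cv2.IMREAD_GRAYSCALE)  # hypothetical slide region
_, binary = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
hu = cv2.HuMoments(cv2.moments(binary)).flatten()  # the seven invariant Hu moments
# Log-scale the moments so their magnitudes are comparable, preserving sign
hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
```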

Keywords: text detection, text localization, lecture videos, pseudo zernike moments

Procedia PDF Downloads 122
111 Robust Medical Image Watermarking based on Contourlet and Extraction Using ICA

Authors: S. Saju, G. Thirugnanam

Abstract:

In this paper, a medical image watermarking algorithm based on the contourlet transform is proposed. Medical image watermarking is a special subcategory of image watermarking in the sense that the images have special requirements: watermarked medical images must not differ perceptually from their original counterparts, because clinical reading of the images must not be affected. Watermarking techniques based on the wavelet transform are reported in many publications, but the contourlet offers better robustness and security than the wavelet transform. The main challenge in exploiting geometry in images comes from the discrete nature of the data. In this paper, the original image is decomposed to two levels using the contourlet, and the watermark is embedded in the resultant sub-bands. Sub-band selection is based on the Peak Signal to Noise Ratio (PSNR) calculated between the watermarked and original images. To extract the watermark, Kernel ICA is used; its novel characteristic is that it does not require the transformation process to extract the watermark. Simulation results show that the proposed scheme is robust against attacks such as salt-and-pepper noise, median filtering, and rotation. Performance measures such as PSNR and a similarity measure are evaluated and compared with the Discrete Wavelet Transform (DWT) to prove the robustness of the scheme. Simulations are carried out in MATLAB.
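
The contourlet transform is not part of the common Python scientific stack, so the hedged sketch below substitutes a two-level wavelet decomposition to illustrate the generic sub-band embedding and PSNR evaluation; the wavelet choice and the embedding strength alpha are assumptions, not the authors' contourlet scheme.

```python
import numpy as np
import pywt

def embed_watermark(host, watermark, alpha=0.05):
    """Embed a watermark additively in a level-2 detail sub-band."""
    coeffs = pywt.wavedec2(host.astype(float), "haar", level=2)
    cH, cV, cD = coeffs[1]
    wm = np.resize(watermark, cH.shape)
    coeffs[1] = (cH + alpha * wm, cV, cD)
    return pywt.waverec2(coeffs, "haar")

def psnr(original, marked):
    """Imperceptibility check between original and watermarked images."""
    mse = np.mean((original.astype(float) - marked.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)
```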

Keywords: digital watermarking, independent component analysis, wavelet transform, contourlet

Procedia PDF Downloads 503
110 Semiautomatic Calculation of Ejection Fraction Using Echocardiographic Image Processing

Authors: Diana Pombo, Maria Loaiza, Mauricio Quijano, Alberto Cadena, Juan Pablo Tello

Abstract:

In this paper, we present a semi-automatic tool for calculating the ejection fraction from an echocardiographic video signal, drawn from a DICOM-format database of Clinica de la Costa, Barranquilla. We describe each of the steps and methods used in the calculation, covering acquisition and formation of the test samples, processing, and finally computation of the parameters needed to obtain the ejection fraction. Two image segmentation methods were compared within a methodological framework that is identical in the initial processing stages (filtering and image enhancement) and differs in the final algorithms (active contour and region growing). The results were compared with measurements obtained by two cardiology specialists who calculated the ejection fraction of the study samples using the traditional method, which consists of drawing the region of interest directly on the echocardiography equipment and applying a simple equation to compute the desired value. The results showed that when the quality of the video samples is good (i.e., the pre-processing demonstrably improves the contrast), the values provided by the tool are substantially close to those reported by the physicians; the correlation between physicians also does not vary significantly.
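
The "simple equation" is the standard ejection fraction formula relating the end-diastolic and end-systolic volumes (the EDV and ESV of the keywords); a minimal sketch with illustrative volumes:

```python
def ejection_fraction(edv_ml, esv_ml):
    """EF (%) = (EDV - ESV) / EDV * 100, from segmented ventricle volumes."""
    return (edv_ml - esv_ml) / edv_ml * 100.0

print(ejection_fraction(120.0, 50.0))   # ~58.3 %, a typical normal value
```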

Keywords: echocardiography, DICOM, processing, segmentation, EDV, ESV, ejection fraction

Procedia PDF Downloads 403
109 A Combination of Filtration and Coagulation Processes for Tannery Effluent Treatment

Authors: M. G. Mostafa, Manjushree Chowdhury, Tapan Kumar Biswas, Ananda Kumar Saha

Abstract:

This study focused on characterizing tannery effluents and on a treatment process to reduce their toxicity. Tanning is one of the oldest industries in the world and is typically characterized as a pollution-generating industry producing a wide variety of high-strength toxic chemicals. The study was conducted during 2008-2009, and the tannery effluents were collected three times a year from the outlets of selected leather industries located in the Hagaribagh industrial zone, Dhaka, Bangladesh. Analysis of the raw effluents revealed a yellowish-brown color, basic pH, very high values of BOD5, COD, TDS, TSS, and TS, and high concentrations of Cr, Na, SO₄²⁻, Cl⁻, and other organic and inorganic constituents. The tannery effluents were treated with various doses of FeCl3 after settling and a subsequent filtration through sand-stone. The study observed that a coagulant (FeCl3) dose of 150 mg/L at around neutral pH showed the best removal efficiency for the major physico-chemical parameters. The analysis results illustrate that most of the physical and chemical parameters fell well below the prescribed permissible limits for discharged effluent. The study suggests that tannery effluents could be treated by a combined process consisting of settling, filtration, and coagulation with FeCl3.
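
Removal efficiency for each physico-chemical parameter follows the usual percentage formula; a minimal sketch with illustrative COD values (the numbers are assumptions, not the study's data):

```python
def removal_efficiency(c_in, c_out):
    """Percentage removal of a parameter such as COD or BOD5 (mg/L)."""
    return (c_in - c_out) / c_in * 100.0

print(removal_efficiency(2400.0, 180.0))   # 92.5 % removal for these sample values
```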

Keywords: characterization, effluent, tannery, treatment

Procedia PDF Downloads 423
108 Investigation of the Litho-Structure of Ilesa Using High Resolution Aeromagnetic Data

Authors: Oladejo Olagoke Peter, Adagunodo T. A., Ogunkoya C. O.

Abstract:

This research investigated the arrangement of geological features beneath Ilesa using aeromagnetic data. The data were subjected to various filtering and processing techniques, namely the Total Horizontal Derivative (THD), depth continuation, and analytic signal amplitude, using Geosoft Oasis Montaj 6.4.2 software. The Reduced-to-the-Equator Total Magnetic Intensity (RTE-TMI) results reveal significant magnetic anomalies, with high magnitudes (55.1 to 155 nT) predominantly in the northwest half of the area. Intermediate magnetic intensity, ranging between 6.0 and 55.1 nT, dominates the eastern part, separated by depressions and uplifts. The southern part of the area exhibits a magnetic field of low intensity, ranging from -76.6 to 6.0 nT. The lineaments exhibit lengths varying from 2.5 to 16.0 km. Analysis of the rose diagram and the analytic signal amplitude indicates structural styles mainly of E-W and NE-SW orientation, particularly evident in the western, SW, and NE regions, with an amplitude of 0.0318 nT/m. The identified faults demonstrate NNW-SSE, NNE-SSW, and WNW-ESE orientations at depths ranging from 500 to 750 m. Considering the contrasting magnetic susceptibilities, the structural styles and orientations of the lineaments, and the identified faults and their depths, these lithological features could serve as a valuable foundation for assessing ground motion, particularly in the presence of sufficient seismic energy.
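
The Total Horizontal Derivative named above has a standard definition on a gridded anomaly; a minimal numpy sketch (the grid spacings are assumptions):

```python
import numpy as np

def total_horizontal_derivative(tmi, dx=100.0, dy=100.0):
    """THD = sqrt((dT/dx)^2 + (dT/dy)^2); peaks outline lineament edges."""
    dT_dy, dT_dx = np.gradient(tmi, dy, dx)   # derivatives along rows, then columns
    return np.hypot(dT_dx, dT_dy)
```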

Keywords: lineament, aeromagnetic, anomaly, fault, magnetic

Procedia PDF Downloads 41
107 An Enhanced SAR-Based Tsunami Detection System

Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah

Abstract:

Tsunami early detection and warning systems have proved to be of the utmost importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial for informing the authorities of any tsunami risk and of its degree of danger, so that the right decisions can be made and the public notified of the actions needed to save lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. Its operation is simulated using the MATLAB data acquisition toolbox, with measurements taken from published internet sources in place of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region, which are masked by speckle noise; this enables a post-tsunami damage-extent study and calculation of the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through migration to IP-based networks and fiber-optic links.
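
One of the filtering schemes named in the keywords is the Wiener filter; a hedged sketch of adaptive Wiener despeckling follows (the log transform reflects the roughly multiplicative nature of speckle, and the window size is an assumption):

```python
import numpy as np
from scipy.signal import wiener

def despeckle(sar_amplitude, window=5):
    """Adaptive Wiener filtering of a SAR image in the log domain."""
    log_im = np.log1p(sar_amplitude.astype(float))  # multiplicative -> additive noise
    return np.expm1(wiener(log_im, mysize=window))
```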

Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, wiener filter

Procedia PDF Downloads 356
106 Intelligent Chatbot Generating Dynamic Responses Through Natural Language Processing

Authors: Aarnav Singh, Jatin Moolchandani

Abstract:

The proposed research work aims to build a query-based AI chatbot that can answer questions on any topic. A chatbot is software that converses with users via text messages. In the proposed system, the chatbot generates a response based on the user's query: natural language processing is used to analyze the query, and a set of texts is used to form a concise answer. The texts are obtained through web scraping, filtering for credible sources from a web search. The objective is a chatbot that provides simple, clear, and accurate answers without the user having to read through a large number of articles and websites in detail. In addition to exploring the reasons for their broad acceptance and their usefulness across many industries, this article offers an overview of the worldwide interest in chatbots.
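
A minimal sketch of the retrieval step, using TF-IDF cosine similarity to pick the scraped passage most relevant to a query; the vectorizer settings and function name are assumptions, as the abstract does not specify its NLP pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def best_passage(query, passages):
    """Return the scraped passage most similar to the user's query."""
    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(passages)        # one row per scraped passage
    q = vec.transform([query])
    scores = cosine_similarity(q, X).ravel()
    return passages[scores.argmax()]
```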

Keywords: chatbot, artificial intelligence, natural language processing, web scraping

Procedia PDF Downloads 33
105 Case Study of High-Resolution Marine Seismic Survey in Shallow Water, Arabian Gulf, Saudi Arabia

Authors: Almalki M., Alajmi M., Qadrouh Y., Alzahrani E., Sulaiman A., Aleid M., Albaiji A., Alfaifi H., Alhadadi A., Almotairy H., Alrasheed R., Alhafedh Y.

Abstract:

High-resolution marine seismic surveying is a well-established technique commonly used to characterize near-surface sediments and geological structures in shallow water. We conducted a single-channel seismic survey to provide high-quality seismic images of near-surface sediments up to 100 m depth in the Jubal coastal area, Arabian Gulf. An eight-hydrophone streamer was used to collect stacked seismic traces along a 5 km seismic line. To reach the required depth, we used a sparker system that discharges energies above 5000 J, with an expected frequency output spanning the range from 200 to 2000 Hz. A suitable processing flow was implemented to enhance the signal-to-noise ratio of the seismic profile. We found that the shallow sedimentary layers at the study site have a complex reflectivity pattern, which decays significantly due to the amount of source energy used as well as the multiples associated with the seafloor. The results reveal that single-channel marine seismic surveying in shallow water is a cost-effective technique that can easily be repeated to observe any changes in the physical wave properties of the near-surface layers.
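
Given the 200-2000 Hz source band, the frequency-filtering step of the processing flow can be illustrated with a zero-phase Butterworth band-pass; this is a sketch, and the filter order and exact corner frequencies are assumptions.

```python
from scipy.signal import butter, filtfilt

def bandpass(trace, fs, low=200.0, high=2000.0, order=4):
    """Zero-phase band-pass matching the sparker's 200-2000 Hz output band."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, trace)   # filtfilt avoids phase distortion of reflectors
```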

Keywords: shallow marine single-channel data, high resolution, frequency filtering, shallow water

Procedia PDF Downloads 47
104 A Posteriori Trading-Inspired Model-Free Time Series Segmentation

Authors: Plessen Mogens Graf

Abstract:

Within the context of multivariate time series segmentation, this paper proposes a method inspired by a posteriori optimal trading. After a normalization step, time series are treated channelwise as surrogate stock prices that can be traded optimally a posteriori in a virtual portfolio holding either stock or cash. Linear transaction costs are interpreted as hyperparameters for noise filtering. Trading signals, as well as trading signals obtained on the reversed time series, are used for unsupervised channelwise labeling before a consensus over all channels determines the final segmentation time instants. The method is model-free in that no model prescriptions for segments are made. Benefits of the proposed approach include simplicity, computational efficiency, and adaptability to a wide range of time series shapes. Performance is demonstrated on synthetic and real-world data, including a large-scale dataset comprising a multivariate time series of dimension 1000 and length 2709. The proposed method is compared to a popular model-based bottom-up approach fitting piecewise affine models and to a recent model-based top-down approach fitting Gaussian models, and is found to be consistently faster while producing more intuitive results in the sense of segmenting time series at peaks and valleys.
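
A minimal sketch of the channelwise core: a posteriori optimal long/flat trading with a linear transaction cost, solved by dynamic programming. The switch points of the returned position sequence serve as candidate segmentation instants; variable names and the long/flat simplification are assumptions made for illustration.

```python
import numpy as np

def optimal_positions(price, cost):
    """A posteriori optimal long/flat trading under a linear transaction cost.
    Returns a 0/1 position sequence; its switch points mark segment boundaries."""
    n = len(price)
    cash = np.zeros(n)                       # best wealth if flat at time t
    stock = np.zeros(n)                      # best wealth if holding at time t
    stock[0] = -price[0] - cost
    sold = np.zeros(n, bool)
    bought = np.zeros(n, bool)
    for t in range(1, n):
        sell = stock[t - 1] + price[t] - cost
        sold[t] = sell > cash[t - 1]
        cash[t] = max(cash[t - 1], sell)
        buy = cash[t - 1] - price[t] - cost
        bought[t] = buy > stock[t - 1]
        stock[t] = max(stock[t - 1], buy)
    pos = np.zeros(n, int)                   # backtrack: 1 = holding stock
    state = int(stock[-1] > cash[-1])
    for t in range(n - 1, 0, -1):
        pos[t] = state
        if state == 0 and sold[t]:
            state = 1                        # was holding before selling at t
        elif state == 1 and bought[t]:
            state = 0                        # was flat before buying at t
    pos[0] = state
    return pos
```

Larger cost values filter out smaller price movements, which is exactly the noise-filtering hyperparameter role described above; running the same routine on the reversed series and forming a consensus across channels, as the abstract describes, then yields the final segmentation.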

Keywords: time series segmentation, model-free, trading-inspired, multivariate data

Procedia PDF Downloads 107
103 Magnetic Survey for the Delineation of Concrete Pillars in Geotechnical Investigation for Site Characterization

Authors: Nuraddeen Usman, Khiruddin Abdullah, Mohd Nawawi, Amin Khalil Ismail

Abstract:

A magnetic survey was carried out in order to locate the remains of construction items, specifically concrete pillars. The conventional Euler deconvolution technique can perform this task, but it requires a fixed structural index (SI), while the construction items are made of materials with different shapes requiring different, unknown SIs. A Euler deconvolution technique that estimates the background, horizontal coordinates (x0 and y0), depth, and structural index simultaneously was therefore prepared and used. A synthetic model study indicated that the new methodology gives a good estimate of location and does not depend on magnetic latitude. For the field data, both the total magnetic field and gradiometer readings were collected simultaneously. The computed vertical derivatives and the gradiometer readings were compared and showed good correlation, signifying the effectiveness of the method. Filtering was carried out using an automated procedure, the analytic signal, and other traditional techniques. The clustered depth solutions coincided with high analytic-signal amplitudes, marking the possible positions of the concrete pillars being sought. The targets under investigation are interpreted to lie at depths between 2.8 and 9.4 meters. A follow-up survey is recommended, as this marks the preliminary stage of the work.
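
For reference, the classic windowed Euler solve for a fixed structural index N reduces to a small least-squares problem; the simultaneous-SI extension used in the paper is nonlinear and is not reproduced here.

```python
import numpy as np

def euler_window(x, y, z, T, Tx, Ty, Tz, N):
    """Solve (x-x0)Tx + (y-y0)Ty + (z-z0)Tz = N(B - T) for x0, y0, z0, B."""
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    rhs = x * Tx + y * Ty + z * Tz + N * T
    (x0, y0, z0, B), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return x0, y0, z0, B   # source location estimate and regional background
```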

Keywords: concrete pillar, magnetic survey, geotechnical investigation, Euler Deconvolution

Procedia PDF Downloads 234
102 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models

Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton

Abstract:

Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, which can present a challenge in practice. In this talk, we propose an online estimation method for high dimensional spatio-temporal data based upon the DSTM, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the continuous spatial process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve dimension reduction with significant computational savings. We then propose a hierarchical Bayesian state space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
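
The Laplace prior on wavelet coefficients corresponds to L1 shrinkage, i.e., soft thresholding of the detail coefficients; a hedged one-dimensional sketch (wavelet family, level, and threshold are assumptions):

```python
import pywt

def sparse_wavelet_approx(signal, wavelet="db4", level=4, thresh=0.1):
    """Soft-threshold detail coefficients for a parsimonious spatial basis."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)
```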

Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets

Procedia PDF Downloads 401
101 A Palmprint Identification System Based on Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used for human identification systems based on biological traits such as fingerprints and iris scans. Biometric identification systems show great efficiency and accuracy in such applications. However, systems of this type have so far relied on image processing techniques alone, which may limit their efficiency. This paper therefore develops a human palmprint identification system using a multi-layer perceptron neural network, which can learn via the backpropagation algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted image components. The second phase feeds the processed images into a neural network classifier which adaptively learns and creates a class for each distinct image. One hundred different images are used for training the system. Since this is an identification system, it is tested with the same images: the same 100 images are used for testing, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves 100% accuracy and can be implemented in real-life applications.
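
A hedged sketch of the preprocessing pipeline with OpenCV (the file name and Canny thresholds are assumptions; the skeletonizing step, which needs a thinning routine such as the one in opencv-contrib, is omitted here):

```python
import cv2

palm = cv2.imread("palm.png", cv2.IMREAD_GRAYSCALE)  # hypothetical CASIA sample
denoised = cv2.medianBlur(palm, 5)                   # median filtering
adjusted = cv2.equalizeHist(denoised)                # image adjustment (contrast)
edges = cv2.Canny(adjusted, 50, 150)                 # Canny edge features
features = edges.flatten() / 255.0                   # binary vector for the MLP input
```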

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 336
100 Carbon Skimming: Towards an Application to Summarise and Compare Embodied Carbon to Aid Early-Stage Decision Making

Authors: Rivindu Nethmin Bandara Menik Hitihamy Mudiyanselage, Matthias Hank Haeusler, Ben Doherty

Abstract:

Investors and clients in the architectural, engineering, and construction industry find it difficult to understand complex datasets and reports with little to no graphic representation. The stakeholders examined in this paper include designers, design clients, and end-users. Communicating embodied carbon information graphically and concisely can aid decision support early in a building's life cycle, and a common visualisation approach is essential, as the level of knowledge about embodied carbon varies between stakeholders. The tool, designed in conjunction with Bates Smart, condenses Tally Life Cycle Assessment data into a carbon hot-spotting visualisation that highlights the sections with the highest embodied carbon. This allows stakeholders at every stage of a given project to understand the carbon implications with minimal effort, and further allows them to differentiate building elements by their carbon values, enabling evaluation of the cost-effectiveness of the selected materials at an early stage. To examine and build the decision-support tool, an action-design research methodology of iterative cycles was used, along with precedents of embodied carbon visualisation tools. The importance of visualisation and Building Information Modelling is also explored to understand the best format for relaying these results.

Keywords: embodied carbon, visualisation, summarisation, data filtering, early-stage decision-making, materiality

Procedia PDF Downloads 53
99 Hybrid Collaborative-Context Based Recommendations for Civil Affairs Operations

Authors: Patrick Cummings, Laura Cassani, Deirdre Kelliher

Abstract:

In this paper we present findings from a research effort to apply a hybrid collaborative-context approach to a system focused on Marine Corps civil affairs data collection, aggregation, and analysis, called the Marine Civil Information Management System (MARCIMS). The goal of this effort is to provide operators with information to make sense of the interconnectedness of entities and relationships in their area of operation, and to discover existing data to support civil-military operations. Our approach to building a recommendation engine was designed to overcome several technical challenges, including 1) ensuring models were robust to the relatively small amount of data collected by the Marine Corps civil affairs community; 2) finding methods to recommend novel data for which no interactions are captured; and 3) overcoming confirmation bias by ensuring that content relevant to the mission was recommended despite being obscure or less well known. We solve this by implementing a combination of collective matrix factorization (CMF) and graph-based random walks to provide recommendations to civil-military operations users. We also present a method to resolve the computational complexity inherent in highly connected nodes through precomputation.
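
A minimal sketch of the graph-based side, scoring entities by random walk with restart from a seed node; the function name, damping value, and dense-matrix formulation are illustrative assumptions, not the MARCIMS implementation.

```python
import numpy as np

def random_walk_with_restart(adj, seed, restart=0.15, iters=50):
    """Stationary visit probabilities from a seed entity; assumes no isolated nodes."""
    P = adj / adj.sum(axis=0, keepdims=True)   # column-stochastic transitions
    r = np.zeros(adj.shape[0])
    r[seed] = 1.0
    p = r.copy()
    for _ in range(iters):
        p = (1 - restart) * P @ p + restart * r
    return p   # higher score -> stronger recommendation
```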

Keywords: recommendation engine, collaborative filtering, context-based recommendation, graph analysis, coverage, civil affairs operations, Marine Corps

Procedia PDF Downloads 100
98 Quasiperiodic Magnetic Chains as Spin Filters

Authors: Arunava Chakrabarti

Abstract:

A one-dimensional chain of magnetic atoms, representative of a quantum gas in an artificial quasi-periodic potential and modeled by the well-known Aubry-Andre function and its variants, is studied with respect to its capability of working as a spin filter for arbitrary spins. The basic formulation is explained in terms of a perfectly periodic chain first, where it is shown that a definite correlation between the spin S of the incoming particles and the magnetic moment h of the substrate atoms can open a gap in the energy spectrum, which is crucial for a spin-filtering action. The simple one-dimensional chain is shown to be equivalent to a (2S+1)-strand ladder network, and this equivalence is exploited to work out the condition for the opening of gaps. The formulation is then applied to a one-dimensional chain in which the site potentials, the magnetic moments, and their orientations follow an Aubry-Andre modulation and its variants. In addition, we show that a certain correlation between the system parameters can generate absolutely continuous bands populated only by Bloch-like extended wave functions, signaling the possibility of a metal-insulator transition. This is a case of correlated (deterministic) disorder, and the results provide a non-trivial variation on the famous Anderson localization problem. We work within a tight-binding formalism and present explicit results for spin-1/2, spin-1, spin-3/2, and spin-5/2 particles incident on the magnetic chain to explain our scheme and the central results.
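
For concreteness, the Aubry-Andre on-site modulation V_n = λ cos(2πβn + φ) in a tight-binding chain can be diagonalized directly; a minimal spinless sketch (system size and parameter values are assumptions):

```python
import numpy as np

def aubry_andre_spectrum(n=144, lam=1.0, t=1.0, phi=0.0):
    """Energy spectrum of a tight-binding chain with quasiperiodic potential."""
    beta = (np.sqrt(5) - 1) / 2           # irrational modulation frequency
    V = lam * np.cos(2 * np.pi * beta * np.arange(n) + phi)
    H = np.diag(V) - t * (np.eye(n, k=1) + np.eye(n, k=-1))
    return np.linalg.eigvalsh(H)          # spectral gaps underlie the filtering action
```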

Keywords: Aubry-Andre model, correlated disorder, localization, spin filter

Procedia PDF Downloads 322
97 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, arising when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), whose computational cost grows more slowly with data size but which requires more data to reach an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST exploits their complementary strengths. The empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
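
FAB-COST itself is a contextual logistic bandit with EP/ADF posterior approximations; as a much simpler hedged illustration of the Thompson-sampling principle it builds on, a non-contextual Beta-Bernoulli selection step looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def thompson_step(successes, failures):
    """Draw one sample per arm from its Beta posterior; play the best draw."""
    draws = rng.beta(successes + 1, failures + 1)   # uniform Beta(1,1) priors
    return int(draws.argmax())
```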

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 81
96 Assessment of the Number of Damaged Buildings from a Flood Event Using Remote Sensing Technique

Authors: Jaturong Som-ard

Abstract:

Heavy rainfall from 3 to 22 January 2017 swamped much of Ranot district in southern Thailand, causing substantial economic and social losses. The major objective of this study is to detect the flooding extent using Sentinel-1A data and to identify the number of damaged buildings. Data were collected in two stages, pre-flood and during the flood event. Calibration, speckle filtering, geometric correction, and histogram thresholding of intensity values were performed to classify thematic maps. The maps were used to identify the flooding extent via change detection, together with building footprints digitized and collected in the JOSM desktop application. The number of damaged buildings was counted within the flooding extent with respect to the building data. The total flooded area was 181.45 sq. km, occurring mostly in the Ban Khao, Ranot, Takhria, and Phang Yang sub-districts. The Ban Khao sub-district was affected more than the others because it lies at lower altitude, closer to the Thale Noi and Thale Luang lakes. The numbers of damaged buildings were highest in the Khlong Daen (726 buildings), Tha Bon (645 buildings), and Ranot (604 buildings) sub-districts. The final flood extent map should be very useful for planning, prevention, and management of flood-prone areas, and the building damage map can support quick response, recovery, and mitigation by the organizations concerned.
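
A minimal sketch of the histogram-thresholding and change-detection steps (open water appears dark in SAR backscatter; the function name and array names are assumptions):

```python
from skimage.filters import threshold_otsu

def water_mask(backscatter_db):
    """Pixels below the Otsu threshold are classified as water."""
    return backscatter_db < threshold_otsu(backscatter_db)

# Change detection between the two acquisition stages:
# flood = water_mask(during_db) & ~water_mask(pre_db)   # newly inundated pixels
```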

Keywords: flooding extent, Sentinel-1A data, JOSM desktop, damaged buildings

Procedia PDF Downloads 162
95 Knowledge Management for Competitiveness and Performances in Higher Educational Institutes

Authors: Jeyarajan Sivapathasundram

Abstract:

Knowledge management has been recognised as an emerging factor in competitiveness among institutions and in firm performance. Being knowledge-rich institutions, higher education institutes must accordingly treat knowledge management as a resource for achieving competitive advantage. The present research draws its results from a postgraduate study of knowledge management at non-state higher educational institutes of Sri Lanka, and aims to discover knowledge management practices relevant to competition and firm performance in higher educational institutes. The results of that study take the form of pairs, each consisting of a knowledge management practice and the reason behind its existence. The present research developed a filter, built on attributes that explain competition and firm performance, to pick out the pairs satisfying its conditions. One such pair, for example, is benchmarking, practised in order to compete ethically through the conduct of courses. As the postgraduate research tested the results of foreign studies within a qualitative paradigm, the findings of the present research are generalised facts about knowledge management for competitiveness and performance in higher educational institutes, and they are presented here on the basis of that generalisation.

Keywords: competition in and among higher educational institutes, performances of higher educational institutes, noun based filtering, production out of generalisation of a research

Procedia PDF Downloads 111
94 Ecological Implication of Air Pollution From Quarrying and Stone Cutting Industries on Agriculture and Plant Biodiversity Around Quarry Sites in Mpape, Bwari Area Council, FCT, Abuja

Authors: Muhammed Rabiu, Moses S. Oluyomi, Joshua Olorundare

Abstract:

Quarry activities are important to modern life and the socio-economic development of local communities; unfortunately, the industry is usually associated with air pollution. To assess the impact of quarry dust on plant biodiversity and agriculture, PM2.5, PM10, and meteorological parameters were measured using a gas analyzer, a handheld thermometer, and a multifunction anemometer (PCE-EM 888), alongside a social survey. High particulate matter levels exceeding international standards were recorded at the study locations, which include the Julius Berger quarry and a site 1 km away that serves as the base for the farmlands. The correlation coefficients between the particulate matter and the meteorological parameters all show strong relationships, with temperature recording the strongest values of 0.952 and 0.931 for PM2.5 and PM10, respectively. Correspondingly, the coefficients of determination of 0.906 and 0.866 show that temperature accounts for the highest percentage of meteorological variation in PM2.5 and PM10. Furthermore, the survey reveals a notable negative impact of quarrying on plant biodiversity and local farm crops, with a wide range of local plants affected; maize and Azadirachta indica (neem) were the most affected, reported by 31.5% and 27.5% of respondents, respectively. Based on these results, it is highly recommended to develop a green belt surrounding the quarry using pollutant-tolerant trees (usually broad-leaved) in order to restrict the spread of quarry dust by intercepting, filtering, and absorbing pollutants.
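
The reported statistics are internally consistent (0.952² ≈ 0.906); a minimal check of the Pearson correlation and coefficient of determination:

```python
import numpy as np

def r_and_r2(temperature, pm):
    """Pearson r and coefficient of determination r^2 between two series."""
    r = np.corrcoef(temperature, pm)[0, 1]
    return r, r ** 2   # e.g. r = 0.952 for PM2.5 gives r^2 = 0.906
```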

Keywords: agriculture, air pollution, biodiversity, quarry

Procedia PDF Downloads 43
93 Information Overload, Information Literacy and Use of Technology by Students

Authors: Elena Krelja Kurelović, Jasminka Tomljanović, Vlatka Davidović

Abstract:

The development of web technologies and mobile devices makes creating, accessing, using and sharing information, and communicating with each other, simpler every day. However, while the amount of information is constantly increasing, it is becoming harder to organize it effectively and find quality information despite the availability of web search engines, filtering, and indexing tools. Although digital technologies have an overall positive impact on students' lives, frequent use of these technologies and of digital media enriched with dynamic hypertext and hypermedia content, as well as multitasking and the distractions caused by notifications, calls, or messages, can decrease the attention span and make thinking, memorizing, and learning more difficult, which can lead to stress and mental exhaustion. This is referred to as "information overload", "information glut", or "information anxiety". The objective of this study is to determine whether students show signs of information overload and to identify the possible predictors. Research was conducted using a questionnaire developed for the purpose of this study. The results show that students frequently use technology (computers, gadgets, and digital media), while showing a moderate level of information literacy, and that they have sometimes experienced symptoms of information overload. According to the statistical analysis, higher frequency of technology use and lower level of information literacy are correlated with greater information overload, and multiple regression analysis confirmed that the combination of these two independent variables has statistically significant predictive capacity for information overload. Information science teachers should therefore pay attention to improving students' information literacy and educate them about the risks of excessive technology use.

Keywords: information overload, computers, mobile devices, digital media, information literacy, students

Procedia PDF Downloads 253
92 Sources of Water Supply and Water Quality for Local Consumption: The Case Study of Eco-Tourism Village, Suan Luang Sub- District Municipality, Ampawa District, Samut Songkram Province, Thailand

Authors: Paiboon Jeamponk, Tasanee Ponglaa, Patchapon Srisanguan

Abstract:

This research examined the sources of water supply and the water quality for local consumption in the eco-tourism villages of Suan Luang Sub-District Municipality, Amphawa District, Samut Songkram Province. The study used both a questionnaire and field water testing as its research tools. The sample size of 288 households was based on the population of the district, and water samples were drawn from 60 of these households: 30 samples of groundwater and 30 of surface water. The degree of heavy metal contamination in the water, covering copper, iron, manganese, zinc, cadmium, and lead, was investigated using the atomic absorption direct aspiration method. The findings revealed that 96.0 percent of household water consumption relied on the piped water supply, with the rest drawn from canal, river, and rain water, and that 47.2 percent of people routinely consumed water without boiling or filtering it first. The investigation of water supply quality found that heavy metal contamination, including zinc, lead, iron, copper, manganese, and cadmium, met the standards of the Department of Health.

Keywords: sources of water supply, water quality, water supply, Thailand

Procedia PDF Downloads 263
91 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm

Authors: J. S. Dhillon, K. K. Dhaliwal

Abstract:

In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost, and smaller memory requirements for similar magnitude specifications. However, digital IIR filter design is generally multimodal with respect to the filter coefficients, and reliable methods that can provide globally optimal solutions are therefore required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm, but in some cases it searches the solution space insufficiently, resulting in a weak exchange of information, and hence fails to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where low-pass (LP) and high-pass (HP) filters are designed. Fuzzy theory is applied to maximize satisfaction of the minimum magnitude error and stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques, and it is observed that in most cases OABC returns better, or at least comparable, results.
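
The opposition-based learning idea is compact: for each candidate x in [lo, hi], also evaluate its opposite lo + hi - x and keep the fitter of the pair. A minimal initialization sketch follows; the names are assumptions, not the paper's exact OABC formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def opposition_init(pop_size, dim, lo, hi):
    """Generate a random population together with its opposite population."""
    X = rng.uniform(lo, hi, (pop_size, dim))   # candidate IIR coefficient vectors
    X_opp = lo + hi - X                        # opposite points
    return X, X_opp   # caller keeps the fitter of each (X[i], X_opp[i]) pair
```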

Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization

Procedia PDF Downloads 445
90 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor

Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira

Abstract:

Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, installing a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000 including manpower. A cost-effective alternative is to leverage an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device: it offers a slightly colder thermal flux and already provides shielding for user protection. The key additional requirement is designing detector shielding to mitigate the high gamma-ray background and to safeguard the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation from fission. The aim is to achieve a focused prompt-gamma signal while shielding ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
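
As a first-order sanity check before full MCNP transport (which also models scattering and buildup), narrow-beam attenuation through a shield layer follows the exponential law; a hedged sketch with illustrative values only:

```python
import numpy as np

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Narrow-beam attenuation I/I0 = exp(-mu * x)."""
    return np.exp(-mu_per_cm * thickness_cm)

print(transmitted_fraction(0.5, 10.0))   # illustrative mu and thickness only
```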

Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis

Procedia PDF Downloads 35
89 A Systematic Literature Review on the Prevalence of Academic Plagiarism and Cheating in Higher Educational Institutions

Authors: Sozon, Pok Wei Fong, Sia Bee Chuan, Omar Hamdan Mohammad

Abstract:

Owing to the widespread phenomenon of plagiarism and cheating in higher education institutions (HEIs), it is now difficult to ensure academic integrity and quality education, and the COVID-19 pandemic has intensified the issue by shifting educational institutions to virtual teaching and assessment. There is therefore a need for an extensive and holistic systematic review of the literature on the prevalence and forms of plagiarism and cheating in HEIs. This paper systematically reviews that literature to determine the most common forms and to suggest strategies for resolution and for boosting students' academic integrity. The review covered 45 articles and publications, for the period from February 12, 2018, to September 12, 2022, in the Scopus database, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines in the selection, filtering, and reporting of the papers. Based on the results, 48% of the reviewed quantitative studies concerned student work that was plagiarized or obtained through cheating, with 84% coming from the Humanities; Psychology and Social Sciences studies accounted for 9% and 7% of articles, respectively. Individual, institutional, and social-cultural factors have all contributed to plagiarism and cheating cases in HEIs. The issue could be addressed by establishing ethical and moral development initiatives and modern academic policies and guidelines, supported by technological testing strategies.

Keywords: plagiarism, cheating, systematic review, academic integrity

Procedia PDF Downloads 27
88 Quantum Engine Proposal using Two-level Atom Like Manipulation and Relativistic Motoring Control

Authors: Montree Bunruangses, Sonath Bhattacharyya, Somchat Sonasang, Preecha Yupapin

Abstract:

A two-level system is manipulated by a microstrip add-drop circuit configured as an atom-like system for wave-particle behavior investigation when its traveling speed along the circuit perimeter is the speed of light. The entangled pair formed by the upper and lower sideband peaks is bound by the angular displacement, which is given by 0≤θ≤π/2. The control signals associated with the 3-peak signal frequencies are applied via the external input ports of the microstrip add-drop multiplexer, where they are functions of time with no space term involved. When the system satisfies the speed-of-light condition, the mass term changes to energy according to the relativistic limit described by the Lorentz factor and the Einstein equation. The different applied frequencies can be utilized to form 3-phase torques applicable to quantum engines. The experiment will use the two-level system circuit and will be conducted in the laboratory; the 3-phase torques will be recorded and investigated for quantum engine driving purposes, and the obtained results will be compared to the simulation. The optimum amplification of torque can be obtained by a resonant successive filtering operation. Torque vanishes when the system is balanced at the stopped position, where |Time| = 0, which is required as a system stability condition. Future applications are discussed; a larger device may be tested in the future for realistic use, and synchronous and asynchronous driven motors are also discussed for warp drive use.

Keywords: quantum engine, relativistic motor, 3-phase torque, atomic engine

Procedia PDF Downloads 33
87 Integrated Geotechnical and Geophysical Investigation of a Proposed Construction Site at Mowe, Southwestern Nigeria

Authors: Kayode Festus Oyedele, Sunday Oladele, Adaora Chibundu Nduka

Abstract:

The subsurface of a proposed site for building development in Mowe, Nigeria, was investigated using the Standard Penetration Test (SPT) and Cone Penetrometer Test (CPT), supplemented with Horizontal Electrical Profiling (HEP), with the aim of evaluating the suitability of the strata as foundation materials. Four SPTs and CPTs were implemented using a 10-tonne hammer. HEP utilizing the Wenner array was performed with inter-electrode spacings of 10 - 60 m along four traverses coincident with the SPT and CPT locations. The HEP data were processed using DIPRO software, and textural filtering of the resulting resistivity sections was implemented to enable delineation of hidden layers. Sandy lateritic clay, silty lateritic clay, clay, clayey sand, and sand horizons were delineated. The SPT "N" values defined very soft to soft sandy lateritic clay (<4), stiff silty lateritic clay (7 - 12), very stiff silty clay (12 - 15), clayey sand (15 - 20), and sand (27 - 37). Sandy lateritic clay (5 - 40 kg/cm2) and silty lateritic clay (25 - 65 kg/cm2) were defined from the CPT response. Sandy lateritic clay (220 - 750 Ωm), clay (<50 Ωm), and sand (415 - 5359 Ωm) were delineated from the resistivity sections, with two thin layers of silty lateritic clay and clayey sand defined in the texturally filtered resistivity sections. The study concluded that the presence of thick (18 m) incompetent clayey materials beneath the study area makes it unsuitable for shallow foundations; deep foundations involving piling through the clayey layers to the competent sand at 20 m depth were recommended.
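
For reference, HEP measurements reduce to apparent resistivities via the standard Wenner-array geometric factor; a minimal sketch with illustrative values:

```python
import numpy as np

def wenner_apparent_resistivity(a_m, resistance_ohm):
    """rho_a = 2 * pi * a * R for electrode spacing a and measured resistance R."""
    return 2 * np.pi * a_m * resistance_ohm

print(wenner_apparent_resistivity(10.0, 5.0))   # ~314 ohm-m for these sample values
```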

Keywords: cone penetrometer, foundation, lithologic texture, resistivity section, standard penetration test

Procedia PDF Downloads 222
86 Development of a Laboratory Laser-Produced Plasma “Water Window” X-Ray Source for Radiobiology Experiments

Authors: Daniel Adjei, Mesfin Getachew Ayele, Przemyslaw Wachulak, Andrzej Bartnik, Luděk Vyšín, Henryk Fiedorowicz, Inam Ul Ahad, Lukasz Wegrzynski, Anna Wiechecka, Janusz Lekki, Wojciech M. Kwiatek

Abstract:

Laser-produced plasma light sources emitting high-intensity pulses of X-rays and delivering high doses are useful for understanding the mechanisms of high-dose effects on biological samples. In this study, a desktop laser plasma soft X-ray source developed for radiobiology research is presented. The source is based on a double-stream gas puff target irradiated with a commercial Nd:YAG laser (EKSPLA), which generates laser pulses of 4 ns duration and energy up to 800 mJ at a 10 Hz repetition rate. The source has been optimized for maximum emission in the "water window" wavelength range from 2.3 nm to 4.4 nm by using pure gases (argon, nitrogen, and krypton) and spectral filtering. Results of the source characterization measurements and dosimetry of the produced soft X-ray radiation are shown and discussed. The high brightness of the source and the low penetration depth of the radiation in biological specimens allow a high dose to be delivered to the specimen, over 28 Gy/shot and 280 Gy/s at the maximum repetition rate of the laser system. The source has a unique capability for irradiating cells with high pulse doses both in vacuum and in a helium environment. Demonstration of the source's ability to induce DNA double- and single-strand breaks will be discussed.

Keywords: laser produced plasma, soft X-rays, radiobiology experiments, dosimetry

Procedia PDF Downloads 555
85 Noise Mitigation Techniques to Minimize Electromagnetic Interference/Electrostatic Discharge Effects for the Lunar Mission Spacecraft

Authors: Vabya Kumar Pandit, Mudit Mittal, N. Prahlad Rao, Ramnath Babu

Abstract:

TeamIndus is the only Indian team competing for the Google Lunar XPRIZE (GLXP), a global competition challenging private entities to soft-land a rover on the moon, travel a minimum of 500 meters, and transmit high-definition images and videos to Earth. Towards this goal, the TeamIndus strategy is to design and develop a lunar lander that will deliver a rover onto the surface of the moon to accomplish the GLXP mission objectives. This paper showcases the system-level noise control techniques adopted by the Electrical Distribution System (EDS) team to achieve the required Electromagnetic Compatibility (EMC) of the spacecraft. The design guidelines followed to control electromagnetic interference through proper electronic package design, grounding, shielding, filtering, and cable routing within the stipulated mass budget are explained. The paper also deals with the challenges of achieving electromagnetic cleanliness in the presence of various Commercial Off-The-Shelf (COTS) and in-house developed components. The methods of minimizing Electrostatic Discharge (ESD) by identifying potential noise sources and areas susceptible to charge accumulation, and the methodology for preventing arcing inside the spacecraft, are explained. The paper then provides the EMC requirements matrix derived from the mission requirements to meet the overall electromagnetic compatibility of the spacecraft.

Keywords: electromagnetic compatibility, electrostatic discharge, electrical distribution systems, grounding schemes, light weight harnessing

Procedia PDF Downloads 270
84 Identification of Spam Keywords Using Hierarchical Category in C2C E-Commerce

Authors: Shao Bo Cheng, Yong-Jin Han, Se Young Park, Seong-Bae Park

Abstract:

Consumer-to-Consumer (C2C) e-commerce has been growing at very high speed in recent years. Since identical or nearly identical kinds of products compete with one another through keyword search in C2C e-commerce, some sellers describe their products with spam keywords that are popular but unrelated to the products. Though such products get more chances to be retrieved and selected by consumers than those without spam keywords, the spam keywords mislead consumers and waste their time. This problem has been reported in many commercial services like eBay and Taobao, but there has been little research on solving it. As a solution, this paper proposes a method to classify whether the keywords of a product are spam or not. The proposed method assumes that a keyword for a given product is more reliable if it is observed commonly in the specifications of products that are the same as, or of the same kind as, the given product. This is because the hierarchical category of a product is in general determined precisely by the seller of the product, and so is the specification of the product. Since higher layers of the hierarchical category represent more general kinds of products, a reliability degree is determined differently according to the layer. Hence, reliability degrees from different layers of a hierarchical category become features for keywords, and they are used together with features from the specifications alone for classification of the keywords. Support Vector Machines are adopted as the basic classifier over these features, since they are powerful and widely used in many classification tasks. In the experiments, the proposed method is evaluated with a gold-standard dataset from Yi-han-wang, a Chinese C2C e-commerce service, and compared with a baseline method that does not consider the hierarchical category. The experimental results show that the proposed method outperforms the baseline in F1-measure, which proves that spam keywords are effectively identified by a hierarchical category in C2C e-commerce.
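
A hedged sketch of the classification step with scikit-learn, where each keyword's feature vector concatenates specification-based features with one reliability score per category layer; the feature layout and placeholder data are assumptions, not the paper's dataset.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.random((200, 8))             # placeholder: spec features + layer scores
y_train = rng.integers(0, 2, 200)          # placeholder: 1 = spam keyword

clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
is_spam = clf.predict(rng.random((5, 8)))  # classify keywords of new product listings
```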

Keywords: spam keyword, e-commerce, keyword features, spam filtering

Procedia PDF Downloads 269