Search results for: Wiener filtering
97 Quantum Engine Proposal Using Two-Level Atom-Like Manipulation and Relativistic Motoring Control
Authors: Montree Bunruangses, Sonath Bhattacharyya, Somchat Sonasang, Preecha Yupapin
Abstract:
A two-level system is manipulated by a microstrip add-drop circuit configured as an atom-like system for investigating wave-particle behavior when the traveling speed along the circuit perimeter is the speed of light. The entangled pair formed by the upper and lower sideband peaks is bound by the angular displacement, which is given by 0≤θ≤π/2. The control signals associated with the 3-peak signal frequencies are applied by the external inputs via the microstrip add-drop multiplexer ports, where they are functions of time without a space term involved. When the system satisfies the speed-of-light condition, the mass term is converted to energy according to the relativistic limit described by the Lorentz factor and the Einstein equation. The different applied frequencies can be utilized to form 3-phase torques that can be applied to quantum engines. The experiment will use the two-level system circuit and be conducted in the laboratory. The 3-phase torques will be recorded and investigated for quantum engine driving purposes, and the obtained results will be compared to the simulation. The optimum amplification of torque can be obtained by the resonant successive filtering operation. Torque vanishes when the system is balanced at the stopped position, where |Time|=0, which is required as a system stability condition. Future applications will be discussed; a larger device may be tested in the future for realistic use, and synchronous and asynchronous driven motors are also discussed for warp drive use.
Keywords: quantum engine, relativistic motor, 3-phase torque, atomic engine
Procedia PDF Downloads 63
96 Integrated Geotechnical and Geophysical Investigation of a Proposed Construction Site at Mowe, Southwestern Nigeria
Authors: Kayode Festus Oyedele, Sunday Oladele, Adaora Chibundu Nduka
Abstract:
The subsurface of a proposed site for building development in Mowe, Nigeria, was investigated using the Standard Penetration Test (SPT) and Cone Penetrometer Test (CPT), supplemented with Horizontal Electrical Profiling (HEP), with the aim of evaluating the suitability of the strata as foundation materials. Four SPT and CPT soundings were carried out using a 10-tonne hammer. HEP utilizing the Wenner array was performed with inter-electrode spacings of 10–60 m along four traverses coincident with each of the SPT and CPT locations. The HEP data were processed using DIPRO software, and textural filtering of the resulting resistivity sections was implemented to enable delineation of hidden layers. Sandy lateritic clay, silty lateritic clay, clay, clayey sand and sand horizons were delineated. The SPT "N" values defined very soft to soft sandy lateritic clay (<4), stiff silty lateritic clay (7–12), very stiff silty clay (12–15), clayey sand (15–20) and sand (27–37). Sandy lateritic clay (5–40 kg/cm2) and silty lateritic clay (25–65 kg/cm2) were defined from the CPT response. Sandy lateritic clay (220–750 Ωm), clay (<50 Ωm) and sand (415–5359 Ωm) were delineated from the resistivity sections, with two thin layers of silty lateritic clay and clayey sand defined in the texturally filtered resistivity sections. This study concluded that the presence of incompetent thick clayey materials (18 m) beneath the study area makes it unsuitable for shallow foundations. A deep foundation involving piling through the clayey layers to the competent sand at 20 m depth was recommended.
Keywords: cone penetrometer, foundation, lithologic texture, resistivity section, standard penetration test
Procedia PDF Downloads 265
95 Development of a Laboratory Laser-Produced Plasma “Water Window” X-Ray Source for Radiobiology Experiments
Authors: Daniel Adjei, Mesfin Getachew Ayele, Przemyslaw Wachulak, Andrzej Bartnik, Luděk Vyšín, Henryk Fiedorowicz, Inam Ul Ahad, Lukasz Wegrzynski, Anna Wiechecka, Janusz Lekki, Wojciech M. Kwiatek
Abstract:
Laser-produced plasma light sources emitting high-intensity pulses of X-rays and delivering high doses are useful for understanding the mechanisms of high-dose effects on biological samples. In this study, a desktop laser plasma soft X-ray source developed for radiobiology research is presented. The source is based on a double-stream gas puff target irradiated with a commercial Nd:YAG laser (EKSPLA), which generates laser pulses of 4 ns duration and energy up to 800 mJ at a 10 Hz repetition rate. The source has been optimized for maximum emission in the "water window" wavelength range from 2.3 nm to 4.4 nm by using pure gases (argon, nitrogen and krypton) and spectral filtering. Results of the source characterization measurements and dosimetry of the produced soft X-ray radiation are shown and discussed. The high brightness of the laser-produced plasma soft X-ray source and the low penetration depth of the produced X-ray radiation in biological specimens allow a high dose rate to be delivered to the specimen: over 28 Gy per shot, and 280 Gy/s at the maximum repetition rate of the laser system. The source has a unique capability for irradiation of cells with high pulse doses both in vacuum and in a He environment. The ability of the source to induce DNA double- and single-strand breaks will be discussed.
Keywords: laser produced plasma, soft X-rays, radiobiology experiments, dosimetry
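As a sanity check on the quoted dose figures, the per-second dose rate follows directly from the per-shot dose and the laser repetition rate (both taken from the abstract above); a minimal sketch:

```python
# Pulsed-source dose rate = dose per pulse x repetition rate.
dose_per_shot_gy = 28.0   # Gy delivered per laser shot (from the abstract)
rep_rate_hz = 10.0        # Nd:YAG repetition rate in Hz (from the abstract)

dose_rate_gy_per_s = dose_per_shot_gy * rep_rate_hz
print(dose_rate_gy_per_s)  # 280.0 Gy/s, matching the quoted maximum
```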
Procedia PDF Downloads 588
94 Noise Mitigation Techniques to Minimize Electromagnetic Interference/Electrostatic Discharge Effects for the Lunar Mission Spacecraft
Authors: Vabya Kumar Pandit, Mudit Mittal, N. Prahlad Rao, Ramnath Babu
Abstract:
TeamIndus is the only Indian team competing for the Google Lunar XPRIZE (GLXP). The GLXP is a global competition challenging private entities to soft-land a rover on the moon, travel a minimum of 500 meters and transmit high-definition images and videos to Earth. Towards this goal, the TeamIndus strategy is to design and develop a lunar lander that will deliver a rover onto the surface of the moon to accomplish the GLXP mission objectives. This paper showcases the various system-level noise control techniques adopted by the Electrical Distribution System (EDS) to achieve the required Electromagnetic Compatibility (EMC) of the spacecraft. The design guidelines followed to control Electromagnetic Interference through proper electronic package design, grounding, shielding, filtering, and cable routing within the stipulated mass budget are explained. The paper also deals with the challenges of achieving electromagnetic cleanliness in the presence of various Commercial Off-The-Shelf (COTS) and in-house developed components. The methods of minimizing Electrostatic Discharge (ESD) by identifying the potential noise sources and the areas susceptible to charge accumulation, and the methodology to prevent arcing inside the spacecraft, are explained. The paper then provides the EMC requirements matrix derived from the mission requirements to meet the overall electromagnetic compatibility of the spacecraft.
Keywords: electromagnetic compatibility, electrostatic discharge, electrical distribution systems, grounding schemes, light weight harnessing
Procedia PDF Downloads 293
93 Identification of Spam Keywords Using Hierarchical Category in C2C E-Commerce
Authors: Shao Bo Cheng, Yong-Jin Han, Se Young Park, Seong-Bae Park
Abstract:
Consumer-to-Consumer (C2C) e-commerce has been growing at a very high speed in recent years. Since identical or nearly identical kinds of products compete with one another by relying on keyword search in C2C e-commerce, some sellers describe their products with spam keywords that are popular but unrelated to their products. Though such products get more chances to be retrieved and selected by consumers than those without spam keywords, the spam keywords mislead consumers and waste their time. This problem has been reported in many commercial services like eBay and Taobao, but there has been little research on solving it. As a solution, this paper proposes a method to classify whether the keywords of a product are spam or not. The proposed method assumes that a keyword for a given product is more reliable if the keyword is observed commonly in the specifications of products that are the same as, or of the same kind as, the given product. This is because the hierarchical category of a product is, in general, determined precisely by the seller of the product, and so is the specification of the product. Since higher layers of the hierarchical category represent more general kinds of products, a reliability degree is determined differently according to the layers. Hence, reliability degrees from different layers of a hierarchical category become features for keywords, and they are used together with features from specifications alone for classification of the keywords. Support Vector Machines are adopted as the basic classifier using these features, since they are powerful and widely used in many classification tasks. In the experiments, the proposed method is evaluated with a gold-standard dataset from Yi-han-wang, a Chinese C2C e-commerce service, and is compared with a baseline method that does not consider the hierarchical category.
The experimental results show that the proposed method outperforms the baseline in F1-measure, which proves that spam keywords are effectively identified by a hierarchical category in C2C e-commerce.
Keywords: spam keyword, e-commerce, keyword features, spam filtering
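The classification idea above can be sketched with a toy example. The features are the per-layer reliability degrees plus a specification score; the paper trains a Support Vector Machine, but as a dependency-free stand-in this sketch uses a simple perceptron (also a linear classifier), and all feature values and labels here are invented for illustration:

```python
# Stand-in linear classifier for the keyword spam/non-spam decision.
# Features: reliability degree of the keyword at each category layer,
# plus a specification-match score. (The paper uses an SVM; a perceptron
# is used here only to keep the sketch dependency-free.)
def train_perceptron(X, y, epochs=20, lr=0.1):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):  # yi in {-1, +1}
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) <= 0:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

# [reliability at top layer, mid layer, leaf layer, spec-match score]
X = [[0.9, 0.8, 0.7, 0.9], [0.8, 0.7, 0.6, 0.8],   # genuine keywords
     [0.3, 0.1, 0.0, 0.1], [0.4, 0.2, 0.1, 0.0]]   # spam keywords
y = [+1, +1, -1, -1]

w, b = train_perceptron(X, y)
is_genuine = lambda x: sum(wj * xj for wj, xj in zip(w, x)) + b > 0
print(is_genuine([0.85, 0.75, 0.65, 0.8]), is_genuine([0.2, 0.1, 0.0, 0.0]))
```

A keyword that is common at every layer of the product's category scores as genuine; one that is popular overall but absent from the category scores as spam.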
Procedia PDF Downloads 294
92 Implementation of a Monostatic Microwave Imaging System Using a UWB Vivaldi Antenna
Authors: Babatunde Olatujoye, Binbin Yang
Abstract:
Microwave imaging is a portable, noninvasive, and non-ionizing imaging technique that employs low-power microwave signals to reveal objects in the microwave frequency range. This technique has immense potential for adoption in commercial and scientific applications such as security scanning, material characterization, and nondestructive testing. This work presents a monostatic microwave imaging setup using an ultra-wideband (UWB), low-cost, miniaturized Vivaldi antenna with a bandwidth of 1–6 GHz. The backscattered signals (S-parameters) of the Vivaldi antenna used for scanning targets were measured in the lab using a VNA. An automated two-dimensional (2-D) scanner was employed to move the transceiver and collect the measured scattering data from different positions. The targets consist of four metallic objects, each with a distinct shape. A similar setup was also simulated in Ansys HFSS. A high-resolution Back Propagation Algorithm (BPA) was applied to both the simulated and experimental backscattered signals. The BPA utilizes the phase and amplitude information recorded over a two-dimensional aperture of 50 cm × 50 cm with a discrete step size of 2 cm to reconstruct a focused image of the targets. The adoption of the BPA was demonstrated by coherently resolving and reconstructing reflection signals from conventional time-of-flight profiles. For both the simulated and experimental data, the BPA accurately reconstructed a high-resolution 2-D image of the targets in terms of shape and location. An improvement of the BPA, in terms of target resolution, was achieved by applying a filtering method in the frequency domain.
Keywords: back propagation, microwave imaging, monostatic, Vivaldi antenna, ultra wideband
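The coherent focusing step of the back-propagation algorithm can be sketched in a few lines: simulate the round-trip phase of a single point scatterer, then re-apply the conjugate phase over an image grid and sum. The geometry, frequency sweep, and grid below are illustrative assumptions, not the paper's setup:

```python
# Minimal delay-and-sum back-propagation sketch for monostatic imaging.
import cmath

c = 3e8
freqs = [1e9 + i * 0.5e9 for i in range(11)]         # 1-6 GHz sweep
antenna_x = [-0.25 + 0.05 * i for i in range(11)]    # scan positions (m), y = 0
target = (0.05, 0.30)                                # true scatterer (m)

def dist(ax, p):
    return ((p[0] - ax) ** 2 + p[1] ** 2) ** 0.5

# Simulated backscatter: round-trip (factor 2) phase delay to the target.
S = {(ax, f): cmath.exp(-1j * 2 * (2 * cmath.pi * f / c) * dist(ax, target))
     for ax in antenna_x for f in freqs}

# Back-propagation: re-apply the conjugate phase at each pixel, sum coherently.
best, best_val = None, -1.0
for ix in range(11):
    for iy in range(11):
        p = (-0.25 + 0.05 * ix, 0.05 + 0.05 * iy)
        v = abs(sum(S[(ax, f)] *
                    cmath.exp(1j * 2 * (2 * cmath.pi * f / c) * dist(ax, p))
                    for ax in antenna_x for f in freqs))
        if v > best_val:
            best, best_val = p, v
print(best)  # the focused image peaks at the true target position
```

All position-frequency samples add in phase only at the true scatterer location, which is why the coherent sum peaks there.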
Procedia PDF Downloads 20
91 Creating Risk Maps on the Spatiotemporal Occurrence of Agricultural Insecticides in Sub-Saharan Africa
Authors: Chantal Hendriks, Harry Gibson, Anna Trett, Penny Hancock, Catherine Moyes
Abstract:
The use of modern inputs for crop protection, such as insecticides, is strongly underestimated in Sub-Saharan Africa. Several studies have measured toxic concentrations of insecticides in fruits, vegetables and fish cultivated in Sub-Saharan Africa. The use of agricultural insecticides affects human and environmental health, but it also has the potential to influence insecticide resistance in malaria-transmitting mosquitoes. To analyse associations between the historic use of agricultural insecticides and the distribution of insecticide resistance through space and time, the use and environmental fate of agricultural insecticides need to be mapped over the same time period. However, data on the use and environmental fate of agricultural insecticides in Africa are limited, and therefore risk maps of the spatiotemporal occurrence of agricultural insecticides are created using environmental data. Environmental data on crop density and crop type were used to select the areas that most likely receive insecticides. These areas were verified by a literature review and expert knowledge. Pesticide fate models were compared to select the dominant processes that are involved in the environmental fate of insecticides and that can be mapped at a continental scale. The selected processes include surface runoff, erosion, infiltration, volatilization, and the storing and filtering capacity of soils. The processes indicate the risk of insecticide accumulation in soil, water, sediment and air. A compilation of all available data for traces of insecticides in the environment was used to validate the maps. The risk maps can result in space- and time-specific measures that reduce the risk of insecticide exposure to non-target organisms.
Keywords: crop protection, pesticide fate, tropics, insecticide resistance
Procedia PDF Downloads 141
90 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronted by medical practitioners is the long time needed to browse, filter, summarize and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions for evidence-based practice, extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process: filtering relevant articles, followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments that empower our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO questions. Our bootstrapping patched with attention shows the relevance of the evidence collected, based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
Procedia PDF Downloads 47
89 Clouds Influence on Atmospheric Ozone from GOME-2 Satellite Measurements
Authors: S. M. Samkeyat Shohan
Abstract:
This study is mainly focused on the determination and analysis of the photolysis rate of atmospheric, specifically tropospheric, ozone as a function of cloud properties throughout the year 2007. The observational basis for ozone concentrations and cloud properties is the measurement data set of the Global Ozone Monitoring Experiment-2 (GOME-2) sensor on board the polar-orbiting Metop-A satellite. Two different spectral ranges are used: ozone total columns are calculated from the wavelength window 325–335 nm, while cloud properties, such as cloud top height (CTH) and cloud optical thickness (COT), are derived from the absorption band of molecular oxygen centered at 761 nm. Cloud fraction (CF) is derived from measurements in the ultraviolet, visible and near-infrared range of GOME-2. First, ozone concentrations above clouds are derived from ozone total columns, subtracting the contribution of stratospheric ozone and filtering out those satellite measurements that have thin and low clouds. Then, the values of ozone photolysis derived from observations are compared with theoretical modeled results, in the latitudinal belts 5°N–5°S and 20°N–20°S, as functions of CF and COT. In general, good agreement is found between the data and the model, proving both the quality of the space-borne ozone and cloud properties and the modeling theory of the ozone photolysis rate. The discrepancies found can, however, amount to approximately 15%. Latitudinal seasonal changes in the photolysis rate of ozone are found to be negatively correlated with changes in upper-tropospheric ozone concentrations only in the autumn and summer months within the northern and southern tropical belts, respectively. This fact points to the entangled roles of temperature and nitrogen oxides in ozone production, which are superimposed on its sole photolysis induced by thick and high clouds in the tropics.
Keywords: cloud properties, photolysis rate, stratospheric ozone, tropospheric ozone
Procedia PDF Downloads 211
88 An Amended Method for Assessment of Hypertrophic Scars Viscoelastic Parameters
Authors: Iveta Bryjova
Abstract:
Recording of viscoelastic strain-vs-time curves with the aid of the suction method, and a follow-up analysis resulting in the evaluation of standard viscoelastic parameters, is a significant technique for non-invasive contact diagnostics of the mechanical properties of skin and the assessment of its condition, particularly in acute burns, hypertrophic scarring (the most common complication of burn trauma) and reconstructive surgery. To eliminate the contribution of skin thickness, the usable viscoelastic parameters deduced from the strain-vs-time curves are restricted to relative ones (i.e. those expressed as a ratio of two dimensional parameters), such as gross elasticity, net elasticity, biological elasticity or Qu's area parameters, in the literature and in practice conventionally referred to as R2, R5, R6, R7, Q1, Q2, and Q3. With the exception of parameters R2 and Q1, the remaining ones depend substantially on the position of the inflection point separating the elastic linear and viscoelastic segments of the strain-vs-time curve. The standard algorithm implemented in commercially available devices relies heavily on the experimental fact that the inflection comes about 0.1 s after the suction switch-on/off, which undermines the credibility of parameters thus obtained. Although Qu's US 7,556,605 patent suggests a method for improving the precision of the inflection determination, there is still room for non-negligible improvement. In this contribution, a novel method of inflection point determination utilizing the advantageous properties of Savitzky–Golay filtering is presented. The method allows computation of derivatives of the smoothed strain-vs-time curve, more exact location of the inflection and consequently more reliable values of the aforementioned viscoelastic parameters.
An improved applicability of the five inflection-dependent relative viscoelastic parameters is demonstrated by recasting a former study under the new method and comparing its results with those provided by the methods used so far.
Keywords: Savitzky–Golay filter, scarring, skin, viscoelasticity
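The idea of locating the inflection via Savitzky–Golay derivatives can be illustrated with a minimal sketch: a local quadratic least-squares fit yields a smoothed second derivative, whose zero marks the inflection. The curve below is a synthetic stand-in with a known inflection at t = 1 s, not a measured strain-vs-time record:

```python
# Savitzky-Golay second-derivative estimation and inflection search.
h = 0.05
t = [i * h for i in range(41)]               # 0 .. 2 s
y = [(ti - 1.0) ** 3 + ti for ti in t]       # synthetic curve, inflection at t = 1

# SG 2nd-derivative coefficients (quadratic fit, window of 5 samples):
# d2y/dt2 at the window center = sum(c * y) / (7 * h^2).
c2 = [2, -1, -2, -1, 2]

d2 = [sum(c * y[i + k - 2] for k, c in enumerate(c2)) / (7 * h * h)
      for i in range(2, len(y) - 2)]

# Inflection = interior sample where the smoothed 2nd derivative vanishes.
i_min = min(range(len(d2)), key=lambda i: abs(d2[i]))
t_inflection = t[i_min + 2]
print(t_inflection)  # ~1.0 s
```

Because the fit is a least-squares average over the window, the same convolution also suppresses measurement noise, which is what makes the derivative-based inflection search usable on real suction-device recordings.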
Procedia PDF Downloads 304
87 Theoretical Analysis of Mechanical Vibration for Offshore Platform Structures
Authors: Saeed Asiri, Yousuf Z. AL-Zahrani
Abstract:
A new class of support structures, called periodic structures, is introduced in this paper as a viable means of isolating the vibration transmitted from sea waves to offshore platform structures through their legs. A passive approach to reducing the transmitted vibration generated by waves is presented. The approach utilizes the property of periodic structural components that creates stop and pass bands. The stop-band regions can be tailored to correspond to regions of the frequency spectra that contain harmonics of the wave frequency, attenuating the response in those regions. A periodic structural component is comprised of a repeating array of cells, which are themselves assemblies of elements. The elements may have differing material properties as well as geometric variations. For the purpose of this research, only geometric and material variations are considered, and each cell is assumed to be identical. A periodic leg is designed in order to reduce the transmitted vibration of sea waves. The effectiveness of the periodicity on the vibration levels of the platform is demonstrated theoretically. The theory governing the operation of this class of periodic structures is introduced using the transfer matrix method. The unique filtering characteristics of periodic structures are demonstrated as functions of their design parameters for structures with geometric and material discontinuities. The propagation factor is determined using spectral finite element analysis, and the effectiveness of the design of the leg structure is demonstrated by changing the ratio of step lengths and the interface area between the materials in order to find the propagation factor and frequency response.
Keywords: vibrations, periodic structures, offshore, platforms, transfer matrix method
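The transfer matrix method mentioned above can be sketched for longitudinal waves in a bi-material periodic leg: multiply the two segment matrices of one cell and apply the Bloch condition, where |trace|/2 ≤ 1 gives a pass band and |trace|/2 > 1 a stop band. The segment lengths, wave speeds, and impedances below are illustrative, not the paper's design values:

```python
# Transfer-matrix sketch for a 1-D bi-material periodic cell
# (state vector: axial displacement and internal force).
import math

def cell_trace_half(freq, segs):
    """Half-trace of the unit-cell transfer matrix at a given frequency."""
    T = [[1.0, 0.0], [0.0, 1.0]]
    for length, speed, rho_c_area in segs:
        phi = 2 * math.pi * freq * length / speed   # phase across the segment
        z = rho_c_area * 2 * math.pi * freq         # force/displacement scaling
        S = [[math.cos(phi), math.sin(phi) / z],
             [-z * math.sin(phi), math.cos(phi)]]
        T = [[S[0][0]*T[0][0] + S[0][1]*T[1][0], S[0][0]*T[0][1] + S[0][1]*T[1][1]],
             [S[1][0]*T[0][0] + S[1][1]*T[1][0], S[1][0]*T[0][1] + S[1][1]*T[1][1]]]
    return 0.5 * (T[0][0] + T[1][1])

# One cell: a stiff segment then a soft segment (length m, speed m/s, rho*c*A).
cell = [(0.5, 5000.0, 40.0), (0.5, 1000.0, 10.0)]

passband = abs(cell_trace_half(200.0, cell)) <= 1.0   # low frequency propagates
stopband = abs(cell_trace_half(2500.0, cell)) > 1.0   # this band is attenuated
print(passband, stopband)
```

In the stop band the Bloch propagation factor becomes real and positive, so each additional cell multiplies the attenuation, which is the isolation mechanism the periodic leg exploits.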
Procedia PDF Downloads 289
86 Reducing CO2 Emission Using EDA and Weighted Sum Model in Smart Parking System
Authors: Rahman Ali, Muhammad Sajjad, Farkhund Iqbal, Muhammad Sadiq Hassan Zada, Mohammed Hussain
Abstract:
Emission of carbon dioxide (CO2) has adversely affected the environment, and one of its major sources is transportation. In the last few decades, the increase in the mobility of people using vehicles has enormously increased CO2 emission into the environment. To reduce CO2 emission, a sustainable transportation system is required, in which smart parking is one of the important measures that need to be established. To contribute to the issue of reducing CO2 emission, this research proposes a smart parking system. A cloud-based solution is provided to drivers, which automatically searches for and recommends the most preferred parking slots. To determine preferences among the parking areas, the methodology exploits a number of unique parking features, which ultimately results in the selection of a parking area that leads to a minimum level of CO2 emission from the current position of the vehicle. To realize the methodology, a scenario-based implementation is considered. During the implementation, a mobile application with GPS signals, vehicles with a number of vehicle features and a list of parking areas with parking features are used by sorting, multi-level filtering, exploratory data analysis (EDA), the Analytical Hierarchy Process (AHP) and the weighted sum model (WSM) to rank the parking areas and recommend to drivers the top-k most preferred ones. In the EDA process, "2020testcar-2020-03-03", a freely available dataset, is used to estimate the CO2 emission of a particular vehicle. To evaluate the system, results of the proposed system are compared with the conventional approach, which reveals that the proposed methodology outperforms the conventional one in reducing the emission of CO2 into the atmosphere.
Keywords: car parking, CO2, CO2 reduction, IoT, merge sort, number plate recognition, smart car parking
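The weighted-sum-model ranking step can be sketched as follows; the criteria, weights, and candidate parking areas are invented for illustration, and the normalization is a simple min-max-style scaling:

```python
# Weighted sum model (WSM): normalize each criterion, weight it, sum, rank.
parkings = {
    # name: (distance_km, expected_free_slots, price_per_hour)
    "P1": (0.4, 20, 2.0),
    "P2": (1.2, 35, 1.0),
    "P3": (0.8, 5, 0.5),
}
weights = {"distance": 0.5, "slots": 0.3, "price": 0.2}  # short drives cut CO2

def score(p):
    d, s, c = p
    # Cost criteria (distance, price) scale down; benefit criteria scale up.
    return (weights["distance"] * (1 - d / 1.2) +
            weights["slots"] * (s / 35) +
            weights["price"] * (1 - c / 2.0))

ranked = sorted(parkings, key=lambda name: score(parkings[name]), reverse=True)
print(ranked)  # top-k recommendation order
```

The heaviest weight on distance makes the nearest lot win despite its higher price, which mirrors the CO2-minimizing intent of the proposed ranking.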
Procedia PDF Downloads 146
85 A Structured Mechanism for Identifying Political Influencers on Social Media Platforms: Top 10 Saudi Political Twitter Users
Authors: Ahmad Alsolami, Darren Mundy, Manuel Hernandez-Perez
Abstract:
Social media networks, such as Twitter, offer the perfect opportunity to affect, either positively or negatively, the political attitudes of large audiences. The existence of influential users, who have developed a reputation for their knowledge and experience of specific topics, is a major factor contributing to this impact. Therefore, knowledge of the mechanisms used to identify influential users on social media is vital for understanding their effect on their audience. The concept of the influential user is related to the concept of the 'opinion leader', which indicates that ideas first flow from mass media to opinion leaders and then to the rest of the population. Hence, the objective of this research was to provide a reliable and accurate structured mechanism for identifying influential users, which could be applied to different platforms, places, and subjects. Twitter was selected as the platform of interest, and Saudi Arabia as the context for the investigation. These were selected because Saudi Arabia has a large number of Twitter users, some of whom are considerably active in setting agendas and disseminating ideas. The study considered the scientific methods that have previously been used to identify public opinion leaders, utilizing metrics software on Twitter. The key findings propose multiple novel metrics for comparing Twitter influencers, including the number of followers, social authority and the use of political hashtags, together with four secondary filtering measures. Thus, using ratio and percentage calculations to classify the most influential users, Twitter accounts were filtered, analyzed and included. The structured approach is used as a mechanism to explore the top ten influencers on Twitter in the political domain in Saudi Arabia.
Keywords: Twitter, influencers, structured mechanism, Saudi Arabia
Procedia PDF Downloads 118
84 Enhancing Embedded System Efficiency with Digital Signal Processing Cores
Authors: Anil Dhanawade, Akshay S., Harshal Lakesar
Abstract:
This paper presents a comprehensive analysis of the performance advantages offered by DSP (Digital Signal Processing) cores compared to traditional MCU (Microcontroller Unit) cores in the execution of various functions critical to real-time applications. The focus is on the integration of DSP functionalities, specifically in the context of motor control applications such as Field-Oriented Control (FOC), trigonometric calculations, back-EMF estimation, digital filtering, and high-resolution PWM generation. Through comparative analysis, it is demonstrated that DSP cores significantly enhance processing efficiency, achieving faster execution times for complex mathematical operations essential for precise torque and speed control. The study highlights the capabilities of DSP cores, including single-cycle Multiply-Accumulate (MAC) operations and optimized hardware for trigonometric functions, which collectively reduce latency and improve real-time performance. In contrast, MCU cores, while capable of performing similar tasks, typically exhibit longer execution times due to reliance on software-based solutions and lack of dedicated hardware acceleration. The findings underscore the critical role of DSP cores in applications requiring high-speed processing and low-latency response, making them indispensable in automotive, industrial, and robotics sectors. This work serves as a reference for future developments in embedded systems, emphasizing the importance of architecture choice in achieving optimal performance in demanding computational tasks.
Keywords: assembly code, DSP core, instruction set, MCU core
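The single-cycle MAC advantage can be made concrete with a toy FIR filter, the canonical chain of multiply-accumulate operations; the per-MAC cycle counts below are illustrative assumptions, not measured figures for any particular core:

```python
# FIR filtering is a chain of multiply-accumulate (MAC) operations: a DSP
# core with hardware MAC support retires roughly one per cycle, while a
# plain MCU typically spends several instructions per tap in software.
def fir(samples, taps):
    out, macs = [], 0
    hist = [0.0] * len(taps)            # delay line, newest sample first
    for x in samples:
        hist = [x] + hist[:-1]
        acc = 0.0
        for h, s in zip(taps, hist):
            acc += h * s                # one MAC per tap
            macs += 1
        out.append(acc)
    return out, macs

taps = [0.25, 0.25, 0.25, 0.25]         # 4-tap moving average
y, macs = fir([1.0, 1.0, 1.0, 1.0, 1.0], taps)

dsp_cycles = macs                       # ~1 cycle per MAC (hardware MAC unit)
mcu_cycles = macs * 4                   # assumed ~4 cycles per software MAC
print(y[-1], dsp_cycles, mcu_cycles)
```

Even for this tiny filter the assumed software MAC overhead quadruples the cycle count; in a real FOC loop the same factor applies to every tap of every filter, every control period.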
Procedia PDF Downloads 19
83 Raman Scattering Broadband Spectrum Generation in Compact Yb-Doped Fiber Laser
Authors: Yanrong Song, Zikai Dong, Runqin Xu, Jinrong Tian, Kexuan Li
Abstract:
The nonlinear polarization rotation (NPR) technique has become one of the main techniques for achieving mode-locked fiber lasers due to its compactness, ease of implementation, and low cost. In this paper, we demonstrate a compact mode-locked Yb-doped fiber laser based on the NPR technique in the all-normal-dispersion (ANDi) regime. There is no physical filter or polarization controller in the laser cavity. A mode-locked pulse train is achieved in the ANDi regime based on the NPR technique; the filtering effect induced by the fiber birefringence is the main reason for mode-locking. After that, an extra 20 m long single-mode fiber is inserted at two different positions, and dissipative soliton operation and noise-like pulse operation are achieved, respectively. The nonlinear effect is obviously enhanced in the noise-like pulse regime, and a broadband spectrum is generated owing to the enhanced stimulated Raman scattering effect. When the pump power is 210 mW, the central wavelength is 1030 nm, and the corresponding 1st-order Raman Stokes wave is generated at 1075 nm. When the pump power is 370 mW, the 1st- and 2nd-order Raman Stokes waves are generated at 1080 nm and 1126 nm, respectively. When the pump power is 600 mW, the Raman continuum is generated with cascaded multi-order Stokes waves, and the spectrum extends to 1188 nm. The total flat spectrum spans from 1000 nm to 1200 nm. The maximum output average power and pulse energy are 18.0 W and 14.75 nJ, respectively.
Keywords: fiber laser, mode-locking, nonlinear polarization rotation, Raman scattering
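The reported Stokes wavelengths are consistent with repeated down-shifting of the 1030 nm pump by the Raman shift of fused silica. A quick frequency-domain check, assuming the nominal ~13.2 THz silica Raman gain peak (a textbook value, not taken from the abstract):

```python
# Cascaded Raman Stokes orders: each order is down-shifted from the
# previous one by the (assumed) 13.2 THz silica Raman shift.
c = 299792.458          # nm * THz (speed of light)

pump_nm = 1030.0
shift_thz = 13.2        # nominal silica Raman gain peak (assumption)

nu = c / pump_nm        # optical frequency of the pump in THz
stokes = []
for _ in range(3):
    nu -= shift_thz     # one more cascade order, one more down-shift
    stokes.append(c / nu)

print([round(s) for s in stokes])  # close to the reported 1075-1080, 1126, 1188 nm
```

The computed orders land within about 0.5% of the measured Stokes peaks, which is the expected level of agreement given that the silica Raman gain band is several THz wide.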
Procedia PDF Downloads 221
82 Roof and Road Network Detection through Object Oriented SVM Approach Using Low Density LiDAR and Optical Imagery in Misamis Oriental, Philippines
Authors: Jigg L. Pelayo, Ricardo G. Villar, Einstine M. Opiso
Abstract:
Advances in aerial laser scanning in the Philippines have opened up entire fields of research in remote sensing and machine vision that aspire to provide accurate and timely information for the government and the public. Rapid mapping of polygonal road and roof boundaries is one such utilization, offering applications in disaster risk reduction, mitigation and development. The study uses low-density LiDAR data and high-resolution aerial imagery through an object-oriented approach, considering the theoretical concepts of data analysis subjected to a machine learning algorithm to minimize the constraints of feature extraction. Since separating one class from another requires distinct regions of a multi-dimensional feature space, non-trivial computation for distribution fitting was implemented to formulate the learned ideal hyperplane. Customized hybrid features were generated and then used to improve the classifier findings. Supplemental algorithms for filtering and reshaping object features were developed in the rule set to enhance the final product. Several advantages in terms of simplicity, applicability, and process transferability are noticeable in the methodology. The algorithm was tested in different random locations of Misamis Oriental province in the Philippines, demonstrating robust performance, with an overall accuracy greater than 89% and potential for semi-automation. The extracted results will become a vital requirement for decision makers, urban planners and even the commercial sector in various assessment processes.
Keywords: feature extraction, machine learning, OBIA, remote sensing
Procedia PDF Downloads 362
81 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation
Authors: Giuseppina Settanni, Antonio Panarese, Raffaele Vaira, Maurizio Galiano
Abstract:
Nowadays, artificial intelligence is used successfully in academia and industry for its ability to learn from large amounts of data. In particular, in recent years the use of machine learning algorithms in the field of e-commerce has spread worldwide. In this research study, a prototype software platform was designed and implemented in order to suggest to users the products most suitable for their needs. The platform includes a chatbot and a recommender system based on artificial intelligence algorithms that provide suggestions and decision support to the customer. Recommendation systems perform the important function of automatically filtering and personalizing information, thus making manageable the information overload to which the user is exposed on a daily basis. Recently, international research has experimented with the use of machine learning technologies with the aim of increasing the potential of traditional recommendation systems. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques, that allow the user to interact with the system, express their requests and receive suggestions. The interested user can access the web platform on the internet using a computer, tablet or mobile phone, register, provide the necessary information and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that allows the various functions with which the platform is equipped to be used in an intuitive and simple way. The artificial intelligence algorithms have been implemented and trained on historical data collected from user browsing. Finally, the testing phase allowed the implemented model to be validated; it will be further tested by letting customers use it.
Keywords: machine learning, recommender system, software platform, support vector machine
Procedia PDF Downloads 134
80 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example
Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang
Abstract:
Background: Recent advances in high-throughput research technologies such as next-generation sequencing and multi-dimensional liquid chromatography make it possible to dissect the complete transcriptome and proteome in a single run for the first time. However, it is almost impossible for many laboratories to handle and analyze these "big" data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform that lets users with only limited bio-computing knowledge study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system. C-eXpress takes a simple text file containing standard NCBI gene or protein IDs and expression levels (rpkm or fold) as its input and generates a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows users to filter the dataset on various gene features. A dynamic summary chart is generated automatically after each filtering step. Results: We implemented an integrated database containing pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and GO biological process; subcellular localization based on GO cellular component; and functional classification based on GO molecular function, kinase, peptidase, and transporter. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
Keywords: cancer, visualization, database, functional annotation
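The filter-then-summarise loop described above (filter the table on a gene feature, then regenerate the summary chart) can be sketched as follows. The gene IDs, rpkm values, and pathway labels are invented, and the real platform works on NCBI-annotated records rather than plain dictionaries.

```python
# Filter a gene-expression table (gene ID -> rpkm) and summarise per pathway,
# mimicking the filter-then-summarise cycle C-eXpress exposes in the browser.
table = [
    {"id": "GENE1", "rpkm": 120.0, "pathway": "glycolysis"},
    {"id": "GENE2", "rpkm": 3.5,   "pathway": "glycolysis"},
    {"id": "GENE3", "rpkm": 45.0,  "pathway": "kinase"},
]

def filter_and_summarise(rows, min_rpkm):
    # Keep rows above the expression threshold, then count survivors per pathway
    # (the data behind a regenerated summary chart).
    kept = [r for r in rows if r["rpkm"] >= min_rpkm]
    summary = {}
    for r in kept:
        summary[r["pathway"]] = summary.get(r["pathway"], 0) + 1
    return kept, summary

kept, summary = filter_and_summarise(table, 10.0)
print(summary)  # {'glycolysis': 1, 'kinase': 1}
```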
Procedia PDF Downloads 619
79 Real-Time Radar Tracking Based on Nonlinear Kalman Filter
Authors: Milca F. Coelho, K. Bousson, Kawser Ahmed
Abstract:
Accurately tracking an aerospace vehicle in a time-critical situation and in a highly nonlinear environment is one of the strongest interests within the aerospace community. Tracking is achieved by accurately estimating the state of a moving target, a set of variables that provides a complete description of the system at a given time. One of the main ingredients of good estimation performance is the use of efficient estimation algorithms. A well-known framework is Kalman filtering, designed for prediction and estimation problems. The success of the Kalman Filter (KF) in engineering applications is mostly due to the Extended Kalman Filter (EKF), which is based on local linearization. Despite its popularity, the EKF has several limitations. To address these limitations, and as a possible solution to tracking problems, this paper proposes the use of the Ensemble Kalman Filter (EnKF). Although the EnKF is used extensively in weather forecasting, where it is recognized for producing accurate and computationally efficient estimates on systems of very high dimension, it is almost unknown to the tracking community. The EnKF was initially proposed as an attempt to improve the error covariance calculation, which is difficult to implement in the classic Kalman Filter. In the EnKF, the prediction and analysis error covariances have ensemble representations whose sizes limit the number of degrees of freedom, so that the filter's error covariance calculations become far more practical for modest ensemble sizes. In this paper, a realistic simulation of radar tracking was performed, in which the EnKF was applied and compared with the Extended Kalman Filter. The results suggest that the EnKF is a promising tool for tracking applications, offering advantages in terms of performance.
Keywords: Kalman filter, nonlinear state estimation, optimal tracking, stochastic environment
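For readers unfamiliar with the EnKF, a minimal scalar sketch of one stochastic (perturbed-observation) analysis step is shown below. The actual radar-tracking filter operates on a multivariate, nonlinear state; all numbers here are illustrative.

```python
import random

random.seed(0)

def enkf_update(ensemble, obs, obs_var):
    # Stochastic EnKF analysis step for a scalar state observed directly.
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)   # ensemble covariance
    k = p / (p + obs_var)                                   # Kalman gain
    # Perturbed-observation update: each member assimilates a jittered copy
    # of the observation, which keeps the analysis spread statistically consistent.
    return [x + k * (obs + random.gauss(0.0, obs_var ** 0.5) - x) for x in ensemble]

prior = [random.gauss(10.0, 2.0) for _ in range(200)]       # forecast ensemble
posterior = enkf_update(prior, obs=13.0, obs_var=1.0)

prior_mean = sum(prior) / len(prior)
post_mean = sum(posterior) / len(posterior)
# The analysis mean moves toward the observation and the spread contracts.
```

Note how the error covariance `p` is estimated directly from the ensemble spread rather than propagated analytically, which is exactly the property that makes the EnKF practical in high dimensions.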
Procedia PDF Downloads 147
78 Application of Single Tuned Passive Filters in Distribution Networks at the Point of Common Coupling
Authors: M. Almutairi, S. Hadjiloucas
Abstract:
The harmonic distortion of voltage is important to power quality because of the interaction between the large number of non-linear and time-varying single-phase and three-phase loads and the power supply system. Harmonic distortion levels can be reduced by improving the design of polluting loads or by adding filters and related arrangements. The application of passive filters is an effective solution to harmonic mitigation, mainly because such filters offer high efficiency and simplicity and are economical. Additionally, their different possible frequency response characteristics can be exploited to achieve specific harmonic filtering targets. With these ideas in mind, the objective of this paper is to determine the size of single tuned passive filter that works best in distribution networks, in order to economically limit violations at a given point of common coupling (PCC). This article suggests that a single tuned passive filter could be employed in typical industrial power systems. Furthermore, constrained optimization can be used to find the optimal sizing of the passive filter in order to reduce both harmonic voltages and harmonic currents in the power system to an acceptable level and, thus, improve the load power factor. The optimization technique minimizes voltage total harmonic distortion (VTHD) and current total harmonic distortion (ITHD) while maintaining a given power factor within a specified range. In accordance with IEEE Standard 519, both indices are treated as constraints of the optimal passive filter design problem. The performance of this technique is discussed using numerical examples taken from previous publications.
Keywords: harmonics, passive filter, power factor, power quality
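A single tuned filter is sized by fixing the capacitor from the desired reactive power and then choosing the inductor so the series branch resonates at the target harmonic. The sketch below applies the textbook relations X_C = V²/Q and X_L = X_C/h²; the 11 kV, 1.2 Mvar, 5th-harmonic figures are assumptions for illustration, not values from the paper.

```python
import math

def single_tuned_filter(v_ll, q_var, h, f=50.0):
    # Size a single tuned shunt filter: the reactive power rating fixes C,
    # and the tuning harmonic h fixes L so that 1/(2*pi*sqrt(L*C)) = h*f.
    xc = v_ll ** 2 / q_var           # capacitive reactance at fundamental (ohms)
    c = 1.0 / (2 * math.pi * f * xc)
    xl = xc / h ** 2                 # inductive reactance at fundamental (ohms)
    l = xl / (2 * math.pi * f)
    f_tuned = 1.0 / (2 * math.pi * math.sqrt(l * c))
    return c, l, f_tuned

c, l, f_tuned = single_tuned_filter(v_ll=11e3, q_var=1.2e6, h=5)
print(round(f_tuned, 1))  # 250.0 -> tuned at the 5th harmonic of 50 Hz
```

In practice a small detuning (e.g. h = 4.9) and a quality-factor resistor are added; the constrained optimization the paper describes searches over such parameters subject to the IEEE 519 limits.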
Procedia PDF Downloads 306
77 A Structured Mechanism for Identifying Political Influencers on Social Media Platforms: Top 10 Saudi Political Twitter Users
Authors: Ahmad Alsolami, Darren Mundy, Manuel Hernandez-Perez
Abstract:
Social media networks, such as Twitter, offer the perfect opportunity to affect, positively or negatively, the political attitudes of large audiences. A most important factor contributing to this effect is the existence of influential users, who have developed a reputation for their awareness of and experience with specific subjects. Therefore, knowledge of the mechanisms for identifying influential users on social media is vital for understanding their effect on their audience. The concept of the influential user is based on the pioneering work of Katz and Lazarsfeld (1959), who created the concept of 'opinion leaders' to indicate that ideas first flow from mass media to opinion leaders and then to the rest of the population. Hence, the objective of this research was to provide a reliable and accurate structured mechanism for identifying influential users that could be applied to different platforms, places, and subjects. Twitter was selected as the platform of interest and Saudi Arabia as the context for the investigation, because Saudi Arabia has a large number of Twitter users, some of whom are considerably active in setting agendas and disseminating ideas. The study considered the scientific methods previously used to identify public opinion leaders, utilizing metrics software on Twitter. The key findings propose multiple novel metrics for comparing Twitter influencers, including the number of followers, social authority, and the use of political hashtags, along with four secondary filtering measures. Using ratio and percentage calculations to classify the most influential users, Twitter accounts were filtered, analyzed, and ranked. The structured approach is used as a mechanism to identify the top ten political influencers on Twitter in Saudi Arabia.
Keywords: Twitter, influencers, structured mechanism, Saudi Arabia
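The ratio-and-percentage classification might look like the following sketch, where each metric is scaled against the field maximum and the ratios averaged. The paper's actual metric set is richer (it adds four secondary filtering measures), and the accounts and counts below are invented.

```python
def influence_scores(accounts):
    # Normalise each metric to a 0-1 ratio of the field maximum, then average,
    # so followers, social authority, and political-hashtag use weigh equally.
    metrics = ("followers", "authority", "political_hashtags")
    maxima = {m: max(a[m] for a in accounts.values()) for m in metrics}
    return {
        name: sum(a[m] / maxima[m] for m in metrics) / len(metrics)
        for name, a in accounts.items()
    }

accounts = {
    "user_a": {"followers": 2_000_000, "authority": 80, "political_hashtags": 120},
    "user_b": {"followers": 1_000_000, "authority": 95, "political_hashtags": 300},
}
scores = influence_scores(accounts)
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # ['user_b', 'user_a'] -- fewer followers, but stronger overall profile
```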
Procedia PDF Downloads 137
76 IOT Based Automated Production and Control System for Clean Water Filtration Through Solar Energy Operated by Submersible Water Pump
Authors: Musse Mohamud Ahmed, Tina Linda Achilles, Mohammad Kamrul Hasan
Abstract:
The deterioration of mother nature is evident these days, with a clear danger of human catastrophe emanating from greenhouse gases (GHG) as CO2 emissions to the environment increase. PV technology can help reduce dependency on fossil fuels, decreasing air pollution and slowing the rate of global warming. The objective of this paper is to propose, design, and develop the production of a clean water supply for rural communities using an appropriate technology, such as the Internet of Things (IoT), that does not create any CO2 emissions. Additional goals are to maximize solar power output and, reciprocally, to minimize the effect of the natural intermittency of solar sources during periods of low sunlight. The paper presents the development of a critical automated control system for optimizing solar power output using several new techniques. A water pumping system was developed to supply clean water through the application of IoT and renewable energy. This system can provide a clean water supply to remote and off-grid areas using photovoltaic (PV) technology, which collects energy generated from sunlight. The focus of this work is to design and develop a submersible solar water pumping system with an IoT implementation. The system has been built and programmed using the Arduino IDE, Proteus, Matlab, and the C++ programming language. It pumps water from a reservoir, powered by solar energy, and clean water production is incorporated through a filtration system attached to the submersible solar pump. The filtering system is an additional application platform intended to provide a clean water supply to households in Sarawak State, Malaysia.
Keywords: IoT, automated production and control system, water filtration, automated submersible water pump, solar energy
Procedia PDF Downloads 90
75 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm
Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra
Abstract:
With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry, and have a huge number of applications: quality control, object detection, data reading (e.g., QR codes), and more. A large portion of them are used for measurement purposes, and some make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data acquired by active or passive methods. For this specific application in airfield technology, only passive methods are applicable, because other systems already operating on the site could be blinded across most spectral bands. Furthermore, the reconstruction is required to work over long distances, ranging from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to these requirements, HRESS (Heliport REmote Safeguard System) was developed. Its main part is a rotational head with a two-camera stereovision rig that gathers images through 360 degrees around the head, combined with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain increased distance measurement resolution and an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm includes operations for filtering erroneously collected data out of the point cloud. All activities on the programming, mechanical, and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area.
Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction
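The claimed long-range accuracy follows from the stereo range equation Z = fB/d, whose error grows with Z² but shrinks linearly with the disparity step, which is why sub-pixel matching matters. A sketch with assumed rig parameters (an 8000 px focal length and 2 m baseline, neither taken from the paper):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    # Pinhole stereo model: Z = f * B / d.
    return focal_px * baseline_m / disparity_px

def range_error(focal_px, baseline_m, z, disparity_step_px):
    # Depth uncertainty from a disparity quantisation step:
    # dZ ~= Z^2 / (f * B) * dd, so a finer (sub-pixel) step shrinks the error.
    return z ** 2 / (focal_px * baseline_m) * disparity_step_px

f_px, b = 8000.0, 2.0          # assumed long-lens rig parameters
z = 1000.0                     # 1 km target
print(range_error(f_px, b, z, 1.0))    # 62.5 m  -> whole-pixel matching
print(range_error(f_px, b, z, 0.05))   # 3.125 m -> sub-pixel matching, 20x finer
```

Under these assumed parameters, moving from whole-pixel to 1/20-pixel matching brings the 1 km range error from 6.25% down to about 0.3%, the same order as the ~3% accuracy quoted above.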
Procedia PDF Downloads 125
74 An Evolutionary Perspective on the Role of Extrinsic Noise in Filtering Transcript Variability in Small RNA Regulation in Bacteria
Authors: Rinat Arbel-Goren, Joel Stavans
Abstract:
Cell-to-cell variations in transcript or protein abundance, called noise, may give rise to phenotypic variability between isogenic cells, enhancing the probability of survival under stress conditions. These variations may be introduced by post-transcriptional regulatory processes, such as the stoichiometric degradation of target transcripts by non-coding small RNAs in bacteria. As a model system, we study the iron homeostasis network in Escherichia coli, in which the RyhB small RNA regulates the expression of various targets. Using fluorescent reporter genes to detect protein levels and single-molecule fluorescence in situ hybridization to monitor transcript levels in individual cells allows us to compare noise at both the transcript and protein levels. The experimental results and computer simulations show that extrinsic noise buffers, through a feed-forward loop configuration, the increase in variability introduced at the transcript level by iron deprivation, illuminating the important role that extrinsic noise plays during stress. Surprisingly, extrinsic noise also decouples the fluctuations of two different targets, in spite of RyhB being a common upstream factor degrading both. Thus, phenotypic variability increases under stress conditions through the decoupling of target fluctuations in the same cell, rather than through an increase in the noise of each target. We also present preliminary results on the adaptation of cells to prolonged iron deprivation, in order to shed light on the evolutionary role of post-transcriptional downregulation by small RNAs.
Keywords: cell-to-cell variability, Escherichia coli, noise, single-molecule fluorescence in situ hybridization (smFISH), transcript
Procedia PDF Downloads 164
73 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification
Authors: Meimei Shi
Abstract:
Objectives: The identification of endangered species, and especially the simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and in preventing illegal trade. The aim of this study was to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed according to conserved mitochondrial gene regions. Experimental mixtures were artificially prepared by mixing well-defined species, including endangered species, e.g., forest musk deer, bear, tiger, pangolin, and sika deer. The artificial samples were prepared with 1-16 well-characterized species at DNA concentrations of 1% to 100%. After multiplex PCR amplification and parameter modification, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. DNA metabarcoding was carried out based on Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, read filtering, and OTU clustering, and representative sequences were searched using BLASTn. Results: Based on the parameter modification and multiplex PCR amplification results, five primer sets targeting COI, Cytb, 12S, and 16S were selected as the NGS library amplification primer panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and could accurately identify all species in the artificial mixtures, including the endangered animal species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla, and Cervus nippon at a 1% DNA concentration. In conclusion, the established species identification method provides technical support for customs and forensic scientists in preventing the illegal trade of endangered animals and their products.
Keywords: DNA metabarcoding, endangered animal species, mitochondria nucleic acid, multi-locus
Procedia PDF Downloads 140
72 Single Tuned Shunt Passive Filter Based Current Harmonic Elimination of Three Phase AC-DC Converters
Authors: Mansoor Soomro
Abstract:
The evolution of power electronic equipment has been pivotal in making industrial processes productive, efficient, and safe. Despite its attractive features, such equipment presents nonlinear loads, which make the power system vulnerable to power quality problems. Harmonic distortion is one such problem, in which harmonic frequencies are integer multiples of the supply frequency; as a result, the supply voltage and current may not remain within tolerable limits, and distorted current and voltage waveforms appear. Attributes of low power quality mean that an electrical device or item of equipment is likely to malfunction, fail prematurely, or be unable to operate under all applied conditions. The electrical power system is designed to deliver power reliably, namely by maximizing power availability to customers. However, power quality events are largely untracked and, as a result, can take out a process as many as 20 to 30 times a year, costing utilities, customers, and suppliers of load equipment millions of dollars. The ill effects of current harmonics reduce system efficiency, cause overheating of connected equipment, and increase electrical power and air conditioning costs. The rapid growth of power electronic converters has highlighted the damage current harmonics do to the electrical power system, so it has become essential to address their influence when planning any changes to electrical installations. In this paper, an effort has been made to mitigate the effects of the dominant 3rd-order current harmonics. A passive filtering technique with a six-pulse multiplication converter has been employed to mitigate them. Since power quality standards require that the supply voltage and supply current be maintained within prescribed limits, the obtained results are validated against the IEEE 519-1992 and IEEE 519-2014 performance standards.
Keywords: current harmonics, power quality, passive filters, power electronic converters
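The THD indices used for IEEE 519 validation are the RMS of the harmonic content relative to the fundamental. A sketch with an assumed six-pulse converter current spectrum (characteristic harmonics 5, 7, 11, 13 at roughly 1/h of the fundamental; the 100 A fundamental is illustrative, not a value from the paper):

```python
import math

def thd(fundamental, harmonics):
    # Total harmonic distortion: RMS of the harmonic magnitudes
    # relative to the fundamental, expressed in percent.
    return 100.0 * math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Assumed line-current spectrum of an ideal six-pulse converter (amps):
# characteristic harmonics at orders 5, 7, 11, 13 fall off roughly as 1/h.
i1 = 100.0
ih = [i1 / 5, i1 / 7, i1 / 11, i1 / 13]
ithd = thd(i1, ih)
print(round(ithd, 1))  # 27.3 (percent)
```

A filter design is then judged by whether the post-filter ITHD (and the corresponding VTHD) falls under the limit the applicable IEEE 519 table assigns to the installation's short-circuit ratio.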
Procedia PDF Downloads 301
71 Context-Aware Recommender Systems Using User's Emotional State
Authors: Hoyeon Park, Kyoung-jae Kim
Abstract:
Product recommendation is a field of research that has received much attention amid the recent information overload phenomenon. The proliferation of the mobile environment and social media inevitably affects recommendation results, depending on how factors of the user's situation are reflected in the recommendation process. Recently, research attention has spread to context-aware recommender systems, which reflect the user's contextual information in the recommendation process. Until now, however, most context-aware recommender system research has been limited to reflecting users' passive context. Users can also express contextual information through active behavior, and a context-aware recommender system that reflects this information should be correspondingly more valuable. The purpose of this study is to propose a context-aware recommender system that can reflect the user's emotional state, as active context information, in the recommendation process. A context-aware recommender system makes more sophisticated recommendations by utilizing the user's contextual information, and has the advantage over existing recommender systems that the user's emotional factor can be considered. In this study, we propose a method to infer the user's emotional state, one element of the user's context information, from the user's facial expression data, and to reflect it in the recommendation process. The study collects facial expression data from users looking at a specific product, together with the users' product preference scores. We then classify the facial expression data into several categories following previous research and construct a model that can predict them. Next, the predicted results are applied to existing collaborative filtering with contextual information. The results show that the context-aware recommender system including facial expression information improves recommendation performance. Building on these results, future research is expected on recommender systems reflecting various kinds of contextual information.
Keywords: context-aware, emotional state, recommender systems, business analytics
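One simple way to fold an inferred emotional state into collaborative filtering is contextual pre-filtering: keep only the ratings given under the matching emotion before averaging neighbours. This is a sketch of that general idea, not the paper's specific model; the users, items, emotions, and scores are invented.

```python
def predict(ratings, target_user, item, context):
    # Contextual pre-filtering: keep only ratings given under the same
    # emotional state, then average the neighbours' scores for the item.
    relevant = [
        r["score"] for r in ratings
        if r["item"] == item and r["emotion"] == context and r["user"] != target_user
    ]
    return sum(relevant) / len(relevant) if relevant else None

ratings = [
    {"user": "u1", "item": "camera", "score": 5, "emotion": "happy"},
    {"user": "u2", "item": "camera", "score": 4, "emotion": "happy"},
    {"user": "u3", "item": "camera", "score": 2, "emotion": "frustrated"},
]
print(predict(ratings, "u9", "camera", "happy"))  # 4.5
```

In the study's pipeline, the `context` argument would come from the facial-expression classifier rather than being supplied by hand.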
Procedia PDF Downloads 230
70 Remote Vital Signs Monitoring in Neonatal Intensive Care Unit Using a Digital Camera
Authors: Fatema-Tuz-Zohra Khanam, Ali Al-Naji, Asanka G. Perera, Kim Gibson, Javaan Chahl
Abstract:
Conventional contact-based vital signs monitoring sensors, such as pulse oximeters or electrocardiogram (ECG) electrodes, may cause discomfort, skin damage, and infections, particularly in neonates with fragile, sensitive skin. Therefore, remote monitoring of vital signs is desired in both clinical and non-clinical settings to overcome these issues. Camera-based vital signs monitoring is a recent technology for these applications with many positive attributes; however, there are still few camera-based studies on neonates in a clinical setting. In this study, the heart rate (HR) and respiratory rate (RR) of eight infants at the Neonatal Intensive Care Unit (NICU) of Flinders Medical Centre were remotely monitored using a digital camera, applying color- and motion-based computational methods. The region of interest (ROI) was efficiently selected by incorporating an image decomposition method. Furthermore, spatial averaging, spectral analysis, band-pass filtering, and peak detection were used to extract both HR and RR. The experimental results were validated against ground truth data obtained from an ECG monitor and showed a strong correlation, with Pearson correlation coefficients (PCC) of 0.9794 and 0.9412 for HR and RR, respectively. The RMSE between the camera-based data and the ECG data was 2.84 beats/min for HR and 2.91 breaths/min for RR. A Bland-Altman analysis also showed close agreement between the two data sets, with mean biases of 0.60 beats/min and 1 breath/min and limits of agreement of -4.9 to +6.1 beats/min and -4.4 to +6.4 breaths/min for HR and RR, respectively. Therefore, video camera imaging may replace conventional contact-based monitoring in the NICU and has potential applications in other contexts, such as home health monitoring.
Keywords: neonates, NICU, digital camera, heart rate, respiratory rate, image decomposition
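The band-pass-then-peak-detect stage reduces, in essence, to counting dominant peaks in a windowed signal and converting the count to a per-minute rate. A sketch on a synthetic, noise-free 30 fps, 10-second clip with a simulated 2 Hz (120 beats/min) pulse; the real pipeline works on the filtered colour trace of the ROI, not a clean sine:

```python
import math

def detect_peaks(signal):
    # Local maxima above the mean: a crude stand-in for the peak-detection
    # stage applied after band-pass filtering the colour signal.
    mean = sum(signal) / len(signal)
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > mean and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]]

fs = 30.0                      # camera frame rate (frames/s)
hr_hz = 2.0                    # simulated pulse frequency (2 Hz = 120 beats/min)
signal = [math.sin(2 * math.pi * hr_hz * n / fs) for n in range(300)]  # 10 s clip
peaks = detect_peaks(signal)
bpm = len(peaks) / (len(signal) / fs) * 60
print(bpm)  # 120.0
```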
Procedia PDF Downloads 104
69 Improving Search Engine Performance by Removing Indexes to Malicious URLs
Authors: Durga Toshniwal, Lokesh Agrawal
Abstract:
As the web continues to play an increasing role in information exchange and the conduct of daily activities, computer users have become the target of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised web site enables the attacker to detect vulnerabilities in the user's applications and force the download of a multitude of malware binaries. We provide an approach to effectively scan the Internet for so-called drive-by downloads: URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honeyclients or antivirus programs). Although this technique is effective, it requires a substantial amount of resources, mainly because the crawler encounters many legitimate pages on the web that need to be filtered out. In this paper, to characterize the nature of this rising threat, we present the implementation of a web crawler in Python and an approach to searching the web more efficiently for pages that are likely to be malicious, filtering out benign pages and passing the remaining pages to an antivirus program for malware detection. Our approach starts from an initial seed of known malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages similar to those in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web. The results show that this guided approach identifies malicious web pages more efficiently than random crawling-based approaches.
Keywords: web crawler, malwares, seeds, drive-by-downloads, security
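The fast prefiltering stage can be approximated by cheap lexical heuristics that discard likely-benign URLs before the slow analysers run. The patterns below (raw IP hosts, direct .exe downloads, very long auto-generated labels) are illustrative assumptions, not the paper's actual feature set:

```python
import re

SUSPICIOUS = [
    r"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}",  # raw IP address instead of a hostname
    r"\.exe($|\?)",                          # direct executable download
    r"[a-z0-9-]{30,}\.",                     # very long auto-generated label
]

def prefilter(urls):
    # Cheap regex pre-filter: pass only URLs that look suspicious on to the
    # slow analysers (honeyclients / antivirus); drop likely-benign ones.
    return [u for u in urls if any(re.search(p, u) for p in SUSPICIOUS)]

urls = [
    "http://example.org/about.html",
    "http://203.0.113.7/load.php",
    "http://cdn.example.net/setup.exe",
]
print(prefilter(urls))  # the IP-based and .exe URLs survive the filter
```

A real prefilter would combine many more lexical and host-based features, but the architectural point stands: it must be orders of magnitude cheaper than the honeyclient it shields.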
Procedia PDF Downloads 229
68 Assimilating Multi-Mission Satellites Data into a Hydrological Model
Authors: Mehdi Khaki, Ehsan Forootan, Joseph Awange, Michael Kuhn
Abstract:
Terrestrial water storage, as a source of freshwater, plays an important role in human lives. Hydrological models offer important tools for simulating and predicting water storage at global and regional scales. However, their agreement with 'reality' is imperfect, mainly due to a high level of uncertainty in the input data, limitations in accounting for all complex water cycle processes, uncertainties in (unknown) empirical model parameters, and the absence of high-resolution (both spatial and temporal) data. Data assimilation can mitigate these drawbacks by incorporating new sets of observations into the models. In this effort, we use multi-mission satellite-derived remotely sensed observations to improve the performance of the World-Wide Water Resources Assessment system (W3RA) hydrological model in estimating terrestrial water storage. For this purpose, we assimilate total water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) and surface soil moisture data from the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) into W3RA. This is done to (i) improve model estimates of groundwater and soil moisture storage, and (ii) assess the impact of each satellite dataset (GRACE and AMSR-E), and of their combination, on the final terrestrial water storage estimates. The data are assimilated into W3RA using the Ensemble Square-Root Filter (EnSRF) technique over the Mississippi Basin (United States) and the Murray-Darling Basin (Australia) between 2002 and 2013. To evaluate the results, independent ground-based groundwater and soil moisture measurements within each basin are used.
Keywords: data assimilation, GRACE, AMSR-E, hydrological model, EnSRF
Procedia PDF Downloads 290