Search results for: estimation algorithms
1545 Study of the Best Algorithm to Estimate Sunshine Duration from Global Radiation on Horizontal Surface for Tropical Region
Authors: Tovondahiniriko Fanjirindratovo, Olga Ramiarinjanahary, Paulisimone Rasoavonjy
Abstract:
The sunshine duration, which is the sum of all the moments when the solar beam radiation exceeds a minimum threshold value, is an important parameter for climatology, tourism, agriculture and solar energy. It is usually measured by a pyrheliometer installed on a two-axis solar tracker. Because this device is expensive, while global radiation on a horizontal surface is widely available, several studies have been carried out to correlate global radiation with sunshine duration. Most of these studies were fitted for the northern hemisphere using a pyrheliometric database. The aim of the present work is to list and assess all the existing methods and apply them to Reunion Island, a tropical region in the southern hemisphere. Using a ten-year database of global, diffuse and beam radiation on a horizontal surface, the uncertainty of the existing algorithms is evaluated for a tropical region. The methodology is based on indirect comparison, because the solar beam radiation is not measured directly but calculated from the beam radiation on a horizontal surface and the sun elevation angle.
Keywords: Carpentras method, data fitting, global radiation, sunshine duration, Slob and Monna algorithm, step algorithm
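The threshold idea in this abstract can be sketched in a few lines. The sketch below is not the authors' algorithm; it assumes the WMO convention (sunshine whenever direct normal irradiance exceeds 120 W/m²) and hypothetical hourly data, and recovers the beam normal component from the horizontal beam and the sun elevation, as in the indirect comparison described above.

```python
import math

# Sketch of the threshold ("step") idea: sunshine duration is the time during
# which the direct normal irradiance exceeds the WMO threshold of 120 W/m^2.
# Only the beam radiation on a horizontal surface is available, so the normal
# component is recovered from the sun elevation angle.

WMO_THRESHOLD = 120.0  # W/m^2

def sunshine_duration(beam_horizontal, elevation_deg, dt_hours):
    """Sum the intervals where beam normal irradiance exceeds the threshold.

    beam_horizontal : beam irradiance on a horizontal surface (W/m^2)
    elevation_deg   : sun elevation angles (degrees)
    dt_hours        : duration of each measurement interval (hours)
    """
    total = 0.0
    for b_h, h in zip(beam_horizontal, elevation_deg):
        if h <= 0:  # sun below horizon
            continue
        dni = b_h / math.sin(math.radians(h))  # beam normal irradiance
        if dni > WMO_THRESHOLD:
            total += dt_hours
    return total

# Hypothetical hourly measurements
beam = [0.0, 50.0, 300.0, 600.0, 400.0, 80.0]
elev = [-5.0, 10.0, 30.0, 55.0, 40.0, 12.0]
print(sunshine_duration(beam, elev, 1.0))  # hours of sunshine
```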
Procedia PDF Downloads 121
1544 Discussion on Dispersion Curves of Non-Penetrable Soils from In-Situ Seismic Dilatometer Measurements
Authors: Angelo Aloisio Dag, Pasquale Pasca, Massimo Fragiacomo, Ferdinando Totani, Gianfranco Totani
Abstract:
The estimate of the velocity of shear waves (Vs) is essential in seismic engineering to characterize the dynamic response of soils. There are various direct methods to estimate Vs. The authors report the results of site characterization in Macerata, where they measured Vs using the seismic dilatometer in a 100 m deep borehole. The standard Vs estimation originates from the cross-correlation between the signals acquired by two geophones at increasing depths. This paper focuses on the estimate of the dependence of Vs on the wavenumber. The dispersion curves reveal an unexpected hyperbolic shape typical of Lamb waves. Interestingly, the contribution of Lamb waves may be notable up to 100 m depth. The amplitude of surface waves decreases rapidly with depth; still, their influence may be essential up to depths considered unusual for standard geotechnical investigations, where their effect is generally neglected. Accordingly, these waves may bias the outcomes of standard Vs estimations, which ignore frequency-dependent phenomena. The paper proposes an enhancement of the accepted procedure to estimate Vs and addresses the importance of Lamb waves in soil characterization.
Keywords: dispersion curve, seismic dilatometer, shear wave, soil mechanics
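The cross-correlation step mentioned above can be illustrated with a toy example. This is only a sketch, not the authors' processing chain: two synthetic geophone traces are offset by a known delay, the lag is picked from the peak of their cross-correlation, and Vs follows as spacing over delay. Signals, spacing and sampling rate are hypothetical.

```python
import numpy as np

# Standard Vs estimation idea: the travel-time delay between two geophones at
# a known vertical spacing is found from the peak of their cross-correlation,
# and Vs = spacing / delay.

fs = 1000.0          # sampling rate (Hz)
spacing = 0.5        # vertical distance between the two geophones (m)
true_delay = 0.002   # s, i.e. 2 samples -> Vs = 0.5 / 0.002 = 250 m/s

t = np.arange(0, 0.2, 1 / fs)
pulse = np.exp(-((t - 0.05) ** 2) / (2 * 0.005 ** 2))               # upper geophone
pulse_delayed = np.exp(-((t - 0.05 - true_delay) ** 2) / (2 * 0.005 ** 2))

def shear_wave_velocity(upper, lower, spacing, fs):
    """Estimate Vs from the cross-correlation lag between two geophone traces."""
    corr = np.correlate(lower, upper, mode="full")
    lag = np.argmax(corr) - (len(upper) - 1)   # samples by which `lower` trails
    delay = lag / fs
    return spacing / delay

print(round(shear_wave_velocity(pulse, pulse_delayed, spacing, fs)))  # m/s
```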
Procedia PDF Downloads 170
1543 Effects of Porosity Logs on Pore Connectivity and Volumetric Estimation
Authors: Segun S. Bodunde
Abstract:
In Bona Field, Niger Delta, two reservoirs across three wells were analyzed. The research aimed at determining the statistical dependence of permeability and oil volume in place on porosity logs. Of the three popular porosity logs, two were used: the sonic and density logs. The objectives of the research were to identify the porosity log that varies more with location and direction, to visualize the depth trend of both logs, and to determine the influence of these logs on pore connectivity determination and volumetric analysis. It was observed that the sonic derived porosities were higher than the density derived porosities (in well two, across the two reservoir sands, sonic porosity averaged 30.8% while density derived porosity averaged 23.65%, and the same trend was observed in the other wells). The sonic logs were further observed to have a lower coefficient of variation than the density logs (in sand A, well 2, sonic derived porosity had a coefficient of variation of 12.15% compared to 22.52% for the density logs), indicating a lower tendency to vary with location and direction. The bulk density was observed to increase with depth, while the transit time decreased with depth. It was also observed that an 8.87% decrease in porosity corresponded to a decrease of about 38% in pore connectivity.
Keywords: pore connectivity, coefficient of variation, density derived porosity, sonic derived porosity
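The two porosity logs compared above can be turned into porosity estimates with standard petrophysical relations, and the coefficient of variation used in the abstract is simply standard deviation over mean. The sketch below uses the usual sandstone/fresh-water matrix and fluid constants and hypothetical log readings; it is an illustration, not the study's actual workflow.

```python
import statistics

# Density porosity: phi_D = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)
# Wyllie time-average sonic porosity:
#   phi_S = (dt_log - dt_matrix) / (dt_fluid - dt_matrix)

def density_porosity(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):  # g/cc
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

def sonic_porosity(dt_log, dt_matrix=55.5, dt_fluid=189.0):  # microseconds/ft
    return (dt_log - dt_matrix) / (dt_fluid - dt_matrix)

def coefficient_of_variation(values):
    """CV = standard deviation / mean, used above to compare log variability."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical readings down a well
bulk_density = [2.30, 2.25, 2.40, 2.20]    # g/cc
transit_time = [95.0, 100.0, 88.0, 104.0]  # us/ft

phi_d = [density_porosity(r) for r in bulk_density]
phi_s = [sonic_porosity(dt) for dt in transit_time]
print([round(p, 3) for p in phi_d])
print([round(p, 3) for p in phi_s])
print(round(coefficient_of_variation(phi_s), 3))
```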
Procedia PDF Downloads 189
1542 Application of Decline Curve Analysis to Depleted Wells in a Cluster and Then Predicting the Performance of Currently Flowing Wells
Authors: Satish Kumar Pappu
Abstract:
Two questions frequently asked in the oil and gas industry are what the current production rate of a particular well is and what its approximate remaining life is. These questions can be answered by forecasting from realistic data such as flowing tubing hole pressures (FTHP) and production decline curves, which are used to predict the future performance of a well in a reservoir. With the advent of directional drilling, cluster well drilling has gained much importance and has, in fact, revolutionized the oil and gas industry. An oil or gas reservoir can generally be described as a collection of several overlying, producing and potentially producing sands into which a number of wells are drilled, depending on the in-place volume and several other important factors, both technical and economic in nature; in some sands only one well is drilled, and in others more than one. The aim of this study is to derive important information from data collected over a period of time at regular intervals on a depleted well in a reservoir sand, and to apply this information to predict the performance of other wells in that reservoir sand. Depleted wells are a common observation when an oil or gas field is visited, which makes the application of this study more realistic in nature.
Keywords: decline curve analysis, estimation of future gas reserves, reservoir sands, reservoir risk profile
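The decline-curve forecasting described above is commonly done with the classical Arps relations. The sketch below uses those relations (the abstract does not name the exact model used), with hypothetical rates and decline constants.

```python
import math

# Arps decline relations:
#   exponential: q(t) = qi * exp(-Di t)
#   hyperbolic:  q(t) = qi / (1 + b Di t)^(1/b)

def arps_rate(qi, di, t, b=0.0):
    """Production rate at time t for initial rate qi and decline constant di."""
    if b == 0.0:
        return qi * math.exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def time_to_economic_limit(qi, di, q_limit):
    """Remaining life of an exponentially declining well: t = ln(qi/q_limit)/di."""
    return math.log(qi / q_limit) / di

qi = 1000.0   # hypothetical initial rate, e.g. Mscf/day
di = 0.4      # hypothetical nominal decline rate per year
print(round(arps_rate(qi, di, 2.0), 1))                # exponential, after 2 years
print(round(arps_rate(qi, di, 2.0, b=0.5), 1))         # hyperbolic, after 2 years
print(round(time_to_economic_limit(qi, di, 50.0), 2))  # years to 50 Mscf/day
```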
Procedia PDF Downloads 433
1541 A Versatile Algorithm to Propose Optimized Solutions to the Dengue Disease Problem
Authors: Fernando L. P. Santos, Luiz G. Lyra, Helenice O. Florentino, Daniela R. Cantane
Abstract:
Dengue is a febrile infectious disease caused by a virus of the family Flaviviridae. It is transmitted by the bite of mosquitoes, usually Aedes aegypti. It occurs in tropical and subtropical areas of the world. This disease has been a major public health problem worldwide, especially in tropical countries such as Brazil, and its incidence has increased in recent years. Dengue is a subject of intense research, and efficient forms of mosquito control must be considered. In this work, a mono-objective optimal control problem was solved to analyse the dengue disease problem. Chemical and biological controls were considered in the mathematical model, which describes the dynamics of mosquitoes in the aquatic and winged phases. We applied genetic algorithms (GA) to obtain optimal strategies for the control of dengue. Numerical simulations have been performed to verify the versatility and applicability of this algorithm. On the basis of the present results, we may recommend the GA for solving optimal control problems with a large feasible region.
Keywords: genetic algorithm, dengue, Aedes aegypti, biological control, chemical control
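The GA machinery in this abstract can be illustrated with a minimal example. The authors' actual model couples the GA to the mosquito population dynamics; the stand-in objective below, which trades off "infection burden" against control effort for a constant control level u in [0, 1], and all GA settings are hypothetical.

```python
import random

random.seed(1)

def cost(u):
    # Stand-in objective: residual infection decays with control effort,
    # while the control effort itself has a quadratic cost.
    return 10.0 * (1.0 - u) ** 2 + 4.0 * u ** 2

def genetic_algorithm(pop_size=20, generations=50, mutation=0.1):
    population = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=cost)
        parents = population[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2.0                      # arithmetic crossover
            child += random.gauss(0.0, mutation)       # Gaussian mutation
            children.append(min(1.0, max(0.0, child)))
        population = parents + children                # elitism: parents kept
    return min(population, key=cost)

best = genetic_algorithm()
print(round(best, 2))   # analytic optimum of the stand-in cost is u = 5/7
```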
Procedia PDF Downloads 345
1540 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations like search, analysis and visualization are performed on big data using data mining, the process of extracting patterns or knowledge from large data sets. As data evolve, the results of data mining applications become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To assess the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics.
Keywords: big data, map reduce, incremental processing, iterative computation
Procedia PDF Downloads 345
1539 R Data Science for Technology Management
Authors: Sunghae Jun
Abstract:
Technology management (TM) is an important issue for a company improving its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions in the management of technology are based on the results of TA. TA analyzes the developed results of a target technology using statistics or the Delphi method. TA based on Delphi depends on the experts' domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of the experts' knowledge. Many quantitative TA methods based on statistics and machine learning have been studied and used for technology forecasting, technological innovation, and management of technology. They apply diverse computing tools and many analytical methods case by case, and it is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework of TA. From the results of a case study, we also show how our methodology is applied in a real field. This research contributes to R&D planning and technology valuation in TM areas.
Keywords: technology management, R system, R data science, statistics, machine learning
Procedia PDF Downloads 451
1538 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology
Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey
Abstract:
In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This calls for software capable of quickly processing and reliably visualizing diffusion data, equipped with tools for their analysis in terms of different tasks. We are developing the «MRDiffusionImaging» software in standard C++. The subject part has been moved to separate class libraries and can be used on various platforms. The user interface is Windows WPF (Windows Presentation Foundation), a technology for managing Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of its important features is the use of a declarative markup language, XAML (eXtensible Application Markup Language), with which one can conveniently create, initialize and set properties of objects with hierarchical relationships. Graphics are generated using the DirectX environment. The MRDiffusionImaging software package has been implemented for processing diffusion magnetic resonance imaging (dMRI), which allows loading and viewing images sorted by series. An algorithm for "masking" dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues that are not related to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on autocorrelation of local structure has been developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary volume of the brain, the diffusion tensor is geometrically interpreted using an ellipsoid, which is an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations, and inhomogeneities are combined in one algorithm for segmentation of white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF).
A tool for calculating the mean diffusivity and fractional anisotropy has been created, on the basis of which quantitative maps can be built for solving various clinical problems. Functionality has been created that allows clustering and segmenting images to individualize the clinical volume of radiation treatment and further assess the response (median Dice score = 0.963 ± 0.137). White matter tracts of the brain were visualized using two algorithms: a deterministic one (fiber assignment by continuous tracking) and a probabilistic one based on the Hough transform. The probabilistic algorithm tests candidate curves in each voxel, assigning to each one a score computed from the diffusion data, and then selects the curves with the highest scores as the potential anatomical connections. In the context of functional radiosurgery, it is possible to reduce the irradiation volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. The «MRDiffusionImaging» software will improve the efficiency and accuracy of diagnostics and stereotactic radiotherapy of intracranial pathology. We are developing software with integrated, intuitive support for processing, analysis, and inclusion in the process of radiotherapy planning and evaluation of its results.
Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography
Procedia PDF Downloads 80
1537 An Alternative Method for Computing Clothoids
Authors: Gerardo Casal, Miguel E. Vázquez-Méndez
Abstract:
The clothoid (also known as the Cornu spiral or Euler spiral) is a curve characterized by a curvature proportional to its length. This property makes it widely used as a transition curve for designing the layout of roads and railway tracks. In this work, starting from the geometrical property characterizing the clothoid, its parametric equations are obtained and two algorithms to compute it are compared. The first (classical) is widely used in surveying schools and is based on explicit formulas obtained from Taylor expansions of the sine and cosine functions. The second (alternative) is a very simple algorithm based on the numerical solution of the initial value problem giving the clothoid parameterization. Both methods are compared on some typical surveying problems. The alternative method does not use complex formulas, so it is conceptually very simple and easy to apply. It gives good results even when the classical method fails (when the quotient between length and radius of curvature is high), needs no subsequent translations or rotations and, consequently, seems an efficient tool for designing the layout of roads and railway tracks.
Keywords: transition curves, railroad and highway engineering, Runge-Kutta methods
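The "alternative" approach can be sketched directly from the clothoid's defining property. A clothoid joining a straight to a circular arc of radius R over length L satisfies x'(s) = cos θ(s), y'(s) = sin θ(s) with θ(s) = s²/(2RL); the sketch below integrates this initial value problem with a classical fourth-order Runge-Kutta step. R, L and the step count are hypothetical design values, and this is an illustration of the idea, not the paper's code.

```python
import math

def clothoid_point(s_end, R, L, n_steps=1000):
    """Integrate x' = cos(theta(s)), y' = sin(theta(s)), theta = s^2/(2RL)
    from s = 0 to s_end with RK4 steps."""
    h = s_end / n_steps
    x = y = 0.0
    s = 0.0
    theta = lambda u: u * u / (2.0 * R * L)
    for _ in range(n_steps):
        k1x, k1y = math.cos(theta(s)), math.sin(theta(s))
        k2x, k2y = math.cos(theta(s + h / 2)), math.sin(theta(s + h / 2))
        k4x, k4y = math.cos(theta(s + h)), math.sin(theta(s + h))
        # The slope depends only on s, so the middle RK4 stages coincide
        # (k2 == k3) and the update reduces to Simpson's rule.
        x += h * (k1x + 4 * k2x + k4x) / 6
        y += h * (k1y + 4 * k2y + k4y) / 6
        s += h
    return x, y

# Hypothetical transition curve: radius R = 300 m reached after L = 100 m
x, y = clothoid_point(100.0, 300.0, 100.0)
print(round(x, 3), round(y, 3))
```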
Procedia PDF Downloads 277
1536 Hierarchical Queue-Based Task Scheduling with CloudSim
Authors: Wanqing You, Kai Qian, Ying Qian
Abstract:
The concepts of cloud computing provide users with infrastructure, platform and software as a service, making those services more accessible to people via the Internet. To better analyze the performance of cloud computing provisioning policies and resource allocation strategies, a toolkit named CloudSim has been proposed. With CloudSim, a cloud computing environment can be easily constructed by modelling and simulating cloud computing components such as the datacenter, host, and virtual machine. A good scheduling strategy is the key to achieving load balancing among different machines and improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumptive cases on a single machine; however, they are unable to make the best decision for the unforeseen future. In a real-world scenario, there would be numerous tasks as well as several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm to schedule tasks with CloudSim by taking into account several parameters: the machines' capacity, the priority of tasks and the history log.
Keywords: hierarchical queue, load balancing, CloudSim, information technology
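The multi-queue idea can be sketched outside CloudSim in a few lines. This is not the authors' exact algorithm: tasks are drained from per-priority queues, highest priority first, and each task is dispatched to the least-loaded machine to keep the load balanced. Task lengths, priorities and the number of machines are hypothetical.

```python
import heapq

def schedule(tasks, n_machines):
    """tasks: list of (priority, length); lower priority number runs first.
    Returns the total load assigned to each machine after greedy
    least-loaded dispatch."""
    # Multi-queue: group tasks by priority level
    queues = {}
    for prio, length in tasks:
        queues.setdefault(prio, []).append(length)
    # Min-heap of (current load, machine id) for least-loaded dispatch
    loads = [(0.0, m) for m in range(n_machines)]
    heapq.heapify(loads)
    assignment = {m: [] for m in range(n_machines)}
    for prio in sorted(queues):                 # highest priority first
        for length in queues[prio]:
            load, m = heapq.heappop(loads)      # least-loaded machine
            assignment[m].append((prio, length))
            heapq.heappush(loads, (load + length, m))
    return {m: sum(l for _, l in ts) for m, ts in assignment.items()}

tasks = [(1, 4.0), (2, 2.0), (1, 3.0), (3, 5.0), (2, 1.0)]
print(schedule(tasks, 2))
```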
Procedia PDF Downloads 418
1535 Torsional Rigidities of Reinforced Concrete Beams Subjected to Elastic Lateral Torsional Buckling
Authors: Ilker Kalkan, Saruhan Kartal
Abstract:
Reinforced concrete (RC) beams rarely undergo lateral-torsional buckling (LTB), since these beams possess large lateral bending and torsional rigidities owing to their stocky cross-sections, unlike steel beams. However, the problem of LTB has become more and more pronounced in recent decades as the span lengths of concrete beams increase and the cross-sections become more slender with the use of pre-stressed concrete. The buckling moment of a beam mainly depends on its lateral bending rigidity and torsional rigidity. The nonhomogeneous and elastic-inelastic nature of RC complicates the estimation of the buckling moments of concrete beams. Furthermore, the lateral bending and torsional rigidities of RC beams, and hence the buckling moments, are affected by different forms of concrete cracking, including flexural, torsional and restrained shrinkage cracking. The present study pertains to the effects of concrete cracking on the torsional rigidities of RC beams prone to elastic LTB. A series of tests on rather slender RC beams indicated that torsional cracking does not initiate until buckling in elastic LTB, while flexural cracking associated with lateral bending takes place even at the initial stages of loading. Hence, the present study clearly indicates that the uncracked torsional rigidity needs to be used for estimating the buckling moments of RC beams liable to elastic LTB.
Keywords: lateral stability, post-cracking torsional rigidity, uncracked torsional rigidity, critical moment
Procedia PDF Downloads 233
1534 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to go along with this trend: it gathers more and more computing ability by increasing the number of processor cores, without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of increasingly large amounts of data. Data-parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched by the traditional algorithms.
Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability
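The decomposition idea can be sketched with axis-aligned bounding boxes (AABBs), a common choice of "simple object" (the paper does not specify its primitive). The pairwise overlap tests below run as one vectorized, SIMD-style operation over whole arrays instead of a per-pair loop; box coordinates are hypothetical.

```python
import numpy as np

def aabb_collisions(mins_a, maxs_a, mins_b, maxs_b):
    """Test box i of set A against box i of set B, for all i at once.
    Two AABBs overlap iff they overlap on every axis."""
    return np.all((mins_a <= maxs_b) & (mins_b <= maxs_a), axis=1)

# Two sets of 3 boxes each, as (N, 3) arrays of min/max corners
mins_a = np.array([[0.0, 0.0, 0.0], [5.0, 5.0, 5.0], [0.0, 0.0, 0.0]])
maxs_a = np.array([[1.0, 1.0, 1.0], [6.0, 6.0, 6.0], [2.0, 2.0, 2.0]])
mins_b = np.array([[0.5, 0.5, 0.5], [7.0, 7.0, 7.0], [1.5, 1.5, 1.5]])
maxs_b = np.array([[2.0, 2.0, 2.0], [8.0, 8.0, 8.0], [3.0, 3.0, 3.0]])

# One array expression tests all pairs at once; the per-element comparisons
# map naturally onto SIMD lanes.
print(aabb_collisions(mins_a, maxs_a, mins_b, maxs_b))
```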
Procedia PDF Downloads 284
1533 ANN Based Simulation of PWM Scheme for Seven Phase Voltage Source Inverter Using MATLAB/Simulink
Authors: Mohammad Arif Khan
Abstract:
This paper analyzes and presents the development of an artificial neural network based controller for space vector modulation (ANN-SVPWM) for a seven-phase voltage source inverter. First, the conventional method of producing a sinusoidal output voltage, which utilizes six active and one zero space vector to synthesize the input reference, is elaborated; then a new PWM scheme, called artificial neural network based PWM, is presented. The ANN based controller has the advantage of very fast implementation and analysis of the algorithms, and avoids the direct computation of trigonometric and non-linear functions. The ANN controller uses an individual training strategy with fixed weights and supervised models. A computer simulation program has been developed using MATLAB/Simulink together with the Neural Network Toolbox for training the ANN controller. A comparison of the proposed scheme with the conventional scheme is presented based on various performance indices, and extensive simulation results are provided to validate the findings.
Keywords: space vector PWM, total harmonic distortion, seven-phase, voltage source inverter, multi-phase, artificial neural network
Procedia PDF Downloads 450
1532 Tuning Fractional Order Proportional-Integral-Derivative Controller Using Hybrid Genetic Algorithm Particle Swarm and Differential Evolution Optimization Methods for Automatic Voltage Regulator System
Authors: Fouzi Aboura
Abstract:
The fractional order proportional-integral-derivative (FOPID, or PIλDµ) controller is a proportional-integral-derivative (PID) controller in which the integral order (λ) and derivative order (µ) are fractional. One of the important applications of the classical PID is the Automatic Voltage Regulator (AVR). The FOPID controller needs five parameters to be optimized, while the design of a conventional PID controller needs only three. In this paper, we propose a comparison between the Differential Evolution (DE) and Hybrid Genetic Algorithm Particle Swarm Optimization (HGAPSO) algorithms; we study their characteristics and analyze their performance in finding optimum parameters of the FOPID controller. A new objective function is also proposed to take into account the relation between the performance criteria.
Keywords: FOPID controller, fractional order, AVR system, objective function, optimization, GA, PSO, HGAPSO
Procedia PDF Downloads 87
1531 Integrated Navigation System Using Simplified Kalman Filter Algorithm
Authors: Othman Maklouf, Abdunnaser Tresh
Abstract:
GPS and inertial navigation systems (INS) have complementary qualities that make them ideal for sensor fusion. The limitations of GPS include occasional high noise content, outages when satellite signals are blocked, interference and low bandwidth. The strengths of GPS include its long-term stability and its capacity to function as a stand-alone navigation system. In contrast, INS is not subject to interference or outages and has high bandwidth and good short-term noise characteristics, but it has long-term drift errors and requires external information for initialization. A combined system of GPS and INS subsystems can exhibit the robustness, higher bandwidth and better noise characteristics of the inertial system together with the long-term stability of GPS. The most common estimation algorithm used in integrated INS/GPS is the Kalman filter (KF). The KF is able to take advantage of these characteristics to provide a common integrated navigation implementation with performance superior to that of either subsystem (GPS or INS). This paper presents a simplified KF algorithm for a land vehicle navigation application. In this integration scheme, the GPS-derived positions and velocities are used as the update measurements for the INS-derived position, velocity and attitude (PVA). The KF error state vector in this case includes the navigation parameters as well as the accelerometer and gyroscope error states.
Keywords: GPS, INS, Kalman filter, inertial navigation system
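A one-dimensional toy version of this loosely coupled scheme makes the predict/update cycle concrete. This sketch is far simpler than a real INS/GPS filter: the state [position, velocity] is propagated by a constant-velocity model (standing in for the INS mechanization), and GPS positions serve as the update measurements. The noise variances and trajectory are hypothetical, and the simulated GPS measurement is kept noise-free for clarity.

```python
dt = 1.0
q, r = 0.01, 4.0                   # assumed process / GPS noise variances
x = [0.0, 0.0]                     # state estimate: [position, velocity]
P = [[1.0, 0.0], [0.0, 1.0]]       # estimate covariance

def predict(x, P, accel):
    # x <- F x + B a with F = [[1, dt], [0, 1]], B = [dt^2/2, dt];
    # P <- F P F^T + Q with Q = diag(q, q)
    x = [x[0] + dt * x[1] + 0.5 * dt * dt * accel, x[1] + dt * accel]
    P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1], P[1][1] + q]]
    return x, P

def update(x, P, z):
    # GPS measures position only: H = [1, 0]
    s = P[0][0] + r                               # innovation variance
    k = [P[0][0] / s, P[1][0] / s]                # Kalman gain
    innov = z - x[0]
    x = [x[0] + k[0] * innov, x[1] + k[1] * innov]
    P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
         [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
    return x, P

true_pos = 0.0
for _ in range(20):
    true_pos += 5.0 * dt                          # vehicle moving at 5 m/s
    x, P = predict(x, P, 0.0)                     # INS propagation step
    x, P = update(x, P, true_pos)                 # GPS position update
print(round(x[0], 2), round(x[1], 2))
```

Even with no initial knowledge of the velocity, the position-only updates pull the velocity estimate toward 5 m/s within a few steps, which is exactly how the integrated filter observes INS drift through GPS.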
Procedia PDF Downloads 466
1530 Atmospheric Transport Modeling of Radio-Xenon Detections Possibly Related to the Announced Nuclear Test in North Korea on February 12, 2013
Authors: Kobi Kutsher
Abstract:
On February 12, 2013, monitoring stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) detected a seismic event with explosion-like underground characteristics in the Democratic People's Republic of Korea (DPRK). The location was found to be in the vicinity of the two previously announced nuclear tests of 2006 and 2009. The nuclear test was also announced by the government of the DPRK. After an underground nuclear explosion, radioactive fission products (mostly noble gases) can seep through layers of rock and sediment until they escape into the atmosphere. The fission products are dispersed in the atmosphere and may be detected thousands of kilometers downwind from the test site. Indeed, more than 7 weeks after the explosion, unusual detections of noble gases were reported at the radionuclide station in Takasaki, Japan. The radionuclide station is part of the International Monitoring System, operated to verify the CTBT. This study provides an estimation of the possible source region and the total radioactivity of the release using atmospheric transport modeling.
Keywords: atmospheric transport modeling, CTBTO, nuclear tests, radioactive fission products
Procedia PDF Downloads 422
1529 Quantum Entangled States and Image Processing
Authors: Sanjay Singh, Sushil Kumar, Rashmi Jain
Abstract:
Quantum computing is a new paradigm in computational theory, and a quantum mechanical system has several useful properties, such as entanglement. We aim to store information concerning the structure and content of a simple image in a quantum system. Consider an array of n qubits, which we propose to use as our memory storage. In recent years, classical image processing has been extended to quantum image processing, an elegant approach to overcoming the problems of its classical counterpart. Image storage, retrieval and processing on quantum machines is an emerging area. Although such quantum machines do not yet exist in physical reality, theoretical algorithms developed on the basis of quantum entangled states give new insights into processing classical images in the quantum domain. In the present work, we give a brief overview of how entangled states can be useful for quantum image storage and retrieval. We discuss the properties of the tripartite Greenberger-Horne-Zeilinger (GHZ) and W states and their usefulness for storing shapes that consist of three vertices. We also propose techniques to store shapes having more than three vertices.
Keywords: Greenberger-Horne-Zeilinger, image storage and retrieval, quantum entanglement, W states
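The two tripartite states discussed above can be written out explicitly as 8-dimensional state vectors in the computational basis: GHZ = (|000⟩ + |111⟩)/√2 and W = (|001⟩ + |010⟩ + |100⟩)/√3. The sketch below only illustrates the states themselves; the vertex-encoding interpretation in the comment follows the abstract's suggestion and is not a worked-out scheme.

```python
import numpy as np

# Basis ordering: index 0b000 = |000>, ..., index 0b111 = |111>
ghz = np.zeros(8)
ghz[0b000] = ghz[0b111] = 1 / np.sqrt(2)

w = np.zeros(8)
w[0b001] = w[0b010] = w[0b100] = 1 / np.sqrt(3)

# Both states are normalized; each nonzero basis amplitude could, as the
# abstract suggests, encode the presence of one vertex of a three-vertex shape.
print(round(float(np.dot(ghz, ghz)), 6), round(float(np.dot(w, w)), 6))
```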
Procedia PDF Downloads 302
1528 Estimation of Serum Levels of Calcium and Inorganic Phosphorus in Breast Cancer Patients
Authors: Safa Safdar
Abstract:
Breast cancer is a type of cancer that develops through the formation of a tumor in the breast. This tumor invades surrounding tissue and causes various electrolyte imbalances. The present study was designed to measure serum calcium and inorganic phosphorus levels and to determine the frequency of hypercalcemia and hypophosphatemia in breast cancer patients. Serum calcium and phosphorus levels of fifty breast cancer women aged 18-70 years and fifty healthy women of the same age group were measured using a semi-automated chemistry analyzer (Humalyzer 3000, Human, Germany). Significant variation in these levels was observed. The mean calcium value in BC patients (9.398 mg/dl) was higher than in controls (8.694 mg/dl), whereas the mean inorganic phosphorus level was lower in BC patients (4.060 mg/dl) than in controls (4.456 mg/dl). In this study, the frequency of hypercalcemia in breast cancer patients was 10%, i.e. 10 out of 50 patients were suffering from hypercalcemia, whereas the frequency of hypophosphatemia was only 2%, i.e. 1 out of 50 patients was suffering from hypophosphatemia. Thus it is concluded that there is a significant change in serum calcium and inorganic phosphorus levels in breast cancer patients as the disease progresses. This study will therefore be helpful for clinicians in monitoring serum calcium and phosphorus levels in breast cancer patients and preventing further complications.
Keywords: serum analysis, calcium, inorganic phosphorus, hypercalcemia, hypophosphatemia
Procedia PDF Downloads 289
1527 A Decision Support System for the Detection of Illicit Substance Production Sites
Authors: Krystian Chachula, Robert Nowak
Abstract:
Manufacturing home-made explosives and synthetic drugs is an increasing problem in Europe. To combat this, a data fusion system is proposed for the detection and localization of production sites in urban environments. The data consist of measurements of properties of wastewater performed by various sensors installed in a sewage network. A four-stage fusion strategy allows detecting sources of waste products from known chemical reactions. First, suspicious measurements are used to compute the amount and position of discharged compounds. Then, this information is propagated through the sewage network to account for missing sensors. The next step is clustering and the formation of tracks. Eventually, tracks are used to reconstruct discharge events. Sensor measurements are simulated by a subsystem based on real-world data. In this paper, different discharge scenarios are considered to show how the parameters of the algorithms used affect the effectiveness of the proposed system. This research is a part of the SYSTEM project (SYnergy of integrated Sensors and Technologies for urban sEcured environMent).
Keywords: continuous monitoring, information fusion and sensors, internet of things, multisensor fusion
Procedia PDF Downloads 112
1526 Implementation of CNV-CH Algorithm Using Map-Reduce Approach
Authors: Aishik Deb, Rituparna Sinha
Abstract:
We have developed an algorithm to detect abnormal segments (structural variations) in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm by which abnormal segments are detected. This algorithm aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, and the huge volume of sequence information gives rise to the need for big data and parallel approaches to segmentation. Therefore, we have designed a map-reduce approach for the existing CNV-CH algorithm, with which a large amount of sequence data can be segmented and structural variations in the human genome can be detected. We have compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score. The advantages of our algorithm are that it is fast and has better accuracy. This algorithm can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders such as cancer. The defects may be caused by new mutations or changes to the DNA and generally result in abnormally high or low base coverage and quantification values.
Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing
Procedia PDF Downloads 129
1525 Multichannel Scheme under Fairness Environment for Cognitive Radio Networks
Authors: Hans Marquez Ramos, Cesar Hernandez, Ingrid Páez
Abstract:
This paper develops a multiple channel assignment model that allows taking advantage, in the most efficient way, of spectrum opportunities in cognitive radio networks. The developed scheme allows making several assignments of available, frequency-adjacent channels, which require a larger bandwidth, under a fairness environment. The hybrid assignment model is made up of two algorithms: one that ranks and selects available frequency channels, and another in charge of establishing a fairness criterion, in order not to restrict spectrum opportunities for the other secondary users who wish to transmit. Measurements were made of average bandwidth and average delay, as well as a fairness computation, for several channel assignments. The results were evaluated against experimental spectrum occupancy data captured from the GSM frequency band. The developed model shows evidence of improvement in the use of spectrum opportunities and a wider average transmit bandwidth for each secondary user, while maintaining the fairness criterion in channel assignment.
Keywords: bandwidth, fairness, multichannel, secondary users
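The fairness computation mentioned above can be quantified with Jain's fairness index, a common choice in resource allocation (the abstract does not specify the exact metric used): J = (Σxᵢ)² / (n·Σxᵢ²), which equals 1 for a perfectly equal allocation and 1/n in the worst case. The bandwidth allocations below are hypothetical.

```python
def jain_index(allocations):
    """Jain's fairness index: 1.0 = perfectly fair, 1/n = maximally unfair."""
    n = len(allocations)
    total = sum(allocations)
    return total * total / (n * sum(x * x for x in allocations))

equal = [2.0, 2.0, 2.0, 2.0]    # e.g. MHz assigned per secondary user
skewed = [8.0, 0.0, 0.0, 0.0]   # one user takes the whole band
print(jain_index(equal))             # 1.0
print(round(jain_index(skewed), 2))  # 0.25
```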
Procedia PDF Downloads 499
1524 Book Exchange System with a Hybrid Recommendation Engine
Authors: Nilki Upathissa, Torin Wirasinghe
Abstract:
This solution addresses the challenges faced by traditional bookstores and the limitations of digital media, striking a balance between the tactile experience of printed books and the convenience of modern technology. The book exchange system offers a sustainable alternative, empowering users to access a diverse range of books while promoting community engagement. The user-friendly interfaces incorporated into the book exchange system ensure a seamless and enjoyable experience for users. Intuitive features for book management, search, and messaging facilitate effortless exchanges and interactions between users. By streamlining the process, the system encourages readers to explore new books aligned with their interests, enhancing the overall reading experience. Central to the system's success is the hybrid recommendation engine, which leverages advanced technologies such as Long Short-Term Memory (LSTM) models. By analyzing user input, the engine accurately predicts genre preferences, enabling personalized book recommendations. The hybrid approach integrates multiple technologies, including user interfaces, machine learning models, and recommendation algorithms, to ensure the accuracy and diversity of the recommendations. The evaluation of the book exchange system with the hybrid recommendation engine demonstrated exceptional performance across key metrics. The high accuracy score of 0.97 highlights the system's ability to provide relevant recommendations, enhancing users' chances of discovering books that resonate with their interests. The commendable precision, recall, and F1-score values further validate the system's efficacy in offering appropriate book suggestions. Additionally, the curve classifications substantiate the system's effectiveness in distinguishing positive from negative recommendations.
This metric provides confidence in the system's ability to navigate the vast landscape of book choices and deliver recommendations that align with users' preferences. Furthermore, the implementation of this book exchange system with a hybrid recommendation engine has the potential to revolutionize the way readers interact with printed books. By facilitating book exchanges and providing personalized recommendations, the system encourages a sense of community and exploration within the reading community. Moreover, the emphasis on sustainability aligns with the growing global consciousness towards eco-friendly practices. With its robust technical approach and promising evaluation results, this solution paves the way for a more inclusive, accessible, and enjoyable reading experience for book lovers worldwide. In conclusion, the developed book exchange system with a hybrid recommendation engine represents a progressive solution to the challenges faced by traditional bookstores and the limitations of digital media. By promoting sustainability, widening access to printed books, and fostering engagement with reading, this system addresses the evolving needs of book enthusiasts. The integration of user-friendly interfaces, advanced machine learning models, and recommendation algorithms ensures accurate and diverse book recommendations, enriching the reading experience for users. Keywords: recommendation systems, hybrid recommendation systems, machine learning, data science, long short-term memory, recurrent neural network
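The LSTM-based genre-prediction idea can be sketched with a single NumPy LSTM cell run over a toy embedded token sequence (the weights, dimensions, and genre count below are illustrative placeholders, not the system's trained model): the final hidden state is projected to genre scores and normalized with a softmax.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gates stacked as [input, forget, cell, output]."""
    n = h.size
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[:n]))          # input gate
    f = 1 / (1 + np.exp(-z[n:2 * n]))     # forget gate
    g = np.tanh(z[2 * n:3 * n])           # candidate cell state
    o = 1 / (1 + np.exp(-z[3 * n:]))      # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(0)
dim_in, dim_h, n_genres = 8, 4, 3        # toy sizes (assumptions)
W = rng.normal(0, 0.1, (4 * dim_h, dim_in))
U = rng.normal(0, 0.1, (4 * dim_h, dim_h))
b = np.zeros(4 * dim_h)
V = rng.normal(0, 0.1, (n_genres, dim_h))  # hidden state -> genre scores

h, c = np.zeros(dim_h), np.zeros(dim_h)
for x in rng.normal(size=(5, dim_in)):    # 5 embedded tokens of user input
    h, c = lstm_step(x, h, c, W, U, b)
scores = np.exp(V @ h)
scores /= scores.sum()                    # softmax over genres
print(scores)
```

In a real engine the weights would be learned from user interaction history, and the genre distribution would feed into the hybrid recommendation stage.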
Procedia PDF Downloads 87
1523 Performance Analysis of a Combined Ordered Successive and Interference Cancellation Using Zero-Forcing Detection over Rayleigh Fading Channels in MIMO Systems
Authors: Jamal R. Elbergali
Abstract:
Multiple Input Multiple Output (MIMO) systems are wireless systems with multiple antenna elements at both ends of the link. Wireless communication systems demand high data rates and spectral efficiency with increased reliability. MIMO systems have been popular techniques for achieving these goals, because increased data rates are possible through the spatial multiplexing scheme and reliability is improved through diversity. Spatial Multiplexing (SM) is used to achieve higher throughput than diversity schemes. In this paper, we propose a Zero-Forcing (ZF) detection scheme using a combination of Ordered Successive Interference Cancellation (OSIC) and Zero-Forcing with Interference Cancellation (ZF-IC). The proposed method uses OSIC based on Signal-to-Noise Ratio (SNR) ordering to obtain the estimate of the last symbol, x̃_(N_T), which is then used as an input to the ZF-IC stage. We analyze the Bit Error Rate (BER) performance of the proposed MIMO system over a Rayleigh fading channel, using the Binary Phase Shift Keying (BPSK) modulation scheme. The results show better performance than the previous methods. Keywords: SNR, BER, BPSK, MIMO, modulation, zero forcing (ZF), OSIC, ZF-IC, spatial multiplexing (SM)
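The ZF detection with SNR-ordered SIC can be sketched in NumPy (a real-valued BPSK toy, not the paper's complex Rayleigh setup, and the final ZF-IC refinement step is not reproduced): at each stage the stream with the least noise amplification under the ZF filter is detected, cancelled from the received vector, and the channel matrix is deflated.

```python
import numpy as np

def zf_osic(H, y, symbols=(-1.0, 1.0)):
    """ZF detection with SNR-ordered successive interference cancellation
    (real-valued BPSK sketch)."""
    H = H.copy()
    y = y.copy()
    n_t = H.shape[1]
    order = list(range(n_t))      # original indices of remaining streams
    x_hat = np.zeros(n_t)
    while order:
        G = np.linalg.pinv(H)                    # ZF filter for remaining streams
        k = int(np.argmin((G**2).sum(axis=1)))   # highest post-ZF SNR stream
        s = G[k] @ y
        x_k = min(symbols, key=lambda v: abs(s - v))  # hard BPSK decision
        idx = order.pop(k)
        x_hat[idx] = x_k
        y = y - H[:, k] * x_k                    # cancel the detected stream
        H = np.delete(H, k, axis=1)              # deflate the channel matrix
    return x_hat

rng = np.random.default_rng(1)
H = rng.normal(size=(4, 3))                  # 3 Tx, 4 Rx real channel
x = np.array([1.0, -1.0, 1.0])               # transmitted BPSK symbols
y = H @ x + 0.01 * rng.normal(size=4)        # low-noise received vector
detected = zf_osic(H, y)
print(detected)
```

At this low noise level the ordered cancellation recovers the transmitted symbols exactly; the estimate of the last stream is what the abstract's method would then pass to the ZF-IC stage.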
Procedia PDF Downloads 421
1522 Study of Sub-Surface Flow in an Unconfined Carbonate Aquifer in a Tropical Karst Area in Indonesia: A Modeling Approach Using Finite Difference Groundwater Model
Authors: Dua K. S. Y. Klaas, Monzur A. Imteaz, Ika Sudiayem, Elkan M. E. Klaas, Eldav C. M. Klaas
Abstract:
Due to its porous nature, karst terrain, geomorphologically developed from dissolved formations, is vulnerable to water shortage and deteriorated water quality. Therefore, a solid comprehension of the sub-surface flow of a karst landscape is essential to assess the long-term availability of groundwater resources. In this paper, a single-continuum model using a finite difference code, MODFLOW, was constructed to represent an unconfined carbonate aquifer on the tropical karst island of Rote in Indonesia. The model, spatially discretized into 20 x 20 m grid cells, was calibrated and validated using available groundwater levels and atmospheric variables. In the calibration and validation steps, Parameter Estimation (PEST) and geostatistical pilot point methods were employed to estimate hydraulic conductivity and specific yield values. The results show that the model is able to represent the sub-surface flow, as indicated by good model performance in both the calibration and validation steps. The final model can be used as a robust representation of the system for future studies of climate and land use scenarios. Keywords: carbonate aquifer, karst, sub-surface flow, groundwater model
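The finite-difference principle behind such a model can be shown in one dimension (a homogeneous steady-state Laplace sketch with toy boundary heads; MODFLOW itself solves the full 3-D heterogeneous problem with the PEST-calibrated conductivity field): each interior head relaxes to the mean of its neighbours until the solution stops changing.

```python
def solve_heads_1d(n, h_left, h_right, tol=1e-9):
    """Gauss-Seidel iteration of the 1-D steady-state finite-difference
    stencil with fixed-head (Dirichlet) boundaries."""
    h = [0.0] * n
    h[0], h[-1] = h_left, h_right
    while True:
        delta = 0.0
        for i in range(1, n - 1):
            new = 0.5 * (h[i - 1] + h[i + 1])   # 3-point stencil
            delta = max(delta, abs(new - h[i]))
            h[i] = new
        if delta < tol:                          # converged
            return h

# Head fixed at 10 m on one boundary and 4 m on the other
heads = solve_heads_1d(6, 10.0, 4.0)
print([round(v, 3) for v in heads])
```

The converged heads fall on the expected straight line between the two fixed boundaries, which is the analytical steady-state solution for a homogeneous aquifer.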
Procedia PDF Downloads 145
1521 Automated Tracking and Statistics of Vehicles at the Signalized Intersection
Authors: Qiang Zhang, Xiaojian Hu
Abstract:
An intersection is a place where vehicles and pedestrians must pass through, turn, and evacuate. Obtaining the motion data of vehicles near an intersection is of great significance for transportation research. Since there are usually many targets with frequent conflicts between them, it is difficult to obtain vehicle motion parameters from traffic videos of intersections. According to the characteristics of traffic videos, this paper applies video technology to realize automated tracking, counting, and trajectory extraction of vehicles in order to collect traffic data from roadside surveillance cameras installed near intersections. Based on the video recognition method, vehicles in each lane near the intersection are tracked, their trajectories are extracted, and they are counted under various degrees of occlusion and visibility. The performance is compared with currently recognized CPU-based real-time tracking-by-detection algorithms. The speed of the presented system is higher than the others, and the system has better real-time performance. The accuracy of direction reached about 94.99% on average, and the accuracy of classification and statistics reached about 75.12% on average. Keywords: tracking and statistics, vehicle, signalized intersection, motion parameter, trajectory
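A minimal nearest-centroid tracker illustrates the track-count-trajectory idea (the coordinates and matching threshold below are toy assumptions; the paper's system is detector-based and handles occlusion, which this sketch does not):

```python
import math

class CentroidTracker:
    """Assign each detection to the nearest existing track within
    `max_dist`, otherwise open a new track; trajectories accumulate
    per track ID. A greedy sketch: it does not prevent two detections
    in one frame from competing for the same track."""
    def __init__(self, max_dist=50.0):
        self.max_dist = max_dist
        self.next_id = 0
        self.tracks = {}    # id -> last centroid
        self.paths = {}     # id -> trajectory (list of centroids)

    def update(self, detections):
        for det in detections:
            best, best_d = None, self.max_dist
            for tid, c in self.tracks.items():
                d = math.dist(det, c)
                if d < best_d:
                    best, best_d = tid, d
            if best is None:                  # no track close enough
                best = self.next_id
                self.next_id += 1
                self.paths[best] = []
            self.tracks[best] = det
            self.paths[best].append(det)

tracker = CentroidTracker()
frames = [[(10, 10), (200, 40)],    # two vehicles appear
          [(18, 12), (205, 44)],    # both move slightly
          [(26, 15), (211, 49)]]
for dets in frames:
    tracker.update(dets)
print(len(tracker.paths), [len(p) for p in tracker.paths.values()])
```

The vehicle count is the number of track IDs opened, and each accumulated path is the trajectory from which motion parameters (speed, direction) could be derived.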
Procedia PDF Downloads 214
1520 Engineering Seismological Studies in and around Zagazig City, Sharkia, Egypt
Authors: M. El-Eraki, A. A. Mohamed, A. A. El-Kenawy, M. S. Toni, S. I. Mustafa
Abstract:
The aim of this paper is to study ambient ground vibrations using the Nakamura technique to evaluate the relation between ground conditions and earthquake characteristics. Microtremor measurements were carried out at 55 sites in and around Zagazig city. The signals were processed using the horizontal-to-vertical spectral ratio (HVSR) technique to estimate the fundamental frequencies of the soil deposits and their corresponding H/V amplitudes. Seismic measurements were acquired at nine sites to record surface waves. The recorded waveforms were processed using the multi-channel analysis of surface waves (MASW) method to infer the shear wave velocity profile. The obtained fundamental frequencies were found to range from 0.7 to 1.7 Hz, and the maximum H/V amplitude reached 6.4. These results, together with the average shear wave velocity in the surface layers, were used to estimate the thickness of the uppermost soft cover layers (depth to bedrock). The sediment thickness generally increases in the northeastern and southwestern parts of the area, which is in good agreement with the local geological structure. The results of this work delineate the zones of higher potential damage in the event of an earthquake in the study area. Keywords: ambient vibrations, fundamental frequency, surface waves, Zagazig
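The HVSR computation itself can be sketched in a few lines (synthetic three-component noise stands in for real microtremor records, and the windowing and spectral smoothing used in practice are omitted): the amplitude spectra of the two horizontal components are quadratically combined and divided by the vertical spectrum, and the peak of the ratio marks the fundamental frequency.

```python
import numpy as np

def hvsr(ns, ew, ud, fs):
    """Horizontal-to-vertical spectral ratio (Nakamura technique sketch)."""
    freqs = np.fft.rfftfreq(len(ud), 1 / fs)
    N = np.abs(np.fft.rfft(ns))          # north-south amplitude spectrum
    E = np.abs(np.fft.rfft(ew))          # east-west amplitude spectrum
    V = np.abs(np.fft.rfft(ud))          # vertical amplitude spectrum
    h = np.sqrt((N**2 + E**2) / 2)       # combined horizontal spectrum
    return freqs[1:], h[1:] / V[1:]      # skip the DC bin

# Synthetic microtremor: horizontals amplified at 1.2 Hz relative to vertical
fs = 100
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
ud = rng.normal(size=t.size)
ns = 0.3 * rng.normal(size=t.size) + 5 * np.sin(2 * np.pi * 1.2 * t)
ew = 0.3 * rng.normal(size=t.size) + 5 * np.cos(2 * np.pi * 1.2 * t)
f, ratio = hvsr(ns, ew, ud, fs)
print(f[np.argmax(ratio)])   # fundamental frequency estimate
```

In the synthetic example the ratio peaks sharply at the 1.2 Hz resonance embedded in the horizontal components, inside the 0.7-1.7 Hz band reported in the abstract.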
Procedia PDF Downloads 280
1519 Scene Classification Using Hierarchy Neural Network, Directed Acyclic Graph Structure, and Label Relations
Authors: Po-Jen Chen, Jian-Jiun Ding, Hung-Wei Hsu, Chien-Yao Wang, Jia-Ching Wang
Abstract:
A more accurate scene classification algorithm using label relations and a hierarchy neural network was developed in this work. Many classification algorithms assume that labels are mutually exclusive. This assumption holds in some specific problems; however, for scene classification it is not reasonable: because a photo can contain a variety of objects, it is more practical to assign multiple labels to an image. In this paper, two label relations, the exclusive relation and the hierarchical relation, were adopted in the classification process to achieve more accurate multi-label classification results. Moreover, the hierarchy neural network (hierarchy NN) is applied to classify the image, and the directed acyclic graph structure is used to predict a more reasonable result that obeys the exclusive and hierarchical relations. Simulations show that, with these techniques, a much more accurate scene classification result can be achieved. Keywords: convolutional neural network, label relation, hierarchy neural network, scene classification
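The effect of the two label relations can be illustrated with a small post-processing sketch (the label names, scores, and propagation rule are illustrative; the paper enforces the relations through DAG inference inside the network rather than as a post-processing step): a predicted child label implies its ancestors, and of two mutually exclusive labels only the stronger survives.

```python
def enforce_relations(scores, hierarchy, exclusive, threshold=0.5):
    """Apply hierarchical and exclusive label relations to raw
    multi-label scores; returns the consistent label set."""
    eff = dict(scores)                 # effective scores after propagation
    labels = {l for l, s in scores.items() if s >= threshold}
    # Hierarchical relation: a predicted child implies its parent,
    # which inherits the child's score if higher.
    changed = True
    while changed:
        changed = False
        for child, parent in hierarchy.items():
            if child in labels:
                eff[parent] = max(eff.get(parent, 0.0), eff[child])
                if parent not in labels:
                    labels.add(parent)
                    changed = True
    # Exclusive relation: keep the higher-scoring label of each pair.
    for a, b in exclusive:
        if a in labels and b in labels:
            labels.discard(a if eff[a] < eff[b] else b)
    return labels

scores = {"outdoor": 0.2, "beach": 0.9, "indoor": 0.6, "sea": 0.7}
hierarchy = {"beach": "outdoor", "sea": "outdoor"}   # child -> parent
exclusive = [("indoor", "outdoor")]
labels = enforce_relations(scores, hierarchy, exclusive)
print(labels)
```

Here "beach" pulls in its ancestor "outdoor", and the inherited score then overrides the weaker, mutually exclusive "indoor" prediction, yielding a label set that obeys both relations.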
Procedia PDF Downloads 453
1518 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement
Authors: Pogula Rakesh, T. Kishore Kumar
Abstract:
Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this research work is to obtain a clean speech signal of higher quality by applying an optimal noise cancellation technique. Real-time adaptive filtering algorithms seem to be the best candidates among all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on a Recursive Least Squares (RLS) adaptive filter. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5 dB, 0 dB, 5 dB, and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal-to-Noise Ratio (SNR), and SNR loss. Based on the performance evaluation, the proposed RLS algorithm was found to be the better noise cancellation technique for speech signals. Keywords: adaptive filter, adaptive noise canceller, mean squared error, noise reduction, NLMS, RLS, SNR, SNR loss
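A compact NumPy sketch of an RLS adaptive noise canceller (a toy tone stands in for speech, and the filter order and forgetting factor are assumed values, not the paper's settings): the filter learns the path from the noise reference to the noise in the primary channel, and the a-priori error is the enhanced signal.

```python
import numpy as np

def rls_canceller(primary, reference, order=4, lam=0.999, delta=0.01):
    """RLS adaptive noise canceller: primary = speech + filtered noise,
    reference = noise; returns the a-priori error (enhanced speech)."""
    w = np.zeros(order)
    P = np.eye(order) / delta            # inverse correlation estimate
    out = np.zeros(len(primary))
    x = np.zeros(order)                  # sliding reference regressor
    for n in range(len(primary)):
        x = np.roll(x, 1)
        x[0] = reference[n]
        k = P @ x / (lam + x @ P @ x)    # RLS gain vector
        e = primary[n] - w @ x           # a-priori error = speech estimate
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
        out[n] = e
    return out

fs = 8000
t = np.arange(0, 1, 1 / fs)
speech = np.sin(2 * np.pi * 440 * t)     # toy "speech" tone
rng = np.random.default_rng(3)
noise = rng.normal(size=t.size)
# Noise reaches the primary microphone through a short FIR path
noisy = speech + np.convolve(noise, [0.8, -0.4], mode="full")[:t.size]
clean = rls_canceller(noisy, noise)
mse = np.mean((clean[1000:] - speech[1000:])**2)
print(mse)
```

Because the tone is uncorrelated with the noise reference, the filter converges to the noise path and the residual error approaches the clean signal, giving a small MSE after the initial convergence transient.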
Procedia PDF Downloads 478
1517 Development of Fake News Model Using Machine Learning through Natural Language Processing
Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini
Abstract:
Fake news detection research is still at an early stage, as this is a relatively new phenomenon that has raised interest in society. Machine learning helps to solve complex problems and to build AI systems, especially in cases where we have tacit knowledge or knowledge that is not explicitly known. For the identification of fake news, we applied three machine learning classifiers: Passive-Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification is not sufficient for fake news detection because generic classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of the text and then incorporating those features into the classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers to two publicly available datasets. Experimental analysis based on the existing datasets indicates very encouraging and improved performance. Keywords: fake news detection, natural language processing, machine learning, classification techniques
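The Passive-Aggressive classifier, one of the three applied, can be sketched with hashed bag-of-words features (the toy headlines, hashing dimension, and epoch count are assumptions; the paper's feature pipeline is not specified here): the weight vector is updated only when the hinge loss is positive, with the classic step size loss / ||x||².

```python
import numpy as np
import zlib

def featurize(text, dim=512):
    """Hashed bag-of-words vector (a toy stand-in for TF-IDF features)."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[zlib.crc32(tok.encode()) % dim] += 1.0
    return v / max(1.0, np.linalg.norm(v))

def train_pa(samples, dim=512, epochs=5):
    """Passive-Aggressive training: passive when the margin is met,
    aggressive (full correction) when the hinge loss is positive."""
    w = np.zeros(dim)
    for _ in range(epochs):
        for text, y in samples:              # y = -1 fake, +1 real
            x = featurize(text, dim)
            loss = max(0.0, 1.0 - y * (w @ x))
            if loss > 0.0:
                w += (loss / (x @ x)) * y * x
    return w

train = [("shocking miracle cure doctors hate", -1),
         ("celebrity secret aliens conspiracy exposed", -1),
         ("parliament passes budget after debate", +1),
         ("central bank raises interest rates", +1)]
w = train_pa(train)
pred = float(np.sign(w @ featurize("miracle cure exposed")))
print(pred)
```

On the toy headlines the learned weights push sensationalist vocabulary negative, so the unseen headline built from those tokens is labeled fake.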
Procedia PDF Downloads 162
1516 Digital Mapping as a Tool for Finding Cities' DNA
Authors: Sanja Peter
Abstract:
The transformation of urban environments can be compared to evolutionary processes. Systematic digital mapping of historical data can enable capturing some of these processes and their outcomes. For example, it may help reveal the structure of a city's historical DNA. Gathering historical data for automatic processing may provide a basis for cultural algorithms. The Gothenburg City Museum is trying to make the city's heritage information accessible through GIS platforms and is now partnering with academic institutions to find appropriate methods for making knowledge of the city's historical fabric accessible. Hopefully, this will be carried out through a project called Digital Twin Cities. One part of this large project, concerning matters of cultural heritage, will be conducted in collaboration with Chalmers University of Technology. The aim is to create a layered map showing the historical development of the city and to extract quantitative data about its built heritage, above and below ground. It will allow interpreting information from historic maps through, for example, the names of streets and places, geography, structural changes in the urban fabric, and information gathered from archaeologists' excavations. Through the study of these geographical, historical and local metamorphoses, the urban environment will reveal its metaphorical DNA, or its meme (Dawkins). Keywords: Gothenburg, mapping, cultural heritage, city history
Procedia PDF Downloads 138