Search results for: random common fixed point theorem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12830

12560 Dynamical Relation of Poisson Spike Trains in Hodgkin-Huxley Neural Ion Current Model and Formation of Non-Canonical Bases, Islands, and Analog Bases in DNA, mRNA, and RNA at or near the Transcription

Authors: Michael Fundator

Abstract:

A groundbreaking application of biomathematical and biochemical research on neural network processes to the formation of non-canonical bases, islands, and analog bases in DNA and mRNA at or near transcription, which contradicts the long-anticipated statistical assumptions for the distribution of bases and analog base compounds, is implemented through statistical and stochastic methods supplemented with quantum principles; here the usual transience of the Poisson spike train becomes a very instrumental tool for finding even almost-periodic solutions to the Fokker-Planck stochastic differential equation. The present article develops new multidimensional methods of finding solutions to stochastic differential equations, based on a more rigorous mathematical treatment via the Kolmogorov-Chentsov continuity theorem, which allows stochastic processes with jumps, under certain conditions, to have a γ-Hölder continuous modification; this is used as the basis for finding analogous parallels in the dynamics of neural networks and the formation of analog bases and transcription in DNA.
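
For reference, the continuity criterion invoked above can be stated in its standard form (generic constants, not specific to the paper's setting): if a process X on [0,T] satisfies

\mathbb{E}\big[\,|X_t - X_s|^{\alpha}\big] \;\le\; C\,|t-s|^{1+\beta} \qquad \text{for all } s,t \in [0,T],

for some constants \alpha, \beta, C > 0, then X admits a modification whose paths are \gamma-Hölder continuous for every \gamma \in (0, \beta/\alpha).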

Keywords: Fokker-Planck stochastic differential equation, Kolmogorov-Chentsov continuity theorem, neural networks, translation and transcription

Procedia PDF Downloads 370
12559 Social Accountability: Persuasion and Debate to Contain Corruption

Authors: A. Lambert-Mogiliansky

Abstract:

In this paper, we investigate the properties of simple reappointment rules aimed at holding a public official accountable and monitoring his activity. The public official allocates budget resources to various activities, which results in the delivery of public services to citizens. He has discretion over the use of resources, so he can divert some of them for private ends. Because of a liability constraint, zero diversion can never be secured in all states. The optimal reappointment mechanism under complete information is shown to exhibit some leniency, thus departing from the zero-tolerance principle. Under asymmetric information (about the state), a rule with random verification in a pre-announced subset is shown to be optimal within a class of common rules. Surprisingly, those common rules make little use of hard information about service delivery when it is available. Similarly, the public official's claim about his record is of no value in improving the performance of the examined rules. In contrast, requesting that the public official defend his record publicly can be very useful if service users are given the chance to refute false claims with cheap-talk complaints: the first-best complete-information outcome can be approached in the absence of any observation by the manager of the accountability mechanism.

Keywords: accountability, corruption, persuasion, debate

Procedia PDF Downloads 354
12558 Verifiable Secure Computation of Large Scale Two-Point Boundary Value Problems Using Certificate Validation

Authors: Yogita M. Ahire, Nedal M. Mohammed, Ahmed A. Hamoud

Abstract:

Scientific computation outsourcing is gaining popularity because it allows customers with limited computing resources and storage to outsource complex computation workloads to more powerful service providers. However, it raises security and privacy concerns and challenges, such as the privacy of customer inputs and outputs, as well as cheating behavior by the cloud. Motivated by these concerns, this study focuses on privacy-preserving two-point Boundary Value Problems (BVP) as a common and realistic instance of verifiable secure multiparty computing. We examine a secure and verifiable scheme with correctness guarantees that uses standard multiparty approaches to compute the result of a computation and then uses verification only to check that the result is correct.
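
As a rough illustration of the verify-then-accept idea (not the certificate-validation scheme of the paper), a client can discretize a two-point BVP, hand off the linear solve, and cheaply check the returned answer by substituting it back into the system; in the sketch below, np.linalg.solve merely stands in for the outsourced computation.

import numpy as np

# Finite-difference discretization of the two-point BVP
#   -y''(x) = f(x),  y(0) = y(1) = 0,  on n interior grid points.
n = 200
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
b = np.sin(np.pi * x)                       # example right-hand side

A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2  # tridiagonal stiffness matrix

# "Outsourced" solve: in a real deployment this call is replaced by the
# cloud service; here np.linalg.solve stands in for it.
y = np.linalg.solve(A, b)

# Verification performed by the client: substitute the answer back and
# accept it only if the relative residual is small.
residual = np.linalg.norm(A @ y - b) / np.linalg.norm(b)
assert residual < 1e-8, "returned solution rejected"
print(f"solution accepted, relative residual = {residual:.2e}")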

Keywords: verifiable computing, cloud computing, secure and privacy BVP, secure computation outsourcing

Procedia PDF Downloads 63
12557 A Graph Theoretic Algorithm for Bandwidth Improvement in Computer Networks

Authors: Mehmet Karaata

Abstract:

Given two distinct vertices (nodes), source s and target t, of a graph G = (V, E), the two node-disjoint paths problem is to identify two node-disjoint paths between s ∈ V and t ∈ V. Two paths are node-disjoint if they have no common intermediate vertices. In this paper, we present an algorithm with O(m) time complexity for finding two node-disjoint paths between s and t in arbitrary graphs, where m is the number of edges. The proposed algorithm has a wide range of applications in ensuring the reliability and security of sensor, mobile, and fixed communication networks.
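
For orientation only, the sketch below uses NetworkX's flow-based routine to extract two node-disjoint paths on a toy graph; it illustrates the problem statement, not the O(m)-time algorithm proposed here.

import networkx as nx

# Toy undirected graph containing two internally node-disjoint s-t routes.
G = nx.Graph()
G.add_edges_from([
    ("s", "a"), ("a", "b"), ("b", "t"),   # first candidate route
    ("s", "c"), ("c", "d"), ("d", "t"),   # second candidate route
    ("a", "c"),                           # extra cross edge
])

# NetworkX finds node-disjoint paths via a max-flow reduction
# (node splitting with unit capacities), not in O(m) time.
for path in nx.node_disjoint_paths(G, "s", "t"):
    print(path)   # e.g. ['s', 'a', 'b', 't'] and ['s', 'c', 'd', 't']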

Keywords: disjoint paths, distributed systems, fault-tolerance, network routing, security

Procedia PDF Downloads 416
12556 ATC in Competitive Electricity Market Using TCSC

Authors: S. K. Gupta, Richa Bansal

Abstract:

In a deregulated power system structure, power producers and customers share a common transmission network for wheeling power from the point of generation to the point of consumption. All parties in this open-access environment may try to purchase energy from the cheapest source for greater profit margins, which may lead to overloading and congestion of certain corridors of the transmission network. This may result in violation of line flow, voltage, and stability limits and thereby undermine system security. Utilities therefore need to determine their Available Transfer Capability (ATC) adequately to ensure that system reliability is maintained while serving a wide range of bilateral and multilateral transactions. This paper presents power transfer distribution factors based on AC load flow for the determination and enhancement of ATC. The study has been carried out for the IEEE 24-bus Reliability Test System.
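
In its simplest linear form (generic notation, ignoring counter-flows and non-thermal limits, and not tied to the paper's exact formulation), the PTDF-based ATC for a transfer from bus i to bus j is the smallest transfer that drives some line to its limit:

\mathrm{ATC}_{ij} \;=\; \min_{l \,:\, \mathrm{PTDF}_{l,ij} > 0} \; \frac{P_l^{\max} - P_l^{0}}{\mathrm{PTDF}_{l,ij}},

where P_l^0 is the base-case flow on line l, P_l^max its limit, and PTDF_{l,ij} the change in flow on line l per MW transferred from i to j.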

Keywords: available transfer capability, FACTS devices, power transfer distribution factors, electric

Procedia PDF Downloads 473
12555 Evolutionary Methods in Cryptography

Authors: Wafa Slaibi Alsharafat

Abstract:

Genetic algorithms (GA) are randomized algorithms: random numbers generated during the run of the algorithm determine what happens. This means that if a GA is applied twice to optimize exactly the same problem, it might produce two different answers. In this project, we propose an evolutionary algorithm and a Genetic Algorithm (GA) to be implemented in symmetric encryption and decryption. Here, the user's message and the user's secret information (key), which represent the plain text, are transformed into cipher text.
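
As a minimal illustration of the randomness referred to above (not the authors' cipher design), the snippet below applies one-point crossover and bit-flip mutation to two candidate key bitstrings; running the same code twice can yield different offspring.

import random

def one_point_crossover(parent_a, parent_b):
    """Swap the tails of two equal-length bit lists at a random cut point."""
    cut = random.randint(1, len(parent_a) - 1)
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def mutate(bits, rate=0.05):
    """Flip each bit independently with the given probability."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

key_a = [random.randint(0, 1) for _ in range(16)]
key_b = [random.randint(0, 1) for _ in range(16)]
child_1, child_2 = one_point_crossover(key_a, key_b)
print(mutate(child_1), mutate(child_2))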

Keywords: GA, encryption, decryption, crossover

Procedia PDF Downloads 414
12554 Study of Bored Pile Retaining Wall Using Physical Modeling

Authors: Amin Eslami, Jafar Bolouri Bazaz

Abstract:

Excavation and retaining walls are challenging issues in civil engineering. In this study, the behavior of one of the important types of supporting systems, the Contiguous Bored Pile (CBP) retaining wall, is investigated using a physical model. Besides, a comparison is made between two modes, free-end piles (soft bed) and fixed-end piles (stiff bed). Also, a back-calculation of the effective length (the real free length of the pile) is done by measuring the lateral deflection of the piles at different stages of excavation in both aforementioned cases. Based on the observed results, for the fixed-end mode, the effective length to free length ratio (Leff/L0) is equal to unity in the initial stages of excavation and decreases below 1 in its final stages, while for the free-end mode this ratio remains constant during all stages of excavation and is always less than unity.

Keywords: contiguous bored pile wall, effective length, fixed end, free end, free length

Procedia PDF Downloads 366
12553 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The question that motivates this work is how many devote themselves to discovering something in the world of science, where much is discerned and revealed, but at the same time much remains unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable with the aim of remaining constant during the run of the algorithm. According to the given key, the string is divided into several substrings, each of length k characters. The next step involves encoding each substring from the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in the same manner until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character isn't used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it performs better than the other methods in terms of execution time and storage space.
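
A minimal sketch of one possible reading of the procedure described above (the padding rule and other details are omitted, and the parameter a = 3 is an arbitrary choice for illustration):

import random
import string

ALPHABET = string.ascii_lowercase

def caesar(text, shift):
    """Shift alphabetic characters cyclically; leave everything else as-is."""
    return "".join(ALPHABET[(ALPHABET.index(c) + shift) % 26] if c in ALPHABET else c
                   for c in text)

def latin_djokovic_encrypt(plaintext, a=3):
    b = a + 3
    k0 = random.randint(a, b)       # random key, kept constant for chunking
    chunks = [plaintext[i:i + k0] for i in range(0, len(plaintext), k0)]
    shift, cipher = k0, []
    for chunk in chunks:
        cipher.append(caesar(chunk, shift))
        shift += 1                  # the next substring uses a larger shift
        if shift > b + 1:           # wrap back to the initial key value
            shift = k0
    return k0, "".join(cipher)

def latin_djokovic_decrypt(ciphertext, k0, a=3):
    b = a + 3
    chunks = [ciphertext[i:i + k0] for i in range(0, len(ciphertext), k0)]
    shift, plain = k0, []
    for chunk in chunks:
        plain.append(caesar(chunk, -shift))
        shift += 1
        if shift > b + 1:
            shift = k0
    return "".join(plain)

key, ct = latin_djokovic_encrypt("attackatdawn")
assert latin_djokovic_decrypt(ct, key) == "attackatdawn"
print(key, ct)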

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 78
12552 Evaluation of the Appropriateness of Common Oxidants for Ruthenium (II) Chemiluminescence in a Microfluidic Detection Device Coupled to Microbore High Performance Liquid Chromatography for the Analysis of Drugs in Formulations and Biological Fluids

Authors: Afsal Mohammed Kadavilpparampu, Haider A. J. Al Lawati, Fakhr Eldin O. Suliman, Salma M. Z. Al Kindy

Abstract:

In this work, we evaluated the appropriateness of various oxidants that can potentially be used with the Ru(bipy)32+ CL system while performing CL detection in a microfluidic device, using eight common active pharmaceutical ingredients: ciprofloxacin, hydrochlorothiazide, norfloxacin, buspirone, fexofenadine, cetirizine, codeine, and dextromethorphan. This is because microfluidic devices have a very small channel volume and the residence time is also very short; hence, a highly efficient oxidant is required for on-chip CL detection to obtain analytically acceptable CL emission. Three common oxidants were evaluated: lead dioxide, cerium ammonium sulphate, and ammonium peroxydisulphate. The results showed that ammonium peroxydisulphate is the most appropriate oxidant for a microfluidic setup, and all the tested analytes give strong CL emission when this oxidant is used. We also found that Ru(bipy)33+ generated off-line by oxidizing [Ru(bipy)3]Cl2.6H2O in acetonitrile under acidic conditions with lead dioxide was stable for more than 72 hrs. A highly sensitive microbore HPLC-CL method using ammonium peroxydisulphate as the oxidant in microfluidic on-chip CL detection has been developed for the analysis of fixed-dose combinations of pseudoephedrine (PSE), fexofenadine (FEX), and cetirizine (CIT) in biological fluids and pharmaceutical formulations with minimum sample pre-treatment.

Keywords: oxidants, microbore High Performance Liquid Chromatography, chemiluminescence, microfluidics

Procedia PDF Downloads 412
12551 Forecasting the Fluctuation of Currency Exchange Rate Using Random Forest

Authors: Lule Basha, Eralda Gjika

Abstract:

The exchange rate is one of the most important economic variables, especially for a small, open economy such as Albania. Its effect is noticeable in a country's competitiveness, trade and current account, inflation, wages, domestic economic activity, and bank stability. This study investigates the fluctuation of Albania's exchange rate using the monthly average Euro (EUR) to Albanian Lek (ALL) exchange rate over the period January 2008 to June 2021, together with the macroeconomic factors that have a significant effect on the exchange rate. Initially, a Random Forest regression model is constructed to understand the impact of economic variables on the behavior of the monthly average exchange rate. Then the macroeconomic indicators are forecast 12 months ahead using time series models. The predicted values are fed into the random forest model in order to obtain the average monthly forecast of the Euro to Albanian Lek (ALL) exchange rate for the period July 2021 to June 2022.
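
A compressed sketch of this two-stage setup, with synthetic numbers standing in for the EUR/ALL series and the macroeconomic indicators (the actual study fits time-series models for the 12-month indicator forecasts; a naive persistence forecast is used below):

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic monthly data: macro indicators and the exchange rate they drive.
n_months = 162                                  # Jan 2008 - Jun 2021
X = pd.DataFrame({
    "inflation":     rng.normal(2.0, 0.5, n_months),
    "trade_balance": rng.normal(-0.3, 0.1, n_months),
    "interest_rate": rng.normal(1.0, 0.3, n_months),
})
y = 123 + 2.0 * X["inflation"] - 5.0 * X["trade_balance"] + rng.normal(0, 0.5, n_months)

# Stage 1: fit the Random Forest on historical indicators vs. exchange rate.
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

# Stage 2: forecasts of the indicators for the next 12 months are fed back
# into the fitted forest to obtain the exchange-rate forecast.
X_future = pd.concat([X.tail(1)] * 12, ignore_index=True)
print(model.predict(X_future))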

Keywords: exchange rate, random forest, time series, machine learning, prediction

Procedia PDF Downloads 71
12550 Optical Flow Based System for Cross Traffic Alert

Authors: Giuseppe Spampinato, Salvatore Curti, Ivana Guarneri, Arcangelo Bruna

Abstract:

This document describes an advanced system and methodology for Cross Traffic Alert (CTA), able to detect vehicles that move into the vehicle's driving path from the left or right side. The camera may be mounted not only on a stationary vehicle, e.g., at a traffic light or at an intersection, but also on a slowly moving one, e.g., in a car park. In all of the aforementioned conditions, a driver's short loss of concentration or distraction can easily lead to a serious accident. The proposed system represents a valid support for avoiding these kinds of car crashes. It is an extension of our previous work on a clustering system, which only works with fixed cameras. Just a vanishing point calculation and simple optical flow filtering, to eliminate motion vectors due to the car's own movement, are performed, letting the system achieve high performance with different scenarios, cameras, and resolutions. The proposed system uses only the optical flow as input, which is implemented in hardware on the proposed platform; since the whole processing chain is very fast and low in power consumption, it is inserted directly into the camera framework, allowing all the processing to be executed in real time.
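
A toy sketch of the dense optical-flow front end, using OpenCV's Farnebäck estimator on two synthetic frames; the vanishing-point computation and ego-motion filtering of the actual system are reduced here to a crude magnitude/direction test.

import numpy as np
import cv2

# Two synthetic grayscale frames: a bright block shifting to the right,
# standing in for a vehicle crossing the driving path.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = np.zeros_like(prev)
prev[50:70, 40:60] = 255
curr[50:70, 46:66] = 255

# Dense optical flow (generated in hardware on the paper's platform).
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Crude cross-traffic test: enough pixels with strong lateral motion.
fx, fy = flow[..., 0], flow[..., 1]
lateral = (np.abs(fx) > 2.0) & (np.abs(fx) > 2.0 * np.abs(fy))
if lateral.sum() > 100:
    print("cross traffic alert: lateral motion detected")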

Keywords: clustering, cross traffic alert, optical flow, real time, vanishing point

Procedia PDF Downloads 172
12549 Mobile Learning: Toward Better Understanding of Compression Techniques

Authors: Farouk Lawan Gambo

Abstract:

Data compression shrinks files into fewer bits than their original representation. It is especially advantageous on the internet because the smaller a file is, the faster it can be transferred, but most of the concepts in data compression are abstract in nature, making them difficult for some students (engineers in particular) to digest. To determine the best approach to learning data compression techniques, this paper first studies the learning preferences of engineering students, who tend to have strong active, sensing, visual, and sequential learning preferences; the paper also reviews the advantages that mobile learning offers: learning at the point of interest, efficiency, connection, and many more. A survey was carried out with a reasonable number of students, selected through random sampling, to see whether taking the learning preferences and the advantages of mobile learning into account gives a promising improvement over the traditional way of learning. Evidence from data analysis carried out in MS Excel shows that there is a significant difference in the students after using learning content provided on smartphones; the findings, presented in bar charts and pie charts, indicate that mobile learning is a promising feature of learning.

Keywords: data analysis, compression techniques, learning content, traditional learning approach

Procedia PDF Downloads 323
12548 Mathematical Based Forecasting of Heart Attack

Authors: Razieh Khalafi

Abstract:

Myocardial infarction (MI) or acute myocardial infarction (AMI), commonly known as a heart attack, occurs when blood flow stops to part of the heart, causing damage to the heart muscle. An ECG can often show evidence of a previous heart attack or one that is in progress. The patterns on the ECG may indicate which part of the heart has been damaged, as well as the extent of the damage. In chaos theory, the correlation dimension is a measure of the dimensionality of the space occupied by a set of random points, often referred to as a type of fractal dimension. In this research, by considering the ECG signal as a random walk, we work on forecasting an oncoming heart attack by analyzing the ECG signals using the correlation dimension. In order to test the model, a set of ECG signals for patients before and after heart attack was used, and the strength of the model for forecasting the behavior of these signals was checked. Results show that this methodology can forecast the ECG and, accordingly, a heart attack with high accuracy.
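
A minimal Grassberger-Procaccia-style estimate of the correlation dimension of a delay-embedded signal (a synthetic series is used below; the study applies the idea to ECG records):

import numpy as np

def correlation_dimension(signal, emb_dim=4, lag=2, n_radii=10):
    """Estimate the correlation dimension D2 of a 1-D signal via the
    Grassberger-Procaccia correlation sum on a delay embedding."""
    n = len(signal) - (emb_dim - 1) * lag
    # Delay embedding: rows are points in the reconstructed phase space.
    X = np.column_stack([signal[i * lag:i * lag + n] for i in range(emb_dim)])
    # Pairwise distances between embedded points (upper triangle only).
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    d = d[np.triu_indices(n, k=1)]
    radii = np.logspace(np.log10(np.percentile(d, 5)),
                        np.log10(np.percentile(d, 50)), n_radii)
    # Correlation sum C(r) and the slope of log C(r) versus log r.
    C = np.array([(d < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

t = np.linspace(0, 40, 1000)
series = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(len(t))  # stand-in signal
print(f"estimated correlation dimension ~ {correlation_dimension(series):.2f}")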

Keywords: heart attack, ECG, random walk, correlation dimension, forecasting

Procedia PDF Downloads 504
12547 The Predictability of Three Implants to Support a Fixed Prosthesis in the Edentulous Mandible

Authors: M. Hirani, M. Devine, O. Obisesan, C. Bryant

Abstract:

Introduction: The use of four or more implants to support a fixed prosthesis in the edentulous mandible is well documented, with high levels of clinical outcomes recorded. Despite this, three-implant-supported fixed prostheses offer the potential to deliver a more cost-effective method of oral rehabilitation in the lower arch, an important consideration given that edentulism is most prevalent in low-income subpopulations. This study aimed to evaluate the implant and prosthetic survival rates, changes in marginal bone level, and patient satisfaction associated with a three-implant-supported fixed prosthesis for rehabilitation of the edentulous mandible over a follow-up period of at least one year. Methods: A comprehensive literature search was performed to evaluate studies that met the selection criteria. The information extracted included the study design and population, participant demographics, observation period, loading protocol, and the number of implants placed, together with the required outcome measures. Mean values and standard deviations (SD) were calculated using SPSS® (IBM Corporation, New York, USA), and the level of statistical significance across all comparative studies was set at P < 0.05. Results: The eligible studies included a total of 1968 implants placed in 652 patients. The subjects ranged in age from 33 to 89 years, with a mean of 63.2 years. The mean cumulative implant and prosthetic survival rates were 95.5% and 96.2%, respectively, over a mean follow-up period of 3.25 years. The mean marginal bone loss recorded was 1.04 mm, and high patient satisfaction rates were reported across the studies. Conclusion: Current evidence suggests that a three-implant-supported fixed prosthesis for the edentulous mandible is a successful treatment strategy, presenting high implant and prosthetic survival rates over the short-to-medium term. Further well-designed controlled clinical trials are required to evaluate longer-term outcomes, with supplemental data correlating implant dimensions and prosthetic design.

Keywords: implants, mandible, fixed, prosthesis

Procedia PDF Downloads 108
12546 Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations in war or peace time, the radar operator gets a scatter of targets over the screen. This may be a tracked vehicle like a tank, e.g., T72, BMP, etc., or it may be a wheeled vehicle like ALS, TATRA, 2.5 Tonne, Shaktiman, or moving army, moving convoys, etc. The radar operator selects one of the promising targets into single target tracking (STT) mode. Once the target is locked, the operator gets a typical audible signal in his headphones. With reference to the experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify random objects. The process of classification is based on transforming the audible signature of the target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.
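
A schematic version of the FFT-plus-PCA signal chain on synthetic audible signatures (the octave-note mapping and the live radar data of the study are not reproduced):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
fs = 8000                                   # sample rate in Hz
t = np.arange(fs) / fs                      # one second of audio

def signature(fundamental):
    """Synthetic audible Doppler signature: a fundamental plus one harmonic."""
    return (np.sin(2 * np.pi * fundamental * t)
            + 0.5 * np.sin(2 * np.pi * 2 * fundamental * t)
            + 0.1 * rng.standard_normal(t.size))

# Two notional target classes (e.g. tracked vs. wheeled) with different tones.
signals = np.array([signature(f) for f in [220] * 10 + [440] * 10])
labels = [0] * 10 + [1] * 10

# Step 1: FFT magnitude spectrum of each audible signature.
spectra = np.abs(np.fft.rfft(signals, axis=1))

# Step 2: PCA compresses the spectra to a few discriminative components,
# on which a simple classifier (or octave-note mapping) can then operate.
components = PCA(n_components=2).fit_transform(spectra)
print(components.shape)                     # (20, 2)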

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 369
12545 The Role of Human Capital in the Evolution of Inequality and Economic Growth in Latin-America

Authors: Luis Felipe Brito-Gaona, Emma M. Iglesias

Abstract:

There is a growing literature that studies the main determinants and drivers of inequality and economic growth in several countries, using panel data and different estimation methods (fixed effects, the Generalized Method of Moments (GMM), and Two-Stage Least Squares (TSLS)). A recent study examined the evolution of these variables over the period 1980-2009 in the 18 countries of Latin America and found that one of the main variables explaining their evolution was Foreign Direct Investment (FDI). We extend this study to 2015 for the same 18 Latin American countries, and we find that FDI no longer plays a significant role, while we find significant negative and positive effects of schooling levels on inequality and economic growth, respectively. We also find that the point estimates associated with human capital are the largest among the variables included in the analysis; this means that an increase in human capital (measured by schooling levels of secondary education) is the main determinant that can help reduce inequality and increase economic growth in Latin America. Therefore, we advise that economic policies in Latin America should be directed towards increasing the level of education. We estimate by fixed effects, GMM, and TSLS to check the robustness of our results, and our conclusion is the same regardless of the estimation method. We also find that the international recession of 2008 significantly reduced economic growth in the Latin American countries.
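
A bare-bones version of the fixed-effects (within) estimator on a synthetic country-year panel; the GMM and TSLS estimators, and the study's actual data, are not reproduced here.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
countries = [f"c{i}" for i in range(18)]
years = range(1980, 2016)

# Synthetic panel: growth depends on schooling plus a country fixed effect.
df = pd.DataFrame([(c, y) for c in countries for y in years],
                  columns=["country", "year"])
country_effect = dict(zip(countries, rng.normal(0, 1, len(countries))))
df["schooling"] = rng.normal(8, 2, len(df))
df["growth"] = (0.4 * df["schooling"]
                + df["country"].map(country_effect)
                + rng.normal(0, 1, len(df)))

# Within transformation: demean by country to sweep out the fixed effects,
# then run OLS on the demeaned variables.
demeaned = df.groupby("country")[["growth", "schooling"]].transform(
    lambda s: s - s.mean())
beta = np.linalg.lstsq(demeaned[["schooling"]].to_numpy(),
                       demeaned["growth"].to_numpy(), rcond=None)[0]
print(f"fixed-effects estimate of the schooling coefficient: {beta[0]:.3f}")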

Keywords: economic growth, human capital, inequality, Latin-America

Procedia PDF Downloads 198
12544 Temporal Fixed Effects: The Macroeconomic Implications on Industry Return

Authors: Mahdy Elhusseiny, Richard Gearhart, Mariam Alyammahi

Abstract:

In this study, we analyse the impact of a number of major macroeconomic variables on industry-specific excess rates of return. In later specifications, we include time and recession fixed effects to capture time-specific trends that may have been changing over our panel. We have a number of results worth mentioning. Seasonal and temporal factors are found to play a very large role in sector-specific excess returns. Increases in M1 (money supply) decrease bank, insurance, real estate, and telecommunications excess returns, while increasing industrial and transportation excess returns. The results indicate that the market return increases every sector-specific rate of return. The 2007 to 2009 recession significantly reduced excess returns in the bank, real estate, and transportation sectors.

Keywords: macroeconomic factors, industry returns, fixed effects, temporal factors

Procedia PDF Downloads 51
12543 Solving Process Planning and Scheduling with Number of Operation Plus Processing Time Due-Date Assignment Concurrently Using a Genetic Search

Authors: Halil Ibrahim Demir, Alper Goksu, Onur Canpolat, Caner Erden, Melek Nur

Abstract:

Traditionally, process planning, scheduling, and due-date assignment are performed sequentially and separately. The high interrelation between these functions makes their integration very useful. Although there are numerous works on integrated process planning and scheduling and many works on scheduling with due-date assignment, there are only a few works on the integration of these three functions. Here we tested different integration levels of these three functions and found the fully integrated version to be the best. We applied genetic search and random search, and genetic search was found to be better than random search. We penalized all earliness, tardiness, and due-date-related costs; since all three terms are undesired, it is better to penalize all of them.
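
A sketch of the kind of penalty such a genetic search would minimize (illustrative weights; the paper's exact cost structure and chromosome encoding are not shown):

def schedule_cost(completion_times, due_dates,
                  w_early=1.0, w_tardy=2.0, w_due=0.1):
    """Penalize earliness, tardiness and the length of the quoted due dates,
    since all three are undesirable in the integrated problem."""
    cost = 0.0
    for c, d in zip(completion_times, due_dates):
        cost += w_early * max(d - c, 0)   # finished too early: holding cost
        cost += w_tardy * max(c - d, 0)   # finished too late: tardiness cost
        cost += w_due * d                 # long quoted due dates are penalized too
    return cost

# A genetic search evaluates candidate (process plan, schedule, due-date)
# chromosomes with a fitness of this kind and keeps the cheapest ones.
print(schedule_cost([5, 9, 14], [6, 8, 14]))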

Keywords: process planning, scheduling, due-date assignment, genetic algorithm, random search

Procedia PDF Downloads 351
12542 Daily Variations of Polycyclic Aromatic Hydrocarbons (PAHs) in Industrial Sites in a Suburban Area of Sour El Ghozlane, Algeria

Authors: Sidali Khedidji, Noureddine Yassaa, Riad Ladji

Abstract:

In this study, n-alkanes, which are hazardous for the environment and human health, were investigated in the suburban atmosphere of Sour El Ghozlane at a sampling point from April 2013 to May 2013. Ambient concentration measurements of n-alkanes were carried out in a regional study of the cement industry in Sour El Ghozlane. During sampling, the airborne particulate matter was enriched onto PTFE filters using two medium-volume samplers, with and without a size-selective inlet, for PM10 and TSP, and each sampling period lasted approximately 24 h. The organic compounds were characterized using gas chromatography coupled with mass spectrometric detection (GC-MS). Total concentrations of n-alkanes recorded in suburban Sour El Ghozlane ranged from 42 to 69 ng m-3. The gravimetric method was applied to the black smoke concentration data for the spring season. The 24 h average concentrations of n-alkanes contained in the PM10 and TSP of the Sour El Ghozlane suburban atmosphere were found in the ranges 0.50–7.06 ng/m3 and 0.29–6.97 ng/m3, respectively, during the sampling period. Meteorological factors such as relative humidity and temperature were found to affect the PM levels, especially PM10. Air temperature did not seem to significantly affect TSP and PM10 mass concentrations. The guide value fixed by the European Community, 40 μg/m3, not to be exceeded on more than 35 days, was exceeded in some samples. However, it should be noted that the limit value fixed by the Algerian regulations, 80 μg/m3, was exceeded in one sample during the study period.

Keywords: n-alkanes, PM10, TSP, particulate matter, cement industry

Procedia PDF Downloads 373
12541 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types

Authors: Chaghoub Soraya, Zhang Xiaoyan

Abstract:

This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. This problem has previously been addressed with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. The addressed problem is a combination of two well-studied problems: the p-median problem and the network design problem. The introduced algorithm is a random sampling approximation algorithm with a constant factor, conceived by using random sampling techniques from the literature. It is based on a redistribution lemma from the literature and a Steiner tree problem as a subproblem. The algorithm is simple and relies on the notions of random sampling and probability. The proposed approach gives an approximate solution with a constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.

Keywords: approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median

Procedia PDF Downloads 165
12540 A New Mathematical Method for Heart Attack Forecasting

Authors: Razi Khalafi

Abstract:

Myocardial Infarction (MI) or acute Myocardial Infarction (AMI), commonly known as a heart attack, occurs when blood flow stops to part of the heart, causing damage to the heart muscle. An ECG can often show evidence of a previous heart attack or one that is in progress. The patterns on the ECG may indicate which part of the heart has been damaged, as well as the extent of the damage. In chaos theory, the correlation dimension is a measure of the dimensionality of the space occupied by a set of random points, often referred to as a type of fractal dimension. In this research, by considering the ECG signal as a random walk, we work on forecasting an oncoming heart attack by analysing the ECG signals using the correlation dimension. In order to test the model, a set of ECG signals for patients before and after heart attack was used, and the strength of the model for forecasting the behaviour of these signals was checked. Results show that this methodology can forecast the ECG and, accordingly, a heart attack with high accuracy.

Keywords: heart attack, ECG, random walk, correlation dimension, forecasting

Procedia PDF Downloads 467
12539 Comparison Between Tension Band Wiring Using K-Wires and Cannulated Screws in Transverse Patella Fracture Fixation

Authors: Daniel Francis, Mo Yassin

Abstract:

Transverse patella fractures are routinely fixed using tension band wiring (TBW) with Kirschner wires and a wire in a figure-of-8 configuration. The idea of the study was to compare the outcomes of the traditional technique against the more recently used cannulated screws and fiber tape in a figure-of-8 configuration. We performed a retrospective cohort study of all surgically fixed patella fractures from 2019 to 2022. The patients were divided into two groups: a TBW group and a cannulated screws group. The primary outcome measure was failure of fixation and the need for removal of metalwork. Twenty-six patellar fractures were studied. TBW was used in 14 (53.8%), and cannulated screws were used for fixation in 12 (46.2%). There was one incident of metalwork failure in the TBW group and one in the cannulated screws group. Five patients (35.7%) in the TBW group needed symptomatic metalwork removed, compared with one (8.3%) in the cannulated screw group. In both groups, the rate of fixation failure was low. Symptomatic implants, the most common complication observed, were more frequent in the TBW group in our practice. Despite the small numbers in both groups, the hope of this study is to shine a light on the use of cannulated screws for patella fractures, as this would reduce the need for a second operation, reduce the load on already stretched services, and improve the patient experience by not requiring further surgery. Although this is not a brand-new technique, it is not commonly used, as there have not yet been any studies demonstrating the lower rates of second surgery needed.

Keywords: patella, tension band wiring, randomised, new technique

Procedia PDF Downloads 50
12538 Peruvian Diagnostic Reference Levels for Patients Undergoing Different X-Ray Procedures

Authors: Andres Portocarrero Bonifaz, Caterina Sandra Camarena Rodriguez, Ricardo Palma Esparza, Nicolas Antonio Romero Carlos

Abstract:

Reference levels for common X-ray procedures have been set in many protocols. In Peru, during quality control tests, the dose tolerance is set by these international recommendations. Nevertheless, further studies can be made to assess the national reality and relate dose levels to different parameters such as kV, mA/mAs, exposure time, type of processing (digital, digitized, or conventional), etc. In this paper, three radiologic procedures were taken into account: general X-rays (fixed and mobile), intraoral X-rays (fixed, mobile, and portable), and mammography. For this purpose, an Unfors Xi detector was used; the dose was measured at a focus-detector distance which varied depending on the procedure, and was corrected afterward to find the entrance surface dose. The data used in this paper were gathered over a period of more than 3 years (2015-2018). In addition, each X-ray machine was taken into consideration only once. The results aim to establish a new standard that reflects local practice and to address the issues of the 'Bonn Call for Action' in Peru. For this purpose, the 75th percentile of the dose for each radiologic procedure was calculated. In future quality control services, machines with dose values higher than the selected threshold should be reported as surpassing the reference dose levels established in comparison with other radiological centers in the country.

Keywords: general X-rays, intraoral X-rays, mammography, reference dose levels

Procedia PDF Downloads 124
12537 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm

Authors: Thanh Noi Phan, Martin Kappas, Jan Degener

Abstract:

The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications, due to its fine spatial resolution, multispectral bands, and high temporal resolution. So far, there are only a handful of studies using real S2 data for land cover classification. Especially in northern Vietnam, to the best of our knowledge, there are no studies using S2 data for land cover mapping. The aim of this study is to provide a preliminary result of land cover classification using Sentinel-2 data with a rising state-of-the-art classifier, Random Forest. A case study with heterogeneous land use/cover in the east of Hanoi Capital, Vietnam, was chosen. All ten S2 spectral bands with 10 m and 20 m pixel size were used, and the 10 m bands were resampled to 20 m. Among several classification algorithms, the supervised Random Forest (RF) classifier was applied because it has been reported as one of the most accurate methods for satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the land cover classification results. A very high overall accuracy, above 90%, was achieved.
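
A condensed mock-up of the classification step, with random numbers standing in for the ten resampled S2 bands and the reference labels; the per-band importance check at the end is the kind of evidence behind the red-edge/SWIR finding.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
bands = ["B2", "B3", "B4", "B5", "B6", "B7", "B8", "B8A", "B11", "B12"]

# Mock pixel spectra (n_pixels x 10 bands) and land-cover class labels.
X = rng.random((5000, len(bands)))
y = (3 * X[:, bands.index("B11")] + 2 * X[:, bands.index("B5")]
     + 0.3 * rng.random(5000)).round().astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
for band, imp in sorted(zip(bands, rf.feature_importances_), key=lambda p: -p[1]):
    print(f"{band}: {imp:.3f}")           # per-band importance ranking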

Keywords: classify algorithm, classification, land cover, random forest, sentinel 2, Vietnam

Procedia PDF Downloads 350
12536 Sensitivity Analysis of Pile-Founded Fixed Steel Jacket Platforms

Authors: Mohamed Noureldin, Jinkoo Kim

Abstract:

The sensitivity of seismic response parameters to the uncertain modeling variables of pile-founded fixed steel jacket platforms is investigated using tornado diagram, first-order second-moment, and static pushover analysis techniques. The effects of both aleatory and epistemic uncertainty on the seismic response parameters have been investigated for an existing offshore platform. The sources of uncertainty considered in the present study are categorized into three groups: the uncertainties associated with the soil-pile modeling parameters in clay soil, the platform jacket structure modeling parameters, and the uncertainties related to ground motion excitations. It has been found that variability in parameters such as yield strength or pile bearing capacity has almost no effect on the seismic response parameters considered, whereas the global structural response is highly affected by the ground motion uncertainty. Also, some uncertainty in soil-pile properties, such as the soil-pile friction capacity, has a significant impact on the response parameters and should be modeled carefully. Based on the results, it is highlighted which uncertain parameters should be considered carefully and which can be assumed with reasonable engineering judgment during the early structural design stage of fixed steel jacket platforms.

Keywords: fixed jacket offshore platform, pile-soil structure interaction, sensitivity analysis

Procedia PDF Downloads 345
12535 Using Machine Learning to Enhance Win Ratio for College Ice Hockey Teams

Authors: Sadixa Sanjel, Ahmed Sadek, Naseef Mansoor, Zelalem Denekew

Abstract:

Collegiate ice hockey (NCAA) sports analytics differs from national-level hockey (NHL) analytics. We apply and compare multiple machine learning models, such as Linear Regression, Random Forest, and Neural Networks, to predict a team's win ratio based on its statistics. Data exploration helps determine which statistics are most useful in increasing the win ratio, which would be beneficial to coaches and team managers. We ran experiments to select the best model and chose Random Forest as the best performing. We conclude with how to bridge the gap between the college and national levels of sports analytics and how machine learning can be used to enhance team performance despite not having many metrics or a budget for automatic tracking.

Keywords: NCAA, NHL, sports analytics, random forest, regression, neural networks, game predictions

Procedia PDF Downloads 81
12534 Dynamic Process Monitoring of an Ammonia Synthesis Fixed-Bed Reactor

Authors: Bothinah Altaf, Gary Montague, Elaine B. Martin

Abstract:

This study involves the modeling and monitoring of an ammonia synthesis fixed-bed reactor using partial least squares (PLS) and its variants. The process exhibits complex dynamic behavior due to the presence of heat recycling and feed quench. One limitation of a static PLS model in this situation is that it does not take account of the process dynamics; hence, dynamic PLS was used. Although it showed superior performance to static PLS in terms of prediction, the monitoring scheme was inappropriate, and hence adaptive PLS was considered. A limitation of adaptive PLS is that non-conforming observations also contribute to the model; therefore, a new adaptive approach was developed, robust adaptive dynamic PLS. This approach updates a dynamic PLS model and is robust to non-representative data. The developed methodology showed a clear improvement over existing approaches in terms of the modeling of the reactor and the detection of faults.
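
An illustrative lag-augmented (dynamic) PLS fit with scikit-learn on synthetic data; the adaptive updating and robust reweighting developed in the study are beyond this sketch.

import numpy as np
import pandas as pd
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame(rng.normal(size=(n, 3)), columns=["temp", "press", "quench"])
# The quality variable responds to the current and the previous sample.
y = 0.8 * X["temp"] + 0.5 * X["press"].shift(1).fillna(0) + 0.1 * rng.normal(size=n)

# Dynamic PLS: augment the predictor matrix with one-step lagged measurements
# so the latent variables can capture the process dynamics.
X_dyn = pd.concat([X, X.shift(1).add_suffix("_lag1")], axis=1).iloc[1:]
y_dyn = y.iloc[1:]

pls = PLSRegression(n_components=3).fit(X_dyn, y_dyn)
print("R^2 on the training window:", round(pls.score(X_dyn, y_dyn), 3))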

Keywords: ammonia synthesis fixed-bed reactor, dynamic partial least squares modeling, recursive partial least squares, robust modeling

Procedia PDF Downloads 365
12533 Optimization of Machine Learning Regression Results: An Application on Health Expenditures

Authors: Songul Cinaroglu

Abstract:

Machine learning regression methods are recommended as an alternative to classical regression methods in the presence of variables that are difficult to model. Health expenditure data are typically non-normal and heavily skewed. This study aims to compare machine learning regression methods, with hyperparameter tuning, for predicting health expenditure per capita. A multiple regression model was constructed, and the performance of Lasso Regression, Random Forest Regression, and Support Vector Machine Regression was recorded when different hyperparameters were assigned. The lambda (λ) value for Lasso Regression, the number of trees for Random Forest Regression, and the epsilon (ε) value for Support Vector Regression were used as the hyperparameters. Study results obtained using k-fold cross-validation, with k varied from 5 to 50, indicate differences between the machine learning regression results in terms of R², RMSE, and MAE values that are statistically significant (p < 0.001). The results reveal that Random Forest Regression (R² ˃ 0.7500, RMSE ≤ 0.6000 and MAE ≤ 0.4000) outperforms the other machine learning regression methods. It is highly advisable to use machine learning regression methods for modelling health expenditures.
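
A compact example of the tuning loop on a synthetic, heavily skewed target (only the Random Forest branch is shown, with the number of trees as the tuned hyperparameter):

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
# Heavily right-skewed target, mimicking per-capita health expenditure.
y = np.exp(1.0 + 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * rng.normal(size=400))

search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300, 500]},   # the tuned hyperparameter
    scoring="neg_root_mean_squared_error",
    cv=KFold(n_splits=10, shuffle=True, random_state=0),
)
search.fit(X, y)
print(search.best_params_, -search.best_score_)     # best setting and its RMSE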

Keywords: machine learning, lasso regression, random forest regression, support vector regression, hyperparameter tuning, health expenditure

Procedia PDF Downloads 189
12532 Daily Variations of Particulate Matter (PM10) in Industrial Sites in a Suburban Area of Sour El Ghozlane, Algeria

Authors: Sidali Khedidji, Riad Ladji, Noureddine Yassaa

Abstract:

In this study, particulate matter (PM10), which is hazardous for the environment and human health, was investigated in the suburban atmosphere of Sour El Ghozlane at a sampling point from March 2013 to April 2013. Ambient concentration measurements of polycyclic aromatic hydrocarbons were carried out in a regional study of the cement industry in Sour El Ghozlane. During sampling, the airborne particulate matter was enriched onto PTFE filters using two medium-volume samplers, with and without a size-selective inlet, for PM10 and TSP, and each sampling period lasted approximately 24 h. The organic compounds were characterized using gas chromatography coupled with mass spectrometric detection (GC-MSD). Total concentrations of PAHs recorded in suburban Sour El Ghozlane ranged from 101 to 204 ng m-3. The gravimetric method was applied to the black smoke concentration data for the spring season. The 24 h average concentrations of PM10 and TSP in the Sour El Ghozlane suburban atmosphere were found in the ranges 4.76–165.76 μg/m3 and 28.63–800.14 μg/m3, respectively, during the sampling period. Meteorological factors such as relative humidity and temperature were found to affect the PM levels, especially PM10. Air temperature did not seem to significantly affect TSP and PM10 mass concentrations. The guide value fixed by the European Community, 40 μg/m3, not to be exceeded on more than 35 days, was exceeded in some samples. However, it should be noted that the limit value fixed by the Algerian regulations, 80 μg/m3, was exceeded in 3 samples during the study period.

Keywords: PAHs, PM10, TSP, particulate matter, cement industry

Procedia PDF Downloads 351
12531 Application of Principle Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During the surveillance operations in war or peace time, the Radar operator gets a scatter of targets over the screen. This may be a tracked vehicle like a tank, e.g., T72, BMP, etc., or it may be a wheeled vehicle like ALS, TATRA, 2.5 Tonne, Shaktiman, or moving army, moving convoys, etc. The Radar operator selects one of the promising targets into Single Target Tracking (STT) mode. Once the target is locked, the operator gets a typical audible signal in his headphones. With reference to the experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), to identify random objects. The process of classification is based on transforming the audible signature of the target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.

Keywords: radar target, fft, principal component analysis, eigenvector, octave-notes, dsp

Procedia PDF Downloads 320