Search results for: receiver operator curve (ROC)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1660

1510 Split Monotone Inclusion and Fixed Point Problems in Real Hilbert Spaces

Authors: Francis O. Nwawuru

Abstract:

The convergence analysis of split monotone inclusion problems and fixed point problems of certain nonlinear mappings is investigated in the setting of real Hilbert spaces. An inertial extrapolation term in the spirit of Polyak is incorporated to speed up the rate of convergence. Under standard assumptions, strong convergence of the proposed algorithm is established without computing the resolvent operator or invoking the Yosida approximation method. The stepsize involved in the algorithm does not depend on the spectral radius of the linear operator. Furthermore, applications of the proposed algorithm to some related optimization problems are also considered. Our result complements and extends numerous results in the literature.
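
As a toy illustration of the fixed-point side of such schemes (not the inertial algorithm proposed in the abstract), a generic Krasnoselskii-Mann averaged iteration can be sketched as follows; the mapping `math.cos` and the starting point are arbitrary choices:

```python
# Illustrative only: a generic Krasnoselskii-Mann fixed-point iteration,
# x_{n+1} = (1 - lam) * x_n + lam * T(x_n), NOT the algorithm of the paper.
import math

def krasnoselskii_mann(T, x0, lam=0.5, tol=1e-12, max_iter=10_000):
    """Averaged fixed-point iteration for a nonexpansive mapping T on R."""
    x = x0
    for _ in range(max_iter):
        x_next = (1 - lam) * x + lam * T(x)
        if abs(x_next - x) < tol:
            break
        x = x_next
    return x_next

# Example: the unique fixed point of cos(x) (the Dottie number, ~0.739085).
p = krasnoselskii_mann(math.cos, x0=1.0)
```

The averaging parameter lam plays the same stabilizing role as the relaxation parameters in Mann-type splitting algorithms.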

Keywords: fixed point, Hilbert space, monotone mapping, resolvent operators

Procedia PDF Downloads 17
1509 Determining G-γ Degradation Curve in Cohesive Soils by Dilatometer and in situ Seismic Tests

Authors: Ivandic Kreso, Spiranec Miljenko, Kavur Boris, Strelec Stjepan

Abstract:

This article discusses the possibility of using dilatometer tests (DMT) together with in situ seismic tests (MASW) in order to obtain the shape of the G-γ degradation curve in cohesive soils (clay, silty clay, silt, clayey silt and sandy silt). The MASW test provides the soil stiffness (G0 from vs) at very small strains, and the DMT provides the stiffness of the soil at 'work strains' (MDMT). At different test locations, the dilatometer shear stiffness of the soil has been determined by the theory of elasticity. The dilatometer shear stiffness has been compared with the theoretical G-γ degradation curve in order to determine the typical range of shear deformation for different types of cohesive soil. The analysis also includes factors that influence the shape of the degradation curve (G-γ) and the dilatometer modulus (MDMT), such as the overconsolidation ratio (OCR), plasticity index (IP) and the vertical effective stress in the soil (σvo'). A parametric study in this article defines the range of shear strain γDMT and the GDMT/G0 relation depending on the classification of a cohesive soil (clay, silty clay, clayey silt, silt and sandy silt), its density (loose, medium dense and dense) and the stiffness of the soil (soft, medium hard and hard). The article illustrates the potential of using MASW and DMT to obtain the G-γ degradation curve in cohesive soils.

Keywords: dilatometer testing, MASW testing, shear wave, soil stiffness, stiffness reduction, shear strain

Procedia PDF Downloads 279
1508 Evaluation of Hepatic Metabolite Changes for Differentiation Between Non-Alcoholic Steatohepatitis and Simple Hepatic Steatosis Using Long Echo-Time Proton Magnetic Resonance Spectroscopy

Authors: Tae-Hoon Kim, Kwon-Ha Yoon, Hong Young Jun, Ki-Jong Kim, Young Hwan Lee, Myeung Su Lee, Keum Ha Choi, Ki Jung Yun, Eun Young Cho, Yong-Yeon Jeong, Chung-Hwan Jun

Abstract:

Purpose: To assess changes in hepatic metabolites for differentiation between non-alcoholic steatohepatitis (NASH) and simple steatosis on proton magnetic resonance spectroscopy (1H-MRS) in both humans and an animal model. Methods: The local institutional review board approved this study, and subjects gave written informed consent. 1H-MRS measurements were performed on a localized voxel of the liver using a point-resolved spectroscopy (PRESS) sequence, and the hepatic metabolites alanine (Ala), lactate/triglyceride (Lac/TG), and TG were analyzed in NASH, simple steatosis, and control groups. Group differences were tested with ANOVA and Tukey's post-hoc tests, and diagnostic accuracy was tested by calculating the area under the receiver operating characteristic (ROC) curve. The associations between metabolite concentrations and pathologic grades or non-alcoholic fatty liver disease (NAFLD) activity scores were assessed by Pearson's correlation. Results: Patients with NASH showed elevated Ala (p < 0.001), Lac/TG (p < 0.001), and TG (p < 0.05) concentrations when compared with patients who had simple steatosis and healthy controls. NASH patients had higher levels of Ala (mean±SEM, 52.5±8.3 vs 2.0±0.9; p < 0.001) and Lac/TG (824.0±168.2 vs 394.1±89.8; p < 0.05) than those with simple steatosis. The area under the ROC curve to distinguish NASH from simple steatosis was 1.00 (95% confidence interval: 1.00, 1.00) with Ala and 0.782 (95% confidence interval: 0.61, 0.96) with Lac/TG. The Ala and Lac/TG levels were well correlated with steatosis grade, lobular inflammation, and NAFLD activity scores. The metabolic changes in humans were reproducible in a mouse model induced by streptozotocin injection and a high-fat diet. Conclusion: 1H-MRS would be useful for differentiating patients with NASH from those with simple hepatic steatosis.
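
AUC values like those reported here can be computed directly from raw scores via the Mann-Whitney formulation: the AUC is the probability that a randomly chosen case scores higher than a randomly chosen control (ties counting one half). The sample scores below are illustrative, not the study's data:

```python
def roc_auc(cases, controls):
    """Area under the ROC curve via the Mann-Whitney statistic:
    fraction of (case, control) pairs where the case scores higher,
    with ties counted as 1/2."""
    wins = 0.0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Perfectly separated groups give AUC = 1.0, as reported for Ala above.
auc = roc_auc(cases=[40.0, 52.0, 61.0], controls=[1.0, 2.2, 3.1])
```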

Keywords: non-alcoholic fatty liver disease, non-alcoholic steatohepatitis, 1H MR spectroscopy, hepatic metabolites

Procedia PDF Downloads 300
1507 Runoff Estimation Using NRCS-CN Method

Authors: E. K. Naseela, B. M. Dodamani, Chaithra Chandran

Abstract:

GIS and remote sensing techniques facilitate accurate estimation of surface runoff from a watershed. In the present study, an attempt has been made to evaluate the applicability of the Natural Resources Conservation Service (NRCS) Curve Number method using GIS and remote sensing techniques in the upper Krishna basin (69,425 sq. km). Landsat 7 satellite data (30 m resolution) for the year 2012 were used to prepare the land use/land cover (LU/LC) map. The hydrologic soil group was mapped on the GIS platform. The weighted curve numbers (CN) for all five subcatchments were calculated on the basis of LU/LC type and hydrologic soil class, taking the antecedent moisture condition into account. Monthly rainfall data were available for 58 rain gauge stations. An overlay technique was adopted to generate the weighted curve number. Results of the study show that land use changes determined from satellite images are useful in studying the runoff response of the basin. There was no significant difference between observed and estimated runoff depths, and for each subcatchment, statistically positive correlations were detected between observed and estimated runoff depth (coefficients above 0.6).

Keywords: curve number, GIS, remote sensing, runoff
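
The runoff-depth computation at the core of the curve number method can be sketched in its standard SI-unit form; the rainfall and CN values in the example are hypothetical:

```python
def scs_runoff_depth(p_mm, cn, ia_ratio=0.2):
    """NRCS (SCS) curve number runoff depth Q (mm) for storm rainfall P (mm).

    S  = 25400/CN - 254   (potential maximum retention, mm)
    Ia = 0.2 * S          (initial abstraction, conventional ratio)
    Q  = (P - Ia)^2 / (P - Ia + S)  if P > Ia, else 0
    """
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical example: 50 mm of rain on a subcatchment with weighted CN = 75.
q = scs_runoff_depth(50.0, 75.0)
```

A weighted CN for a subcatchment is just the area-weighted average of the CN values assigned to each LU/LC-soil overlay polygon.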

Procedia PDF Downloads 514
1506 GIS Application in Surface Runoff Estimation for Upper Klang River Basin, Malaysia

Authors: Suzana Ramli, Wardah Tahir

Abstract:

Estimation of surface runoff depth is a vital part of any rainfall-runoff modeling. It leads to stream flow calculation and later predicts flood occurrences. GIS (Geographic Information System) is an advanced and apposite tool for simulating hydrological models due to its realistic treatment of topography. The paper discusses the calculation of surface runoff depth for two selected events by using GIS with the Curve Number method for the Upper Klang River basin. GIS enables map intersection between soil type and land use, which produces the curve number map. The results show good correlation between simulated and observed values, with R2 above 0.7. Acceptable performance on statistical measures, namely mean error, absolute mean error, RMSE, and bias, is also reported in the paper.

Keywords: surface runoff, geographic information system, curve number method, environment

Procedia PDF Downloads 247
1505 Comparison of Reserve Strength Ratio and Capacity Curve Parameters of Offshore Platforms with Distinct Bracing Arrangements

Authors: Aran Dezhban, Hooshang Dolatshahi Pirooz

Abstract:

The phenomenon of corrosion, especially in the Persian Gulf region, is the main cause of deterioration of offshore platforms, owing to the highly corrosive nature of its waters. Corrosion occurs mostly in the splash zone, threatening the first-level jacket members, legs, and piles in this area. In the current study, the effect of bracing arrangement on the capacity curve and reserve strength ratio of fixed-type offshore platforms is investigated. In order to continue operation of the platform, two states of the structure, robust (intact) and damaged, are considered, while the adequacy of platform capacity is checked against the allowable values of the API RP-2SIM recommended practice. The platform in question is located in the Persian Gulf and is modeled in the OpenSees software. Nonlinear pushover analysis is used in this research. After validation, the capacity curves of the studied platforms are obtained and their reserve strength ratios are calculated. Results are compared with the criteria in API RP-2SIM.

Keywords: fixed-type jacket structure, structural integrity management, nonlinear pushover analysis, robust and damaged structure, reserve strength ratio, capacity curve

Procedia PDF Downloads 74
1504 A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

Due to the rapid growth of modern communication systems, fault tolerance and data security are two important issues in secure transactions. During the transmission of data between the sender and receiver, errors may occur frequently. The sender must then re-transmit the data to the receiver in order to correct these errors, which makes the system fragile. To improve the scalability of the scheme, we present a secure proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the RSA cryptosystem. Authenticated key agreement protocols have an important role in building secure communications networks between two parties.
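
For context, the textbook RSA sign/verify primitive that proxy signature schemes build on can be sketched with tiny illustrative primes (such parameters, and unpadded RSA generally, must never be used in practice):

```python
# Toy textbook RSA signature with the classic demonstration primes 61 and 53.
# Real deployments require large keys plus a secure hash-and-padding scheme.
p, q = 61, 53
n = p * q                   # modulus
phi = (p - 1) * (q - 1)     # Euler totient of n
e = 17                      # public exponent, coprime to phi
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

def sign(m):
    """Sign message representative m (0 <= m < n)."""
    return pow(m, d, n)

def verify(m, s):
    """Check that signature s matches message representative m."""
    return pow(s, e, n) == m

sig = sign(65)
```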

Keywords: proxy signature, fault tolerance, RSA, key agreement protocol

Procedia PDF Downloads 255
1503 Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities have been at the core of mathematical study and are used in almost all branches of mathematics as well as in various areas of science and engineering. The monograph Inequalities by Hardy, Littlewood and Pólya was the first significant work devoted to the subject; it presented fundamental ideas, results and techniques, and it has had much influence on research in various branches of analysis. Since 1934, various inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated via operators: in 1989, weighted Hardy inequalities were obtained for integration operators. Weighted estimates were then obtained for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation. These were improved upon in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy-Steklov operator. Recently, many integral inequalities have been improved by differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have since been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some results have appeared involving Copson and Hardy inequalities on time scales, yielding new special versions of them. A time scale is defined as an arbitrary closed subset of the real numbers. Time-scale versions of these inequalities have received a lot of attention and form a major field in both pure and applied mathematics.
There are many applications of dynamic equations on time scales to quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on double integrals to obtain new time-scale inequalities of Copson type driven by the Steklov operator. They will be applied in the solution of the Cauchy problem for the wave equation. The proofs proceed by imposing restrictions on the operator in several cases and rely on standard tools of the time-scale setting, such as time-scale calculus, Fubini's theorem and Hölder's inequality.
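
For reference, the classical integral Hardy inequality that this line of work generalizes reads:

```latex
\int_0^\infty \left(\frac{1}{x}\int_0^x f(t)\,dt\right)^{p} dx
\;\le\; \left(\frac{p}{p-1}\right)^{p} \int_0^\infty f(x)^{p}\,dx,
\qquad p > 1,\ f \ge 0,
```

where the constant (p/(p-1))^p is best possible. Time-scale versions replace the integrals by delta (or nabla) integrals over a closed subset of the reals.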

Keywords: time scales, inequality of Hardy, inequality of Copson, Steklov operator

Procedia PDF Downloads 43
1502 Applying Sliding Autonomy for a Human-Robot Team on USARSim

Authors: Fang Tang, Jacob Longazo

Abstract:

This paper describes a sliding autonomy approach for coordinating a team of robots that assist the human operator to accomplish tasks while adapting to new or unexpected situations by requesting help from the operator. While sliding autonomy has been well studied in the context of controlling a single robot, much work remains to be done to apply it to a multi-robot team, especially a human-robot team. Our approach builds a hierarchical sliding control structure with components that support human-robot collaboration. We validated our approach in the USARSim simulation and demonstrated that the human-robot team's overall performance can be improved under sliding autonomy control.

Keywords: sliding autonomy, multi-robot team, human-robot collaboration, USARSim

Procedia PDF Downloads 500
1501 Mixed Number Algebra and Its Application

Authors: Md. Shah Alam

Abstract:

Mushfiq Ahmad has defined a Mixed Number, which is the sum of a scalar and a Cartesian vector. He has also defined the elementary group operations of Mixed Numbers, i.e., the norm of Mixed Numbers, the product of two Mixed Numbers, the identity element, and the inverse. It has been observed that the Mixed Number is consistent with Pauli matrix algebra and is a handy tool for working with the Dirac electron theory. Its use as a mathematical method in physics has been studied. (1) We have applied Mixed Numbers in quantum mechanics: Mixed Number versions of the displacement operator, the vector differential operator, and the angular momentum operator have been developed. The Mixed Number method has also been applied to the Klein-Gordon equation. (2) We have applied Mixed Numbers in electrodynamics: Mixed Number versions of Maxwell's equations, the electric and magnetic field quantities, and the Lorentz force have been found. (3) An associative transformation of Mixed Numbers fulfilling the Lorentz invariance requirement has been developed. (4) We have applied Mixed Number algebra as an extension of complex numbers. Mixed Numbers and the quaternions are in isomorphic correspondence, but they differ in algebraic details: the multiplication of unit Mixed Numbers and the multiplication of unit quaternions are different. Since the Mixed Number has properties similar to those of Pauli matrix algebra, Mixed Number algebra is a more convenient tool for dealing with the Dirac equation.

Keywords: mixed number, special relativity, quantum mechanics, electrodynamics, pauli matrix

Procedia PDF Downloads 330
1500 A Hybrid Based Algorithm to Solve the Multi-objective Minimum Spanning Tree Problem

Authors: Boumesbah Asma, Chergui Mohamed El-amine

Abstract:

Since it has been shown that the multi-objective minimum spanning tree problem (MOST) is NP-hard even with two criteria, we propose in this study a hybrid NSGA-II algorithm with an exact mutation operator, used only with low probability, to find an approximation of the Pareto front of the problem. In a connected graph G, a spanning tree T of G is a connected and cycle-free subgraph; if k edges of G\T are added to T, we obtain a partial graph H of G inducing a multi-objective spanning tree problem of reduced size compared to the initial one. With a low probability for the mutation operator, an exact method for solving the reduced MOST problem on the graph H is then used to generate several mutated solutions from a spanning tree T. Then, the selection operator of NSGA-II is activated to obtain the Pareto front approximation. Finally, an adaptation of the VNS metaheuristic is invoked for further improvements to this front; it finds good individuals to balance diversification and intensification during the optimization search process. Experimental comparison studies with an exact method show promising results and indicate that the proposed algorithm is efficient.
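
The non-dominated (Pareto) filtering step underlying NSGA-II-style selection can be sketched as follows, assuming all objectives are minimized; the sample objective vectors are arbitrary:

```python
def dominates(u, v):
    """True if u dominates v under minimization: u is no worse in every
    objective and strictly better in at least one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (3, 3) is dominated by (2, 2); (4, 4) is dominated as well.
front = pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
```

NSGA-II applies this dominance test repeatedly to sort the population into successive fronts before crowding-distance selection.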

Keywords: minimum spanning tree, multiple objective linear optimization, combinatorial optimization, non-sorting genetic algorithm, variable neighborhood search

Procedia PDF Downloads 64
1499 Strap Tension Adjusting Device for Non-Invasive Positive Pressure Ventilation Mask Fitting

Authors: Yoshie Asahara, Hidekuni Takao

Abstract:

Non-invasive positive pressure ventilation (NPPV), a type of ventilation therapy, is a treatment in which a mask is attached to the patient's face and gas is delivered into the mask to support breathing. The NPPV mask uses a strap, which is necessary to attach and secure the mask in the appropriate facial position, but strap tension is adjusted by hand feel: strap uniformity and fine-tuning of strap tension are judged by the skill of the operator and the amount felt by the fingers. In the future, additional strap operation and adjustment methods will be required to reduce the burden on the patient's face. In this study, we fabricated a mechanism that can measure, adjust, and fix the tension of the straps. Small amounts of strap tension can be adjusted by rotating a shaft, which makes it possible to control slight strap tensions that are difficult to grasp through the operator's sense of touch. In addition, this mechanism allows the operator to control the strap while controlling the movement of the mask body. This leads to the establishment of a suitable mask fitting method for each patient. The developed mechanism enables reproducible fine adjustment of strap tension and mask balance, reducing the burden on the face.

Keywords: balance of the mask strap, fine adjustment, film sensor, mask fitting technique, mask strap tension

Procedia PDF Downloads 202
1498 Soil Parameters Identification around PMT Test by Inverse Analysis

Authors: I. Toumi, Y. Abed, A. Bouafia

Abstract:

This paper presents a methodology for identifying cohesive soil parameters that takes into account different constitutive equations. The procedure, applied to identify the parameters of the generalized Prager model associated with the Drucker-Prager failure criterion from a pressuremeter expansion curve, is based on an inverse analysis approach, which consists of minimizing a function representing the difference between the experimental curve and the simulated curve using a simplex algorithm. The model response on the pressuremeter path and its identification from experimental data lead to the determination of the friction angle, the cohesion, and Young's modulus. The effects of some parameters on the simulated curves and on the stress path around the pressuremeter probe are presented. Comparisons between the parameters determined with the proposed method and those obtained by other means are also presented.
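
The inverse-analysis loop can be sketched as a Nelder-Mead simplex search minimizing a least-squares misfit between simulated and "experimental" curves. The linear two-parameter model and synthetic data below are hypothetical stand-ins for the pressuremeter expansion model, chosen only to keep the demonstration small:

```python
def nelder_mead(f, x0, step=0.5, tol=1e-10, max_iter=1000):
    """Minimal Nelder-Mead minimizer (reflection/expansion/contraction/shrink)."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                      # initial simplex around x0
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if f(worst) - f(best) < tol:
            break
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        refl = [2 * c - w for c, w in zip(centroid, worst)]
        if f(refl) < f(best):               # try expanding further
            exp = [3 * c - 2 * w for c, w in zip(centroid, worst)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):      # accept the reflection
            simplex[-1] = refl
        else:                               # contract toward the worst point
            contr = [(c + w) / 2 for c, w in zip(centroid, worst)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                           # shrink the simplex toward the best
                simplex = [best] + [[(bi + vi) / 2 for bi, vi in zip(best, x)]
                                    for x in simplex[1:]]
    return min(simplex, key=f)

xs = [0.0, 1.0, 2.0, 3.0]
observed = [2.0 * x + 1.0 for x in xs]      # synthetic "experimental" curve
misfit = lambda th: sum((th[0] * x + th[1] - o) ** 2 for x, o in zip(xs, observed))
a, b = nelder_mead(misfit, [0.0, 0.0])      # recovers a ~ 2, b ~ 1
```

In the actual procedure, `misfit` would run the finite element simulation of cavity expansion and compare the simulated pressuremeter curve against measurements.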

Keywords: cohesive soils, cavity expansion, pressuremeter test, finite element method, optimization procedure, simplex algorithm

Procedia PDF Downloads 266
1497 Effect of Fuel Injection Discharge Curve and Injection Pressure on Upgrading Power and Combustion Parameters in HD Diesel Engine with CFD Simulation

Authors: Saeed Chamehsara, Seyed Mostafa Mirsalim, Mehdi Tajdari

Abstract:

In this study, the effects of the fuel injection discharge curve and injection pressure, considered simultaneously, on upgrading the power of a heavy-duty diesel engine are discussed through simulation of the combustion process in the AVL-Fire software. The fuel injection discharge curve was changed from semi-triangular to rectangular, as is usual in common rail fuel injection systems. Injection pressure was varied with respect to the amount of injected fuel and the nozzle hole diameter, and was calculated by an empirical equation for heavy-duty diesel engines with common rail fuel injection systems. Power upgrades for 1000 and 2000 bar injection pressure are discussed. For 1000 bar injection pressure with 188 mg of injected fuel and a 3 mm nozzle hole diameter, compared with the first state (semi-triangular discharge curve with 139 mg of injected fuel and a 3 mm nozzle hole diameter), the power upgrade is about 19%, whereas no appreciable change was observed in cylinder pressure; both NOx and soot emissions decreased, by about 30% and 6%, respectively. For 2000 bar injection pressure, where the injected fuel and nozzle diameter are 196 mg and 2.6 mm respectively, the power upgrade is about 22%, cylinder pressure remains essentially unchanged, and NOx and soot emissions decrease by 36% and 20%, respectively.

Keywords: CFD simulation, HD diesel engine, upgrading power, injection pressure, fuel injection discharge curve, combustion process

Procedia PDF Downloads 488
1496 Matrix Valued Difference Equations with Spectral Singularities

Authors: Serifenur Cebesoy, Yelda Aygar, Elgiz Bairamov

Abstract:

In this study, we examine some spectral properties of non-selfadjoint matrix-valued difference equations admitting a polynomial-type Jost solution. The aim is to investigate the eigenvalues and spectral singularities of the difference operator L expressed by the above-mentioned difference equation. First, using the representation of the polynomial-type Jost solution of this equation, we obtain its asymptotics and some analytical properties. Then, using uniqueness theorems for analytic functions, we guarantee that the operator L has a finite number of eigenvalues and spectral singularities.

Keywords: asymptotics, continuous spectrum, difference equations, eigenvalues, jost functions, spectral singularities

Procedia PDF Downloads 421
1495 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography

Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo

Abstract:

Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features that are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images by using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusion effect of fingerprint encryption, we also utilize a chaotic method called the Arnold Cat Map (ACM) for 2D scrambling of pixel locations. Experimental results are reported with various types of efficiency and security analyses. We demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several different aspects, including efficiency, security, and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the Number of Pixel Changing Rate (NPCR) test compared to state-of-the-art performances.
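
The Arnold cat map scrambling step is a standard area-preserving permutation of an N x N pixel grid; a minimal sketch with its exact inverse follows (an illustrative list-of-lists "image", not the paper's implementation):

```python
def arnold_cat(img):
    """One iteration of the Arnold cat map on an N x N image (list of lists):
    pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

def arnold_cat_inverse(img):
    """Exact inverse: the map matrix [[1,1],[1,2]] has determinant 1, with
    inverse [[2,-1],[-1,1]], so (x, y) came from ((2x - y) mod N, (y - x) mod N)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(2 * x - y) % n][(y - x) % n] = img[x][y]
    return out

img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
scrambled = arnold_cat(img)
```

In practice the map is iterated many times; because it is periodic, the iteration count acts as part of the scrambling key.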

Keywords: arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz’s encoding

Procedia PDF Downloads 173
1494 Mitigation of Interference in Satellite Communications Systems via a Cross-Layer Coding Technique

Authors: Mario A. Blanco, Nicholas Burkhardt

Abstract:

An important problem in satellite communication systems which operate in the Ka and EHF frequency bands consists of the overall degradation in link performance of mobile terminals due to various types of degradations in the link/channel, such as fading, blockage of the link to the satellite (especially in urban environments), intentional as well as other types of interference, etc. In this paper, we focus primarily on the interference problem, and we develop a very efficient and cost-effective solution based on the use of fountain codes. We first introduce a satellite communications (SATCOM) terminal uplink interference channel model that is classically used against communication systems that use spread-spectrum waveforms. We then consider the use of fountain codes, with focus on Raptor codes, as our main mitigation technique to combat the degradation in link/receiver performance due to the interference signal. The performance of the receiver is obtained in terms of average probability of bit and message error rate as a function of bit energy-to-noise density ratio, Eb/N0, and other parameters of interest, via a combination of analysis and computer simulations, and we show that the use of fountain codes is extremely effective in overcoming the effects of intentional interference on the performance of the receiver and associated communication links. We then show this technique can be extended to mitigate other types of SATCOM channel degradations, such as those caused by channel fading, shadowing, and hard-blockage of the uplink signal.
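
The fountain-code idea can be sketched with a toy XOR peeling decoder. Real fountain/Raptor codes draw the XOR subsets from a carefully designed degree distribution; the fixed subsets below are chosen only so that peeling visibly succeeds:

```python
# Toy fountain-style erasure code: each received symbol is the XOR of a
# subset of source blocks, and a peeling decoder repeatedly resolves any
# symbol that has exactly one still-unknown block.

def peel_decode(symbols, k):
    """symbols: list of (indices, xor_value) pairs. Returns the k recovered
    source blocks, or None if the peeling process stalls."""
    symbols = [(set(idx), val) for idx, val in symbols]
    known = {}
    progress = True
    while len(known) < k and progress:
        progress = False
        for idx, val in symbols:
            unknown = idx - set(known)
            if len(unknown) == 1:
                acc = val
                for j in idx & set(known):   # strip already-recovered blocks
                    acc ^= known[j]
                known[unknown.pop()] = acc
                progress = True
    return [known[i] for i in range(k)] if len(known) == k else None

source = [0x11, 0x22, 0x33, 0x44]
received = [({0}, 0x11), ({0, 1}, 0x11 ^ 0x22),
            ({1, 2}, 0x22 ^ 0x33), ({2, 3}, 0x33 ^ 0x44)]
decoded = peel_decode(received, 4)
```

The rateless property comes from being able to generate as many such symbols as the channel (including interference-induced erasures) requires.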

Keywords: SATCOM, interference mitigation, fountain codes, turbo codes, cross-layer

Procedia PDF Downloads 328
1493 The Comparative Study of the Characteristics of Chinese and Foreign Excellent Women's Singles Players' Serve and Receive Tactics

Authors: Zhai Yuan, Wu Xueqing

Abstract:

This article analyzes the techniques used by elite Chinese and foreign players in competition, including serve types and service areas, receiving techniques, and the effectiveness and utilization rates of points won and lost on the receive. The sample comprises videos of world-class women's singles badminton matches, covering 43 matches of Chinese players and 38 matches of foreign players. Conclusions: On the serve, both Chinese and foreign singles players rely primarily on the forehand short-low serve and the long-high serve, and there are significant differences between Chinese and foreign players in the use of the forehand short-low serve and the drive serve. Service placement is concentrated in areas 1, 5 and 6; area 6 has the highest rate of all service areas, followed by areas 1 and 5, and there are significant differences between Chinese and foreign players in the second service area. On the receive, players returning frontcourt shuttles favor the net lift and push, while for backcourt shuttles the most common reply is the smash, followed by the clear and the drop shot; foreign players smash from the backcourt at a higher rate than Chinese players. Regarding receiving outcomes, Chinese players are more often in active or neutral situations than foreign players, while the opposite holds for passive receiving.

Keywords: badminton, woman’s singles, technique and tactics, comparative analysis

Procedia PDF Downloads 502
1492 The Stage and Cause of Regional Industrial Specialization Evolution in China

Authors: Cheng Wen, Zhang Jianhua

Abstract:

This paper aims to probe the general rules of industrial specialization or diversification in a region during its process of economic growth, and the specific reasons for the differing development of industrial specialization among the eastern, central and western regions of China. It is found that changes in regional industrial specialization in China, as in most countries of the world, present a U-shaped curve: the regional industrial structure is diversified at first, and when per capita income exceeds a certain level, the distribution of economic resources in the region becomes concentrated again. From the perspective of rising total factor productivity and falling transaction costs in the process of economic development, this paper proposes a theoretical model to explain the U-shaped curve. Through an empirical test on China's provincial panel data, this paper explains the factors that cause the unequal development of industrial specialization in the eastern, central and western regions of China.

Keywords: u-shaped curve, regional industrial specialization, technological progress, transaction costs

Procedia PDF Downloads 282
1491 Optimized and Secured Digital Watermarking Using Fuzzy Entropy, Bezier Curve and Visual Cryptography

Authors: R. Rama Kishore, Sunesh

Abstract:

Recent developments in the use of the internet for different purposes create a great threat to the copyright protection of digital images. Digital watermarking can be used to address this problem. This paper presents a detailed review of different watermarking techniques and of the latest trends in secure, robust, and imperceptible watermarking. It also discusses the optimization techniques used in watermarking to improve the robustness and imperceptibility of a method, and the measures used to evaluate the performance of a watermarking algorithm. Finally, this paper proposes a watermarking algorithm using (2, 2)-share visual cryptography and a Bezier curve based algorithm to improve the security of the watermark. The proposed method uses a fractional transformation to improve the robustness of the copyright protection, and the algorithm is optimized using fuzzy entropy for better results.
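
A (2, 2) sharing of a watermark bit string can be sketched with the XOR variant of visual cryptography. (Classic visual cryptography uses pixel expansion and OR-stacking of transparencies; this simplification keeps the core security property that either share alone is uniformly random):

```python
import secrets

def make_shares(secret_bits):
    """(2, 2) XOR secret sharing of a watermark bit string: share1 is
    uniformly random, share2 = secret XOR share1."""
    share1 = [secrets.randbits(1) for _ in secret_bits]
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]
    return share1, share2

def combine(share1, share2):
    """Recover the secret by XOR-ing the two shares bit by bit."""
    return [a ^ b for a, b in zip(share1, share2)]

wm = [1, 0, 1, 1, 0, 0, 1]   # hypothetical watermark bits
s1, s2 = make_shares(wm)
```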

Keywords: digital watermarking, fractional transform, visual cryptography, Bezier curve, fuzzy entropy

Procedia PDF Downloads 335
1490 1D Klein-Gordon Equation in an Infinite Square Well with PT Symmetry Boundary Conditions

Authors: Suleiman Bashir Adamu, Lawan Sani Taura

Abstract:

We study the role of boundary conditions via PT-symmetric quantum mechanics, where P denotes the parity operator and T denotes the time-reversal operator. Using the one-dimensional Schrödinger Hamiltonian for a free particle in an infinite square well, we introduce PT-symmetric boundary conditions. We then find solutions of the 1D Klein-Gordon equation for a free particle in an infinite square well with Hermitian boundary conditions and with PT-symmetric boundary conditions; in both cases the energy eigenvalues and eigenfunctions, respectively, are obtained.
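
For reference, the free 1D Klein-Gordon equation in natural units, whose square-well solutions the abstract discusses, reads:

```latex
\left(\frac{\partial^{2}}{\partial t^{2}}
      - \frac{\partial^{2}}{\partial x^{2}} + m^{2}\right)\phi(x,t) = 0
```

With Dirichlet (Hermitian) conditions \phi(0,t) = \phi(L,t) = 0 on a well of width L, the stationary solutions \phi \propto e^{-iEt}\sin(n\pi x/L) give E_n = \pm\sqrt{(n\pi/L)^{2} + m^{2}}; the PT-symmetric boundary conditions modify this spectrum.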

Keywords: eigenvalues, eigenfunctions, Hamiltonian, Klein-Gordon equation, PT-symmetric quantum mechanics

Procedia PDF Downloads 349
1489 Phillips Curve Estimation in an Emerging Economy: Evidence from Sub-National Data of Indonesia

Authors: Harry Aginta

Abstract:

Using the Phillips curve framework, this paper seeks new empirical evidence on the relationship between inflation and output in a major emerging economy. By exploiting sub-national data, the contribution of this paper is threefold. First, it resolves the issue of using on-target national inflation rates, which potentially weakens the inflation-output nexus. This is very relevant for Indonesia, as its central bank has been adopting an inflation targeting framework based on national consumer price index (CPI) inflation. Second, the study tests the relevance of the mining sector in output gap estimation. The test for the mining sector is important to control for the effects of mining regulation and the nominal effects of coal prices on real economic activity. Third, the paper applies panel econometric methods incorporating regional variation, which helps to improve the model estimation. The results confirm the strong presence of a Phillips curve in Indonesia. A positive output gap, reflecting excess demand conditions, raises the inflation rate. In addition, the elasticity of the output gap is higher if the mining sector is excluded from output gap estimation. Besides inflation adaptation, the dynamics of the exchange rate and international commodity prices are also found to affect inflation significantly. The results are robust to alternative measurements of the output gap.
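
The inflation-output gap slope at the heart of a Phillips curve estimate can be illustrated with a closed-form simple OLS fit (a stand-in for the paper's panel methods; the data below are hypothetical and exactly linear for clarity):

```python
def ols_slope_intercept(x, y):
    """Closed-form simple OLS fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical data: inflation responds positively to the output gap.
gap = [-2.0, -1.0, 0.0, 1.0, 2.0]
inflation = [2.0, 2.5, 3.0, 3.5, 4.0]   # constructed as 3.0 + 0.5 * gap
a, b = ols_slope_intercept(gap, inflation)
```

In the panel setting, region fixed effects, lagged inflation, exchange rate, and commodity price terms would be added to this regression.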

Keywords: Phillips curve, inflation, Indonesia, panel data

Procedia PDF Downloads 96
1488 Cotton Transplantation as a Practice to Escape Infection with Some Soil-Borne Pathogens

Authors: E. M. H. Maggie, M. N. A. Nazmey, M. A. Abdel-Sattar, S. A. Saied

Abstract:

A successful trial of transplanting cotton is reported. Seeds were grown in trays for 4-5 weeks in an easily prepared supporting medium, such as peat moss or similar plant waste. Seedlings were then carefully transplanted into the permanent field with their root systems kept as intact as possible. The practice reduced the incidence of damping-off and allowed full winter crop revenues. Further work is needed to evaluate parameters such as the growth curve, flowering curve, and yield on an economic basis.

Keywords: cotton, transplanting cotton, damping-off diseases, environmental sciences

Procedia PDF Downloads 335
1487 Corneal Confocal Microscopy As a Surrogate Marker of Neuronal Pathology In Schizophrenia

Authors: Peter W. Woodruff, Georgios Ponirakis, Reem Ibrahim, Amani Ahmed, Hoda Gad, Ioannis N. Petropoulos, Adnan Khan, Ahmed Elsotouhy, Surjith Vattoth, Mahmoud K. M. Alshawwaf, Mohamed Adil Shah Khoodoruth, Marwan Ramadan, Anjushri Bhagat, James Currie, Ziyad Mahfoud, Hanadi Al Hamad, Ahmed Own, Peter Haddad, Majid Alabdulla, Rayaz A. Malik

Abstract:

Introduction: We aimed to test the hypothesis that, using corneal confocal microscopy (a non-invasive method for assessing corneal nerve fibre integrity), patients with schizophrenia would show neuronal abnormalities compared with healthy participants. Schizophrenia is a neurodevelopmental and progressive neurodegenerative disease for which there are no validated biomarkers. Corneal confocal microscopy (CCM) is a non-invasive ophthalmic imaging biomarker that can be used to detect neuronal abnormalities in neuropsychiatric syndromes. Methods: Patients with schizophrenia (DSM-V criteria) without other causes of peripheral neuropathy, and healthy controls, underwent CCM, vibration perception threshold (VPT) and sudomotor function testing. The diagnostic accuracy of CCM in distinguishing patients from controls was assessed using the area under the curve (AUC) of the receiver operating characteristic (ROC) curve. Findings: Participants with schizophrenia (n=17) and controls (n=38) of comparable age (35.7±8.5 vs 35.6±12.2, P=0.96) were recruited. Patients with schizophrenia had significantly higher body weight (93.9±25.5 vs 77.1±10.1, P=0.02) and lower low-density lipoproteins (2.6±1.0 vs 3.4±0.7, P=0.02), while systolic and diastolic blood pressure, HbA1c, total cholesterol, triglycerides and high-density lipoproteins were comparable with controls. Patients with schizophrenia had significantly lower corneal nerve fiber density (CNFD, fibers/mm2) (23.5±7.8 vs 35.6±6.5, p<0.0001), branch density (CNBD, branches/mm2) (34.4±26.9 vs 98.1±30.6, p<0.0001), and fiber length (CNFL, mm/mm2) (14.3±4.7 vs 24.2±3.9, p<0.0001), but no difference in VPT (6.1±3.1 vs 4.5±2.8, p=0.12) or electrochemical skin conductance (61.0±24.0 vs 68.9±12.3, p=0.23) compared with controls. The diagnostic accuracy (AUC, 95% CI) of CNFD, CNBD and CNFL in distinguishing patients with schizophrenia from healthy controls was 87.0% (76.8-98.2), 93.2% (84.2-102.3) and 93.2% (84.4-102.1), respectively. Conclusion: CCM can be used to help identify neuronal changes and has a high diagnostic accuracy in distinguishing subjects with schizophrenia from healthy controls.
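The ROC AUC reported in such studies has a simple probabilistic reading: for a marker that is lower in disease (such as CNFD here), it is the probability that a randomly chosen patient scores below a randomly chosen control. A minimal sketch with invented illustrative values, not the study's data:

```python
# Minimal AUC sketch via the Mann-Whitney U statistic:
# AUC = P(patient score < control score) for a marker reduced in disease.
# The values below are hypothetical, chosen only to illustrate the idea.
def auc_lower_in_disease(patients, controls):
    wins = 0.0
    for p in patients:
        for c in controls:
            if p < c:
                wins += 1.0
            elif p == c:
                wins += 0.5          # ties count as half
    return wins / (len(patients) * len(controls))

# Hypothetical CNFD-like values (fibers/mm2); the disease group is lower.
patients = [18, 22, 25, 27, 30]
controls = [29, 33, 36, 38, 41]
print(auc_lower_in_disease(patients, controls))   # → 0.96
```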

Keywords:

Procedia PDF Downloads 240
1486 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis

Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho

Abstract:

This paper compares fuzzy-machine learning algorithms, such as Support Vector Machine (SVM) and K-Nearest Neighbor (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The Interval Type-2 Fuzzy Logic (IT2FL) algorithm, Min-Max normalization, and Principal Component Analysis (PCA) are used to derive feature labels, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used to train the aforementioned machine learning models. K-fold cross-validation (with K=10) is used to evaluate the performance of the models using the metrics ROC (Receiver Operating Characteristic) curve, specificity, and sensitivity. The models are also tested on 20% of the dataset. The validation result shows that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
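The KNN classifier compared above can be sketched in a few lines: classify a query point by a majority vote of its k nearest training points. The two features stand in for the dataset's principal components (PC1, PC2); the points and labels below are invented for illustration.

```python
import math
from collections import Counter

# Minimal k-nearest-neighbour sketch. Training points and labels are
# hypothetical stand-ins for the (PC1, PC2) features described above.
def knn_predict(train, query, k=3):
    # train: list of ((x, y), label); vote among the k closest points
    # under Euclidean distance.
    neighbours = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [((0.1, 0.2), "no_fire"), ((0.0, 0.4), "no_fire"), ((0.3, 0.1), "no_fire"),
         ((2.1, 1.9), "fire"), ((2.4, 2.2), "fire"), ((1.8, 2.0), "fire")]
print(knn_predict(train, (2.0, 2.0)))   # → fire
print(knn_predict(train, (0.2, 0.3)))   # → no_fire
```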

Keywords: Machine Learning Algorithms, Interval Type-2 Fuzzy Logic, Fire Outbreak, Support Vector Machine, K-Nearest Neighbour, Principal Component Analysis

Procedia PDF Downloads 139
1485 Proximal Method of Solving Split System of Minimization Problem

Authors: Anteneh Getachew Gebrie, Rabian Wangkeeree

Abstract:

The purpose of this paper is to introduce an iterative algorithm for solving a split system of minimization problems, given as the task of finding a common minimizer point of a finite family of proper, lower semicontinuous convex functions whose image under a bounded linear operator is also a common minimizer point of another finite family of proper, lower semicontinuous convex functions. We obtain strong convergence of the sequence generated by our algorithm under some suitable conditions on the parameters. The iterative schemes are developed with a way of selecting the step sizes such that information about the operator norm is not necessary. Some applications and numerical experiments are given to analyse the efficiency of our algorithm.
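For reference, the proximal operator and the Moreau-Yosida approximate invoked in the keywords are standardly defined, for a proper lower semicontinuous convex f on a Hilbert space H and λ > 0, as:

```latex
\operatorname{prox}_{\lambda f}(x) \;=\; \arg\min_{u \in H}\left\{ f(u) + \frac{1}{2\lambda}\,\|u - x\|^2 \right\},
\qquad
f_{\lambda}(x) \;=\; \min_{u \in H}\left\{ f(u) + \frac{1}{2\lambda}\,\|u - x\|^2 \right\}.
```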

Keywords: Hilbert Space, minimization problems, Moreau-Yosida approximate, split feasibility problem

Procedia PDF Downloads 108
1484 Development and Validation of a Coronary Heart Disease Risk Score in Indian Type 2 Diabetes Mellitus Patients

Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad

Abstract:

Diabetes in India is growing at an alarming rate, and the complications it causes need to be controlled. Coronary heart disease (CHD) is the complication whose prediction is discussed in this study. India has the second largest number of diabetes patients in the world, yet to the best of our knowledge there is no CHD risk score for Indian type 2 diabetes patients. Any form of CHD was taken as the event of interest. A sample of 750 was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patient data such as sex, age, height, weight, body mass index (BMI), blood sugar fasting (BSF), postprandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high density lipoprotein (HDL), low density lipoprotein (LDL), very low density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity and history of CHD. Predictive risk scores of CHD events are designed by Cox proportional hazards regression. Model calibration and discrimination are assessed by the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso and elastic net regression. Youden's index is used to choose the optimal cut-off point from the scores. The five-year probability of CHD is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The CHD risk scores developed can be calculated by doctors and patients for self-control of diabetes. Furthermore, the five-year probabilities can be used to forecast and maintain the condition of patients.
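Youden's index, used above to pick the optimal cut-off, maximizes sensitivity + specificity − 1 over candidate thresholds. A small sketch with hypothetical risk scores and event labels:

```python
# Youden's J sketch: choose the risk-score cut-off that maximizes
# sensitivity + specificity - 1. Scores and labels below are hypothetical.
def best_cutoff(scores, labels):
    # labels: 1 = CHD event, 0 = no event; predict "event" when score >= cutoff.
    best_j, best_c = -1.0, None
    pos = sum(labels)
    neg = len(labels) - pos
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        j = tp / pos + tn / neg - 1.0   # sensitivity + specificity - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

scores = [0.1, 0.2, 0.3, 0.35, 0.6, 0.7, 0.8, 0.9]
labels = [0,   0,   0,   1,    0,   1,   1,   1]
cutoff, j = best_cutoff(scores, labels)
print(cutoff, round(j, 2))   # → 0.35 0.75
```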

Keywords: coronary heart disease, Cox proportional hazards regression, ROC curve, type 2 diabetes mellitus

Procedia PDF Downloads 189
1483 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

Because the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, and relatively slow execution, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data was proposed. For each curve (the basic element of linear vector data), the deletion costs of all its middle nodes were calculated, and the minimum deletion cost was compared with a pre-defined threshold. If the former was greater than or equal to the latter, all remaining nodes were retained and the curve's compression was finished. Otherwise, the node with the minimal deletion cost was deleted, the deletion costs of its two neighbours were updated, and the same loop was repeated on the compressed curve until termination. Through several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The results showed that DCA outperformed DPA in both compression accuracy and execution efficiency.
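The deletion loop described above can be sketched as follows. The cost of a middle node is assumed here to be the area of the triangle it forms with its two neighbours (one plausible choice; the paper's actual cost function may differ), and for clarity all costs are recomputed each pass rather than incrementally updating only the two neighbours.

```python
# Sketch of a deletion-cost compression loop for a polyline. The cost of a
# middle node is ASSUMED to be the triangle area with its two neighbours;
# the paper's cost function may differ.
def tri_area(a, b, c):
    # Area of triangle abc via the cross product.
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def compress(curve, threshold):
    pts = list(curve)
    while len(pts) > 2:
        # Deletion cost of every middle node on the current curve.
        costs = [tri_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        lowest = min(costs)
        if lowest >= threshold:          # all remaining nodes are significant
            break
        pts.pop(costs.index(lowest) + 1)  # drop the cheapest middle node
    return pts

curve = [(0, 0), (1, 0.05), (2, 0), (3, 1), (4, 0)]
print(compress(curve, 0.1))   # → [(0, 0), (2, 0), (3, 1), (4, 0)]
```

The nearly collinear node (1, 0.05) is removed, while the sharp corner at (3, 1) survives because its deletion cost exceeds the threshold.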

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 209
1482 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge

Authors: M. F. Yilmaz, B. Ö. Çağlayan

Abstract:

The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and nonstructural components, and it is also used to characterize the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive fragility curves, intensity measures (IMs) and engineering demand parameters (EDPs) must be determined, along with the relation between them. In this study, a typical simply supported steel girder riveted railway bridge is studied. Fragility curves of this bridge are derived using a two-parameter lognormal distribution. Time history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed: PGA, Sa(0.2s) and Sa(1s), the most commonly used IMs for fragility curves in the literature, are considered in terms of efficiency, practicality and sufficiency.
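The two-parameter lognormal fragility form mentioned above is conventionally P(damage | IM) = Φ(ln(IM/θ)/β), with median capacity θ and log-standard deviation β. A sketch with assumed parameter values (the study's fitted values are not given in the abstract):

```python
import math

# Two-parameter lognormal fragility curve:
#   P(damage | IM) = Phi(ln(IM / theta) / beta)
# where theta is the median capacity and beta the dispersion.
# The parameter values below are assumptions for illustration.
def fragility(im, theta, beta):
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

theta, beta = 0.4, 0.6          # e.g. an assumed median PGA capacity of 0.4 g
print(round(fragility(0.4, theta, beta), 2))   # → 0.5 (probability is 50% at the median)
for im in (0.1, 0.2, 0.8):
    print(round(fragility(im, theta, beta), 3))
```

By construction the curve passes through 0.5 at IM = θ and increases monotonically with IM.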

Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures

Procedia PDF Downloads 333
1481 Combining an Optimized Closed Principal Curve-Based Method and Evolutionary Neural Network for Ultrasound Prostate Segmentation

Authors: Tao Peng, Jing Zhao, Yanqing Xu, Jing Cai

Abstract:

Due to missing or ambiguous boundaries between the prostate and neighboring structures, the presence of shadow artifacts, and the large variability in prostate shapes, ultrasound prostate segmentation is challenging. To handle these issues, this paper develops a hybrid method for ultrasound prostate segmentation by combining an optimized closed principal curve-based method with an evolutionary neural network: the former can fit curves of large curvature and generates a contour composed of line segments connected by sorted vertices, while the latter expresses an appropriate map function (represented by the parameters of the evolutionary neural network) for generating a smooth prostate contour that matches the ground truth contour. Both qualitative and quantitative experimental results showed that the proposed method achieves accurate and robust performance.

Keywords: ultrasound prostate segmentation, optimized closed polygonal segment method, evolutionary neural network, smooth mathematical model, principal curve

Procedia PDF Downloads 167