Search results for: Random Kernel Density
5294 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method
Authors: Luh Eka Suryani, Purhadi
Abstract:
Poisson regression is a non-linear regression model for count-data responses that follow a Poisson distribution. A pair of count variables showing high correlation can be analyzed with Bivariate Poisson Regression. The numbers of infant deaths and maternal deaths are count data that can be analyzed in this way. Poisson regression assumes equidispersion, i.e., that the mean and variance are equal. Actual count data, however, often have a variance greater or smaller than the mean (overdispersion and underdispersion, respectively). Violations of this assumption can be overcome by applying Generalized Poisson Regression. Because the characteristics of each regency can affect the number of cases that occur, a spatial analysis called geographically weighted regression is also applied. This study analyzes the numbers of infant and maternal deaths in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling with adaptive bisquare kernel weighting produces 3 regency groups based on infant mortality rate and 5 regency groups based on maternal mortality rate. The variables that significantly influence the numbers of infant and maternal deaths are the percentages of pregnant women who visit health workers at least 4 times during pregnancy, pregnant women who receive Fe3 tablets, obstetric complications handled, clean and healthy household behavior, and women first married under the age of 18.
Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion
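The equidispersion check that motivates the Generalized Poisson model can be sketched in a few lines of Python. The counts below are synthetic stand-ins (negative binomial draws, which are overdispersed by construction), not the East Java data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical district-level counts: negative binomial draws are
# overdispersed (variance > mean), violating the Poisson assumption.
infant_deaths = rng.negative_binomial(n=5, p=0.3, size=38)  # 38 illustrative regencies

mean = infant_deaths.mean()
var = infant_deaths.var(ddof=1)
dispersion = var / mean  # equals 1 under equidispersion

print(f"mean={mean:.2f}, variance={var:.2f}, dispersion ratio={dispersion:.2f}")
# A ratio well above 1 signals overdispersion, motivating the
# Generalized Poisson model used in the paper.
```

A dispersion ratio far from 1 in either direction is the practical trigger for replacing plain Poisson regression with its generalized form.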
Procedia PDF Downloads 159
5293 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models using classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of each model with metrics such as accuracy, ensuring that we select the most reliable option. Additionally, we conduct a thorough data analysis to reveal the importance of the different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
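A minimal scikit-learn sketch of the model comparison described above, using synthetic features as a stand-in for the heart-disease data (the dataset, tuning, and the 96.04% figure are not reproduced here):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary-classification data; 13 features mimics a typical
# heart-disease attribute count, but the data are illustrative only.
X, y = make_classification(n_samples=500, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}
results = {}
for name, model in models.items():
    results[name] = accuracy_score(y_te, model.fit(X_tr, y_tr).predict(X_te))
    print(f"{name}: {results[name]:.3f}")
```

Each classifier is trained and scored on the same split, which is the essential shape of the comparison the abstract reports.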
Procedia PDF Downloads 40
5292 Performance Comparison of Cooperative Banks in the EU, USA and Canada
Authors: Matěj Kuc
Abstract:
This paper compares different profitability measures of cooperative banks from two developed regions: the European Union, and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period, performed a series of tests, and ran random-effects estimations on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE), while there is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks have accommodated better to the current market environment.
Keywords: cooperative banking, panel data, profitability measures, random effects
Procedia PDF Downloads 113
5291 Combined Influence of Charge Carrier Density and Temperature on Open-Circuit Voltage in Bulk Heterojunction Organic Solar Cells
Authors: Douglas Yeboah, Monishka Narayan, Jai Singh
Abstract:
One of the key parameters determining the power conversion efficiency (PCE) of organic solar cells (OSCs) is the open-circuit voltage; however, it is still not well understood. In order to examine the performance of OSCs, it is necessary to understand the losses associated with the open-circuit voltage and how best it can be improved. Here, an analytical expression for the open-circuit voltage of bulk heterojunction (BHJ) OSCs is derived from the charge carrier densities without considering the drift-diffusion current. The open-circuit voltage thus obtained depends on the donor-acceptor band gap, the energy difference between the highest occupied molecular orbital (HOMO) and the hole quasi-Fermi level of the donor material, the temperature, the carrier (electron) density, the generation rate of free charge carriers, and the bimolecular recombination coefficient. It is found that the open-circuit voltage increases as the carrier density increases and as the temperature decreases. The calculated results are discussed in view of experimental results and agree with them reasonably well. Overall, this work proposes an alternative pathway for improving the open-circuit voltage in BHJ OSCs.
Keywords: charge carrier density, open-circuit voltage, organic solar cells, temperature
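The two trends reported above can be illustrated with a commonly used open-circuit-voltage relation. This is a generic textbook-style formula, not the paper's exact derivation, and the effective gap and density-of-states values below are assumed for illustration:

```python
import numpy as np

k = 1.380649e-23     # Boltzmann constant, J/K
q = 1.602176634e-19  # elementary charge, C

def v_oc(E_da_eV, n, Nc=1e26, T=300.0):
    """Sketch of a common BHJ relation (NOT the paper's expression):
        Voc = E_DA/q - (kT/q) * ln(Nc^2 / n^2)
    with E_DA the effective donor-acceptor gap (eV), n the (assumed equal)
    electron/hole density (m^-3) and Nc the effective density of states.
    """
    return E_da_eV - (k * T / q) * np.log(Nc**2 / n**2)

# Voc increases with carrier density and decreases with temperature,
# matching the trends reported in the abstract.
print(v_oc(1.2, n=1e21, T=300))  # baseline
print(v_oc(1.2, n=1e22, T=300))  # higher n -> higher Voc
print(v_oc(1.2, n=1e21, T=250))  # lower T -> higher Voc
```

The logarithmic loss term shrinks as the carrier density approaches the effective density of states, and its prefactor kT/q shrinks with temperature, which is exactly the behaviour the abstract describes.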
Procedia PDF Downloads 373
5290 Fabrication of 2D Nanostructured Hybrid Material-Based Devices for High-Performance Supercapacitor Energy Storage
Authors: Sunil Kumar, Vinay Kumar, Mamta Bulla, Rita Dahiya
Abstract:
Supercapacitors have emerged as a leading energy storage technology, gaining popularity in applications like digital telecommunications, memory backup, and hybrid electric vehicles. Their appeal lies in a long cycle life, high power density, and rapid recharge capabilities. These exceptional traits attract researchers aiming to develop advanced, cost-effective, and high-energy-density electrode materials for next-generation energy storage solutions. Two-dimensional (2D) nanostructures are highly attractive for fabricating nanodevices due to their high surface-to-volume ratio and good compatibility with device design. In the current study, a composite was synthesized by combining MoS2 with reduced graphene oxide (rGO) under optimal conditions and characterized using various techniques, including XRD, FTIR, SEM and XPS. The electrochemical properties of the composite material were assessed through cyclic voltammetry, galvanostatic charging-discharging and electrochemical impedance spectroscopy. The supercapacitor device demonstrated a specific capacitance of 153 F g-1 at a current density of 1 A g-1, achieving an excellent energy density of 30.5 Wh kg-1 and a power density of 600 W kg-1. Additionally, it maintained excellent cyclic stability over 5000 cycles, establishing it as a promising candidate for efficient and durable energy storage solutions. These findings highlight the dynamic relationship between electrode materials and offer valuable insights for the development and enhancement of high-performance symmetric devices.
Keywords: 2D material, energy density, galvanostatic charge-discharge, hydrothermal reactor, specific capacitance
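The reported figures can be cross-checked with the standard galvanostatic discharge relations. The ~1.2 V voltage window used below is our inference from the numbers, not a value stated in the abstract:

```python
# Back-of-envelope check of the reported supercapacitor figures:
#   E [Wh/kg] = C * V^2 / (2 * 3.6)   (C in F/g; dividing J/g by 3.6 gives Wh/kg)
#   t [s]     = C * V / I             (galvanostatic discharge time)
#   P [W/kg]  = E * 3600 / t
C = 153.0    # specific capacitance, F/g (reported)
V = 1.2      # assumed voltage window, V (inferred, not stated)
I = 1.0      # current density, A/g (reported)

E = C * V**2 / (2 * 3.6)   # energy density, Wh/kg
t = C * V / I              # discharge time, s
P = E * 3600 / t           # power density, W/kg

print(f"E = {E:.1f} Wh/kg, P = {P:.0f} W/kg")
# Close to the reported 30.5 Wh/kg and 600 W/kg, so the three reported
# numbers are mutually consistent for a ~1.2 V window.
```

This kind of consistency check is useful when only a subset of (C, V, I, E, P) is reported.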
Procedia PDF Downloads 14
5289 Simultaneous Determination of Six Characterizing/Quality Parameters of Biodiesels via 1H NMR and Multivariate Calibration
Authors: Gustavo G. Shimamoto, Matthieu Tubino
Abstract:
The characterization and the quality of biodiesel samples are checked by determining several parameters. Considering the large number of analyses to be performed, as well as the disadvantages of using toxic solvents and generating waste, multivariate calibration is suggested to reduce the number of tests. In this work, hydrogen nuclear magnetic resonance (1H NMR) spectra were used to build multivariate models, from partial least squares (PLS) regression, in order to determine simultaneously six important characterizing and/or quality parameters of biodiesels: density at 20 ºC, kinematic viscosity at 40 ºC, iodine value, acid number, oxidative stability, and water content. Biodiesels from twelve different oil sources were used in this study: babassu, brown flaxseed, canola, corn, cottonseed, macauba almond, microalgae, palm kernel, residual frying, sesame, soybean, and sunflower. 1H NMR reflects the structures of the compounds present in biodiesel samples and showed suitable correlations with the six parameters. The PLS models were constructed with between 5 and 7 latent variables, and the obtained values of r(cal) and r(val) were greater than 0.994 and 0.989, respectively. In addition, the models were considered suitable to predict all six parameters for external samples, especially taking into account the speed of the analysis. Thus, the alliance between 1H NMR and PLS proved appropriate for characterizing and evaluating the quality of biodiesels, significantly reducing analysis time, reagent/solvent consumption, and waste generation. Therefore, the proposed methods can be considered to adhere to the principles of green chemistry.
Keywords: biodiesel, multivariate calibration, nuclear magnetic resonance, quality parameters
Procedia PDF Downloads 539
5288 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study researches the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
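The text-feature half of such a pipeline can be illustrated with TF-IDF features feeding a Random Forest. The report snippets and urgency labels below are invented; the Indiana University reports and the 99.35% figure are not reproduced:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

# Toy radiology-style snippets (invented) with toy urgency labels.
reports = [
    "no acute cardiopulmonary abnormality",
    "clear lungs, normal heart size",
    "large right pleural effusion with consolidation",
    "cardiomegaly with pulmonary edema",
    "no focal consolidation or effusion",
    "extensive bilateral infiltrates, urgent follow-up",
]
urgent = [0, 0, 1, 1, 0, 1]

# TF-IDF turns each report into a sparse feature vector; the Random
# Forest then classifies urgency from those text features.
clf = make_pipeline(TfidfVectorizer(), RandomForestClassifier(random_state=0))
clf.fit(reports, urgent)
print(clf.predict(["normal heart and lungs", "massive pleural effusion"]))
```

In the study, these text features would be concatenated with image-derived features before classification; here only the textual path is shown.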
Procedia PDF Downloads 43
5287 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data
Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone
Abstract:
The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms can support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore their ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. The estimated total lesion load (ml) and the number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR images. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance or the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with the number of components set to 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language.
The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM to obtain a ranking of the most predictive variables. We then built two new classifiers on only the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best network for discriminating between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum, and working memory networks. These findings, consistent with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine
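The two feature-selection routes described above can be sketched in Python on synthetic data shaped like the study's matrix (37 subjects × 15 network signals, with feature 0 playing the role of the discriminant network). The study used R and an RBF-SVM; scikit-learn's RFE needs linear coefficients, so a linear kernel is substituted here:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(37, 15))                               # 37 subjects, 15 networks
y = (X[:, 0] + 0.3 * rng.normal(size=37) > 0).astype(int)   # label driven by feature 0

# Route 1: Random Forest's intrinsic (Gini-based) importances.
rf = RandomForestClassifier(random_state=0).fit(X, y)
rf_rank = np.argsort(rf.feature_importances_)[::-1]

# Route 2: recursive feature elimination around a linear SVM.
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X, y)
svm_top = int(np.argmax(rfe.support_))

print("RF top feature:", rf_rank[0], "| RFE-SVM top feature:", svm_top)
```

When both rankings agree on the same top feature, as they did on the sensori-motor I network in the study, that convergence is itself evidence that the feature is genuinely discriminant rather than an artefact of one algorithm.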
Procedia PDF Downloads 240
5286 A Sequential Approach for Random-Effects Meta-Analysis
Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya
Abstract:
The objective of meta-analysis is to combine results from several independent studies in order to create generalizations and provide an evidence base for decision making. Recent studies show, however, that the magnitude of the effect size estimates reported in many areas of research changes with publication year, and this can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring the effect size estimates in meta-analysis, but they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, whose estimation creates complications. This paper proposes the use of the Gombay and Serbian (2005) truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring under the REM. Simulation results show that the test does not control the Type I error well and is therefore not recommended. Further work is required to derive an appropriate test for this important area of application.
Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes
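The general idea of CUSUM-style monitoring of effect sizes can be illustrated with a generic standardized cumulative sum. This is only an illustration of the monitoring concept, not the truncated test from the paper, whose critical values are asymptotic and model-specific:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical effect-size estimates as studies accrue over time:
# a stable early period, then a drift toward smaller effects.
effects = np.concatenate([rng.normal(0.5, 0.1, 15),
                          rng.normal(0.2, 0.1, 15)])

# Standardize, then track the scaled absolute cumulative sum; under a
# constant mean this statistic stays small, while a mean shift makes
# it peak near the change point.
z = (effects - effects.mean()) / effects.std(ddof=1)
cusum = np.abs(np.cumsum(z)) / np.sqrt(len(z))

print(f"max standardized CUSUM: {cusum.max():.2f}")
```

A large maximum of this statistic is the kind of signal that, with properly derived critical values, would flag a temporal change in the pooled effect.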
Procedia PDF Downloads 467
5285 Contribution to the Study of the Rill Density Effects on Soil Erosion: Laboratory Experiments
Authors: L. Mouzai, M. Bouhadef
Abstract:
Rills begin to form once the shear capacity of overland flow overcomes the resistance of the soil surface. This resistance depends on soil texture, the arrangement of soil particles, and chemical and physical properties. Rill density can affect soil erosion, especially when the distance between rills (the interrill area) contributes to the variation of the rill characteristics and, consequently, to the sediment concentration. To investigate this point, an agricultural sandy soil, a 0.2 x 1 x 3 m³ soil tray, and a rectangular piece of hardwood used to shape the rills formed the basis of this work. The results show that small channels developed between the rills and that the flow accelerated in comparison to the flow on the flat (interrill) surface. Sediment concentration increased with increasing rill number (density).
Keywords: artificial rainfall, experiments, rills, soil erosion, transport capacity
Procedia PDF Downloads 164
5284 Development of High Temperature Mo-Si-B Based In-situ Composites
Authors: Erhan Ayas, Buse Katipoğlu, Eda Metin, Rifat Yılmaz
Abstract:
The search has begun for new materials that can be used at temperatures even higher than the service temperature (~1150ᵒC) of the nickel-based superalloys currently in use. This search must also meet the increasing demands for improved energy efficiency. Materials studied for aerospace applications are expected to have good oxidation resistance. Mo-Si-B alloys, which have higher operating temperatures than nickel-based superalloys, are candidates for the ultra-high-temperature materials used in gas turbines and jet engines, because the Moss and Mo₅SiB₂ (T2) phases exhibit a high melting temperature, excellent high-temperature creep strength, and good oxidation resistance. However, their low fracture toughness at room temperature is a disadvantage, although this property can be improved through optimum control of the Moss phase and the microstructure. High density is also a problem for structural parts: in turbine rotors, for example, a higher weight means a higher centrifugal force, which reduces the creep life of the material. The density of nickel-based superalloys, and of the T2 phase of Mo-Si-B alloys, is in the range of 8.6-9.2 g/cm³, but the Moss phase (density 10.2 g/cm³) lies above the density of nickel-based superalloys. With some ceramic-based additions, this value can be brought closer to optimum levels.
Keywords: molybdenum, composites, in-situ, mmc
Procedia PDF Downloads 66
5283 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation
Authors: Constantin Z. Leshan
Abstract:
Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics, and they allow the teleportation of matter. All massive bodies emit a flux of holes which curves the spacetime; if we increase the concentration of holes, it leads to length contraction and time dilation, because the holes do not have the properties of extension and duration. In the limiting case when space consists of holes only, the distance between every two points is equal to zero and time stops - outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties only. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle and so on. Particles do not have trajectories because spacetime is discontinuous and has impassable microscopic 'walls'; simple mechanical motion is impossible over small distances, since it is impossible to 'trace' a straight line in a discontinuous spacetime that contains impassable holes. Spacetime 'boils' continually due to the appearance of vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Since a body disappears in one volume and reappears in another random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation).
It is shown that Hole Teleportation does not violate causality or special relativity, due to its random nature and other properties. Although Hole Teleportation has a random nature, it can be used for the colonization of extrasolar planets with the help of a method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship will appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from a vessel without permitting another body to occupy its volume.
Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps
Procedia PDF Downloads 425
5282 3D Electrode Carrier and its Implications on Retinal Implants
Authors: Diego Luján Villarreal
Abstract:
Retinal prosthetic devices aim to restore some vision in visually impaired patients by electrically stimulating neural cells in the visual system. In this study, the 3D linear electrode carrier is presented. A simulation framework was developed by placing the 3D carrier 1 mm away from the fovea center, at the location of highest cell density. Cell stimulation is verified in COMSOL Multiphysics with a 3D computational model that includes the relevant retinal interface elements and the dynamics of the voltage-gated ionic channels. Using small-sized electrodes, the current distribution resulting from low threshold amplitudes is confined to a small volume, equivalent to the volume occupied by individual cells at the highest-density location. The delicate retinal tissue is thereby protected from excessive charge density.
Keywords: retinal prosthetic devices, visual devices, retinal implants, visual prosthetic devices
Procedia PDF Downloads 112
5281 Hydrothermal Synthesis of Carbon Sphere/Nickel Cobalt Sulfide Core/Shell Microstructure and Its Electrochemical Performance
Authors: Charmaine Lamiel, Van Hoa Nguyen, Marjorie Baynosa, Jae-Jin Shim
Abstract:
Electrochemical supercapacitors have attracted considerable attention because of their high potential as efficient energy storage systems. Combinations of carbon-based materials and transition metal oxides/sulfides are studied because they have long, improved cycle lives as well as high energy and power densities. In this study, a hierarchical mesoporous carbon sphere/nickel cobalt sulfide (CS/Ni-Co-S) core/shell structure was synthesized using a facile hydrothermal method without any further sulfurization or post-heat treatment. The CS/Ni-Co-S core/shell microstructures exhibited a high capacitance of 724 F g−1 at 2 A g−1 in a 6 M KOH electrolyte. After 2000 charge-discharge cycles, the material retained 86.1% of its original capacitance, with a high Coulombic efficiency of 97.9%. The electrode exhibited a high energy density of 58.0 Wh kg−1 at a power density of 1440 W kg−1, and a high power density of 7200 W kg−1 at an energy density of 34.2 Wh kg−1. The successful synthesis is simple and cost-effective, which supports the viability of this composite as an alternative active material for high-performance supercapacitors.
Keywords: carbon sphere, electrochemical, hydrothermal, nickel cobalt sulfide, supercapacitor
Procedia PDF Downloads 302
5280 Application of Random Forest Model in The Prediction of River Water Quality
Authors: Turuganti Venkateswarlu, Jagadeesh Anmala
Abstract:
Excessive runoff from various non-point source land uses and other point sources is rapidly contaminating the water quality of streams in the Upper Green River watershed, Kentucky, USA. It is essential to maintain the stream water quality, as the river basin is one of the major freshwater sources in this province. It is also important to understand the water quality parameters (WQPs) quantitatively and qualitatively, along with their important features, as stream water is sensitive to climatic events and land-use practices. In this paper, a model was developed for predicting one of the significant WQPs, Fecal Coliform (FC), from precipitation, temperature, urban land use factor (ULUF), agricultural land use factor (ALUF), and forest land use factor (FLUF) using the Random Forest (RF) algorithm. The RF model, an ensemble learning algorithm, can also extract feature-importance characteristics from the given model inputs for different combinations. The model's outcomes showed a good correlation between FC and the climate events and land use factors (R² = 0.94), with precipitation and temperature being the primary influencing factors for FC.
Keywords: water quality, land use factors, random forest, fecal coliform
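The setup described above can be sketched with scikit-learn's Random Forest regressor. The inputs mimic the five named predictors, but the data are synthetic (constructed so precipitation and temperature dominate, echoing the paper's finding), not the Upper Green River observations:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 300
precip = rng.gamma(2, 10, n)     # synthetic precipitation
temp = rng.normal(15, 8, n)      # synthetic temperature
uluf, aluf, fluf = rng.random(n), rng.random(n), rng.random(n)  # land-use factors

# Make fecal coliform depend mostly on precipitation and temperature.
fc = 50 * precip + 20 * temp + 5 * uluf + rng.normal(0, 30, n)

X = np.column_stack([precip, temp, uluf, aluf, fluf])
rf = RandomForestRegressor(random_state=0).fit(X, fc)
for name, imp in zip(["precip", "temp", "ULUF", "ALUF", "FLUF"],
                     rf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

The `feature_importances_` attribute is what yields the kind of ranking the paper uses to conclude that precipitation and temperature are the primary drivers of FC.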
Procedia PDF Downloads 197
5279 Enhancement of Mechanical and Biological Properties in Wollastonite Bioceramics by MgSiO3 Addition
Authors: Jae Hong Kim, Sang Cheol Um, Jong Kook Lee
Abstract:
Strong and biocompatible wollastonite (CaSiO3) was fabricated by pressureless sintering in the temperature range of 1250-1300 ℃, with an addition of MgSiO3 to induce the phase transition from α- to β-wollastonite. A starting pure α-wollastonite powder was prepared by solid-state reaction, and MgSiO3 powder was added to it to induce the α-to-β phase transition above 1250 ℃. Wollastonite samples sintered at 1250 ℃ with 5 and 10 wt% MgSiO3 were α+β phase and β phase, respectively, and showed a higher densification rate than pure α- or β-wollastonite, almost reaching the theoretical density. The hardness and Young's modulus of the sintered wollastonite depended on the apparent density and the amount of β-wollastonite. The Young's modulus (78 GPa) of β-wollastonite with 10 wt% MgSiO3 added was almost twice that of sintered α-wollastonite. In-vitro tests showed that biphasic (α+β) wollastonite with 5 wt% MgSiO3 addition had good bioactivity in simulated body fluid solution.
Keywords: β-wollastonite, high density, MgSiO3, phase transition
Procedia PDF Downloads 581
5278 Tsunami Wave Height and Flow Velocity Calculations Based on Density Measurements of Boulders: Case Studies from Anegada and Pakarang Cape
Authors: Zakiul Fuady, Michaela Spiske
Abstract:
Inundation events such as storms and tsunamis can leave onshore sedimentary evidence like sand deposits or large boulders. These deposits store indirect information on the related inundation parameters (e.g., flow velocity, flow depth, wave height). One tool to reveal these parameters is inverse models that use the physical characteristics of the deposits to infer the magnitude of inundation. This study used boulders from the 2004 Indian Ocean Tsunami in Thailand (Pakarang Cape) and from a historical tsunami event that inundated the outer British Virgin Islands (Anegada). For the largest boulder found at Pakarang Cape, with a volume of 26.48 m³, the required tsunami wave height is 0.44 m and the storm wave height is 1.75 m (for a bulk density of 1.74 g/cm³). At Pakarang Cape, the highest tsunami wave height is 0.45 m and the storm wave height is 1.8 m, required for transporting a 20.07 m³ boulder. On Anegada, the largest boulder, with a diameter of 2.7 m, is a single coral head (species Diploria sp.) with a bulk density of 1.61 g/cm³, and requires a minimum tsunami wave height of 0.31 m and a storm wave height of 1.25 m. The highest required tsunami wave height on Anegada is 2.12 m for a boulder with a bulk density of 2.46 g/cm³ (volume 0.0819 m³), and the highest storm wave height is 5.48 m (volume 0.216 m³) for the same bulk density; the coral type is limestone. Generally, the higher the bulk density, volume, and weight of the boulders, the higher the minimum tsunami and storm wave heights required to initiate transport. Transporting the largest boulder at Pakarang Cape requires a flow velocity of 4.05 m/s by Nott's equation (2003) and 3.57 m/s by that of Nandasena et al. (2011), whereas on Anegada a flow velocity of 3.41 m/s is required to transport the 2.7 m boulder by both equations. Thus, boulder equations need to be handled with caution: first, they make many assumptions and simplifications; second, the physical boulder parameters, such as density and volume, need to be determined carefully to minimize errors.
Keywords: tsunami wave height, storm wave height, flow velocity, boulders, Anegada, Pakarang Cape
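A quick consistency check on the numbers above: in Nott-type transport equations the minimum storm wave height works out to roughly four times the minimum tsunami wave height for the same boulder, and the reported pairs obey that ratio closely:

```python
# (minimum tsunami wave height Ht, minimum storm wave height Hs) in metres,
# taken from the values reported in the abstract.
pairs = [
    (0.44, 1.75),   # largest Pakarang Cape boulder (26.48 m^3)
    (0.45, 1.80),   # 20.07 m^3 Pakarang Cape boulder
    (0.31, 1.25),   # Anegada Diploria sp. coral head (2.7 m diameter)
]
for ht, hs in pairs:
    print(f"Hs/Ht = {hs / ht:.2f}")  # ~4 in every case
```

The near-constant factor of four is built into the wave-height formulations rather than being an empirical result, which is one reason the abstract warns that these equations must be handled with caution.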
Procedia PDF Downloads 237
5277 Secure Watermarking not at the Cost of Low Robustness
Authors: Jian Cao
Abstract:
This paper describes a novel watermarking technique which we call random direction embedding (RDE) watermarking. Unlike traditional watermarking techniques, the watermark energy after RDE embedding is not concentrated along a fixed direction, making the scheme secure against the traditional unauthorized watermark-removal attack. In addition, the experimental results show that, compared with an existing secure watermarking scheme, namely natural watermarking (NW), RDE watermarking gains a significant improvement in robustness. In fact, the security of RDE watermarking does not come at the cost of low robustness; it can even be more robust than traditional spread spectrum watermarking, which has been shown to be very insecure.
Keywords: robustness, spread spectrum watermarking, watermarking security, random direction embedding (RDE)
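The core contrast can be sketched numerically: spread spectrum embeds along a fixed secret carrier, while an RDE-style scheme draws a fresh random direction per embedding, so watermark energy never accumulates along one direction an attacker could estimate. This is an illustration of the concept only, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
host = rng.normal(size=256)   # host signal (e.g. transform coefficients)
alpha = 0.5                   # embedding strength

# Spread spectrum: the SAME unit carrier is reused for every content.
carrier = rng.normal(size=256)
carrier /= np.linalg.norm(carrier)
ss_marked = host + alpha * carrier

# RDE-style: a FRESH random unit direction is drawn for this embedding.
direction = rng.normal(size=256)
direction /= np.linalg.norm(direction)
rde_marked = host + alpha * direction

# Both introduce identical distortion (norm alpha), but averaging many
# RDE-marked signals reveals no common watermark direction to remove.
print(np.linalg.norm(ss_marked - host), np.linalg.norm(rde_marked - host))
```

Equal embedding distortion with no fixed energy direction is exactly the property the abstract claims yields security without sacrificing robustness.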
Procedia PDF Downloads 383
5276 A Study of Non Linear Partial Differential Equation with Random Initial Condition
Authors: Ayaz Ahmad
Abstract:
In this work, we present the effect of noise on the solution of a partial differential equation (PDE) in three different settings. We first consider random initial conditions for two nonlinear dispersive PDEs, the nonlinear Schrödinger equation and the Korteweg-de Vries equation, and analyse their effect on some special solutions, the soliton solutions. The second case considers a linear partial differential equation, the wave equation, where random initial conditions allow one to substantially decrease the computational and data storage costs of an algorithm for solving the inverse problem based on boundary measurements of the solution. Finally, the third example is the linear transport equation with a singular drift term, where we show that the addition of a multiplicative noise term prevents the blow-up of solutions under a very weak hypothesis, whereas in the deterministic case the solution blows up in finite time. Here we consider the problem of wave propagation, which is modelled by a nonlinear dispersive equation with a noisy initial condition; as observed, noise can also be introduced directly into the equations.
Keywords: drift term, finite time blow up, inverse problem, soliton solution
Procedia PDF Downloads 215
5275 Calibration of Hybrid Model and Arbitrage-Free Implied Volatility Surface
Authors: Kun Huang
Abstract:
This paper investigates whether the combination of local and stochastic volatility models can be calibrated exactly to any arbitrage-free implied volatility surface of European options. The risk-neutral Brownian bridge density is applied to calibrate the leverage function of our hybrid model. Furthermore, the tails of the marginal risk-neutral density are generated by a Generalized Extreme Value distribution in order to capture the properties of asset returns. The local volatility is generated from the arbitrage-free implied volatility surface using the stochastic volatility inspired (SVI) parameterization.
Keywords: arbitrage free implied volatility, calibration, extreme value distribution, hybrid model, local volatility, risk-neutral density, stochastic volatility
Procedia PDF Downloads 267
5274 An Innovative High Energy Density Power Pack for Portable and Off-Grid Power Applications
Authors: Idit Avrahami, Alex Schechter, Lev Zakhvatkin
Abstract:
This research focuses on developing a compact and light Hydrogen Generator (HG), coupled with fuel cells (FC), to provide a High-Energy-Density Power-Pack (HEDPP) solution with roughly 10 times the energy density of Li-Ion batteries. The HEDPP is designed for portable and off-grid power applications such as drones, UAVs, stationary off-grid power sources, unmanned marine vehicles, and more. Hydrogen is carried in the safest way, as a chemical powder at room temperature and ambient pressure, and is activated only when the power is on. Hydrogen generation is based on a stabilized chemical reaction of Sodium Borohydride (SBH) and water. The proposed solution enables a 'No Storage' hydrogen-based power pack: hydrogen is produced and consumed on the spot, during operation; therefore, there is no need for high-pressure hydrogen tanks, which are large, heavy, and unsafe. In addition to its high energy density, ease of use, and safety, the presented power pack has the significant advantage of versatility and deployment in numerous applications and scales. This patented HG was demonstrated using several prototypes in our lab and was proved feasible and highly efficient for several applications. For example, in applications where water is available (such as marine vehicles, water and sewage infrastructure, and stationary applications), the energy density of the suggested power pack may reach 2700-3000 Wh/kg, again more than 10 times higher than conventional lithium-ion batteries. In other applications (e.g., UAVs or small vehicles) the energy density may exceed 1000 Wh/kg.
Keywords: hydrogen energy, sodium borohydride, fixed-wing UAV, energy pack
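As a rough plausibility check on the quoted energy densities, the stoichiometry of SBH hydrolysis can be worked through in a few lines. The fuel-cell efficiency and the heating value used below are assumptions for illustration, not figures from the paper.

```python
# Back-of-envelope energy density of the SBH route, assuming the hydrolysis
# reaction NaBH4 + 2 H2O -> NaBO2 + 4 H2 and an assumed 50% fuel-cell
# conversion efficiency (illustrative, not from the paper).
M_NaBH4 = 22.99 + 10.81 + 4 * 1.008   # g/mol
M_H2O = 2 * 1.008 + 16.00             # g/mol
M_H2 = 2 * 1.008                      # g/mol

h2_per_kg_sbh = 4 * M_H2 / M_NaBH4    # kg H2 per kg NaBH4 (~0.213)

LHV_H2 = 33.3    # kWh per kg H2 (lower heating value)
fc_eff = 0.50    # assumed fuel-cell conversion efficiency

# Water supplied externally (e.g. marine applications): count SBH mass only.
wh_per_kg_water_free = 1000 * h2_per_kg_sbh * LHV_H2 * fc_eff

# Water carried on board: count SBH plus stoichiometric water.
h2_per_kg_total = 4 * M_H2 / (M_NaBH4 + 2 * M_H2O)
wh_per_kg_self = 1000 * h2_per_kg_total * LHV_H2 * fc_eff

print(round(wh_per_kg_water_free))  # ~3500 Wh/kg (water-free basis)
print(round(wh_per_kg_self))        # ~1800 Wh/kg (water carried)
```

Under these assumed efficiencies the water-free figure lands in the same range as the 2700-3000 Wh/kg quoted for applications where water is available.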
Procedia PDF Downloads 82
5273 Efficient Signcryption Scheme with Provable Security for Smart Card
Authors: Jayaprakash Kar, Daniyal M. Alghazzawi
Abstract:
The article proposes a novel construction of a signcryption scheme with provable security which is well suited to implementation on smart cards. It is secure in the random oracle model, and its security relies on the Decisional Bilinear Diffie-Hellman Problem. The proposed scheme is secure against adaptive chosen ciphertext attack (indistinguishability) and adaptive chosen message attack (unforgeability). It is also inspired by zero-knowledge proofs. The two most important security goals for smart cards are confidentiality and authenticity. These functions are performed in one logical step at low computational cost.
Keywords: random oracle, provable security, unforgeability, smart card
Procedia PDF Downloads 593
5272 Detection Characteristics of the Random and Deterministic Signals in Antenna Arrays
Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev
Abstract:
In this paper, an approach to incoherent signal detection in multi-element antenna arrays is researched and modeled. Two types of useful signals with unknown wavefronts were considered: the first is deterministic (a Barker code), the second is random (Gaussian distributed). The derivation of the sufficient statistic took into account the linearity of the antenna array. The performance characteristics and detection curves are modeled and compared for different useful-signal parameters and for different numbers of antenna array elements. Under some additional conditions, the results of this research can be applied to digital communications systems.
Keywords: antenna array, detection curves, performance characteristics, quadrature processing, signal detection
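A minimal Monte Carlo sketch of incoherent detection of a Gaussian random signal across an array is shown below; the parameters are illustrative, and the paper's exact sufficient statistic and signal models (Barker code, unknown wavefront) are not reproduced.

```python
import numpy as np

# Monte Carlo sketch: incoherent (energy) detection of a Gaussian random
# signal in white Gaussian noise across an N-element array.
rng = np.random.default_rng(1)

N = 8          # array elements
trials = 20000
snr = 1.0      # per-element signal-to-noise power ratio (assumed)

noise_only = rng.standard_normal((trials, N))
signal_plus = rng.standard_normal((trials, N)) * np.sqrt(1.0 + snr)

# Energy statistic: sum of squared element outputs.
T0 = (noise_only ** 2).sum(axis=1)
T1 = (signal_plus ** 2).sum(axis=1)

# One point on the detection curve: fix P_fa via an empirical threshold.
thr = np.quantile(T0, 0.99)  # P_fa ~ 0.01
p_d = (T1 > thr).mean()
print(p_d)
```

Sweeping `thr` over the range of `T0` traces out the full detection (ROC) curve, and repeating the experiment for different `N` shows how performance grows with the number of array elements.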
Procedia PDF Downloads 405
5271 Modeling Thermionic Emission from Carbon Nanotubes with Modified Richardson-Dushman Equation
Authors: Olukunle C. Olawole, Dilip Kumar De
Abstract:
We have modified the Richardson-Dushman equation, considering the thermal expansion of the lattice and the change of chemical potential with temperature in the material. The corresponding modified Richardson-Dushman (MRDE) equation fits the experimental data of thermoelectronic current density (J) vs. T from carbon nanotubes quite well. It provides a unique technique for the accurate determination of the work function W0, the Fermi energy EF0 at 0 K, and the linear thermal expansion coefficient of carbon nanotubes, in good agreement with experiment. From the value of EF0 we obtain the charge carrier density, in excellent agreement with experiment. We describe the application of the equations to the evaluation of the performance of a concentrated solar thermionic energy converter (STEC) with an emitter made of carbon nanotubes for future applications.
Keywords: carbon nanotube, modified Richardson-Dushman equation, fermi energy at 0 K, charge carrier density
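The idea of a temperature-dependent correction to the Richardson-Dushman law can be sketched as follows. The linear form of W(T), the work function value, and the coefficient alpha are assumptions chosen for illustration; they are not the MRDE derivation itself.

```python
import math

A_RD = 1.20173e6   # Richardson constant, A m^-2 K^-2
K_B = 8.617333e-5  # Boltzmann constant, eV/K

def j_richardson(T, W):
    """Standard Richardson-Dushman current density (A/m^2) for work
    function W (eV) at temperature T (K): J = A * T^2 * exp(-W / (k_B T))."""
    return A_RD * T ** 2 * math.exp(-W / (K_B * T))

def j_modified(T, W0, alpha):
    """Illustrative temperature-dependent correction with an effective
    work function W(T) = W0 + alpha * (T - 300). Only a sketch of the
    idea; the actual MRDE form is not reproduced here."""
    return j_richardson(T, W0 + alpha * (T - 300.0))

W0 = 4.5  # eV, a typical order of magnitude for carbon materials (assumed)
print(j_richardson(1500.0, W0))
print(j_modified(1500.0, W0, 2e-4))
```

Because the current depends exponentially on W/(k_B T), even a small temperature dependence of the effective work function shifts J substantially, which is why fitting J vs. T with such a correction constrains W0 and the expansion coefficient tightly.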
Procedia PDF Downloads 378
5270 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation
Authors: Sopheak Sorn, Kwok Yip Szeto
Abstract:
Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem that is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction, with each cell containing exactly one point of the point pattern, and compute the reliability of the Voronoi diagram's dual, i.e. the Delaunay graph. We further investigate the communicability of the Delaunay network. We find a positive correlation between the homogeneity of a Delaunay network's degree distribution and its reliability, and a negative correlation with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network with the optimum geometric mean between communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi
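The Delaunay-graph construction and a communicability measure can be sketched as follows, using scipy's Delaunay triangulation and the matrix-exponential (Estrada-Hatano) communicability; the reliability computation and the edge-flip moves from the paper are not shown.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.linalg import expm

# Build the Delaunay graph of a random 2-D point pattern and compute its
# total communicability, sum_ij [exp(A)]_ij.
rng = np.random.default_rng(2)
pts = rng.random((30, 2))        # 30 random points in the unit square

tri = Delaunay(pts)
n = len(pts)
A = np.zeros((n, n))
for simplex in tri.simplices:    # each triangle contributes 3 edges
    for i in range(3):
        a, b = simplex[i], simplex[(i + 1) % 3]
        A[a, b] = A[b, a] = 1.0

degrees = A.sum(axis=1)
communicability = expm(A).sum()  # total network communicability

print(int(A.sum()) // 2)         # number of Delaunay edges
print(degrees.mean())            # average degree (< 6 for planar graphs)
```

An edge flip then replaces the shared diagonal of two adjacent triangles with the opposite diagonal, which preserves N and L while changing the degree distribution, and hence both communicability and reliability.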
Procedia PDF Downloads 419
5269 Multimodal Optimization of Density-Based Clustering Using Collective Animal Behavior Algorithm
Authors: Kristian Bautista, Ruben A. Idoy
Abstract:
A bio-inspired metaheuristic algorithm based on the theory of collective animal behavior (CAB) was integrated into density-based clustering modeled as a multimodal optimization problem. The algorithm was tested on the synthetic, Iris, Glass, Pima, and Thyroid data sets in order to measure its effectiveness relative to the CDE-based clustering algorithm. Upon preliminary testing, it was found that one of the parameter settings used was ineffective in performing clustering when applied to the algorithm, prompting the researchers to investigate. It was revealed that fine-tuning the distance δ3, which determines the extent to which a given data point will be clustered, helped improve the quality of the cluster output. Even though the modification of the distance δ3 significantly improved the solution quality and cluster output of the algorithm, results suggest that there is no difference between the population means of the solutions obtained using the original and modified parameter settings for all data sets. This implies that using either the original or the modified parameter setting will not have any effect on obtaining the best global and local animal positions. Results also suggest that the CDE-based clustering algorithm is better than the CAB-density clustering algorithm for all data sets. Nevertheless, the CAB-density clustering algorithm is still a good clustering algorithm, because it correctly identified the number of classes of some data sets more frequently in a thirty-trial run with a much smaller standard deviation, showing potential for clustering high-dimensional data sets. Thus, the researchers recommend further investigation of the post-processing stage of the algorithm.
Keywords: clustering, metaheuristics, collective animal behavior algorithm, density-based clustering, multimodal optimization
Procedia PDF Downloads 230
5268 Experimental Study on Improving the Engineering Properties of Sand Dunes Using Random Fibers-Geogrid Reinforcement
Authors: Adel M. Belal, Sameh Abu El-Soud, Mariam Farid
Abstract:
This study presents the effect of reinforcement inclusions (fibers and geogrids) on the bearing capacity of fine sand under strip footings. Experimental model tests were carried out using rectangular plates (10 cm x 38 cm, 7.5 cm x 38 cm, and 12.5 cm x 38 cm) with geogrids and randomly distributed reinforcing fibers. The width and depth of the geogrid were varied to determine their effects on the engineering properties of treated, poorly graded fine sand. Laboratory model test results for the ultimate stresses and the settlement of a rigid strip foundation supported by single- and multi-layered fiber-geogrid-reinforced sand are presented. The number of geogrid layers was varied between 1 and 4. The effects of the depth of the first geogrid reinforcement, the spacing between reinforcements, and their length on the bearing capacity were investigated in the experimental program. Results show that the use of flexible random fibers with a content of 0.125% by weight of the treated sand dunes, with 3 geogrid reinforcement layers, u/B = 0.25, and L/B = 7.5, gives a significant increase in the bearing capacity of the proposed system.
Keywords: earth reinforcement, geogrid, random fiber, reinforced soil
Procedia PDF Downloads 312
5267 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second
Authors: P. V. Pramila , V. Mahesh
Abstract:
Pulmonary function tests are important non-invasive diagnostic tests to assess respiratory impairments and provide quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, markedly related to the age of patients, resulting in incomplete data sets. This paper presents a nonlinear model built using multivariate adaptive regression splines (MARS) and a random forest regression model to predict the missing spirometric features. Random-forest-based feature selection is used to enhance both the generalization capability and the interpretability of the model. In the present study, flow-volume data were recorded for N = 198 subjects. The ranked order of the feature importance index calculated by the random forest model shows that the spirometric features FVC, FEF25, PEF, FEF25-75, FEF50, and the demographic parameter height are the important descriptors. A comparison of the performance of both models proves that the prediction ability of MARS with the top two ranked features, namely FVC and FEF25, is higher, yielding a model fit of R2 = 0.96 and R2 = 0.99 for normal and abnormal subjects. Root mean square error analysis of the RF model and the MARS model also shows that the latter is capable of predicting the missing values of FEV1 with a notably lower error value of 0.0191 (normal subjects) and 0.0106 (abnormal subjects). It is concluded that combining feature selection with a prediction model provides a minimum subset of predominant features to train the model, yielding better prediction performance. This analysis can assist clinicians with an intelligent decision support system for medical diagnosis and the improvement of clinical care.
Keywords: FEV1, multivariate adaptive regression splines, pulmonary function test, random forest
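Random-forest feature ranking of the kind described can be sketched on synthetic data as follows. The feature names follow the abstract, but the data-generating relationship below is invented purely to make the example runnable; it is not the paper's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch of random-forest-based feature ranking for spirometry prediction,
# on synthetic data (the linear relationship below is an assumption).
rng = np.random.default_rng(3)
names = ["FVC", "FEF25", "PEF", "FEF25-75", "FEF50", "height"]

X = rng.random((198, len(names)))  # N = 198 subjects, as in the abstract
fev1 = 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * rng.standard_normal(198)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, fev1)
ranking = sorted(zip(rf.feature_importances_, names), reverse=True)
for imp, name in ranking:
    print(f"{name}: {imp:.3f}")
```

Keeping only the top-ranked features and refitting a regression model on that subset is the feature-selection-plus-prediction pattern the abstract advocates.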
Procedia PDF Downloads 310
5266 Second Order Statistics of Dynamic Response of Structures Using Gamma Distributed Damping Parameters
Authors: Badreddine Chemali, Boualem Tiliouine
Abstract:
This article presents the main results of a numerical investigation of the uncertainty of the dynamic response of structures with statistically correlated, Gamma-distributed random damping. A computational method based on a Linear Statistical Model (LSM) is implemented to predict second-order statistics for the response of a typical industrial building structure. The significance of random damping with correlated parameters and its implications for the sensitivity of the structural peak response in the neighborhood of a resonant frequency are discussed in light of considerable ranges of damping uncertainties and correlation coefficients. The results are compared to those generated using Monte Carlo simulation techniques. The numerical results obtained show the importance of damping uncertainty and the statistical correlation of damping coefficients when obtaining accurate probabilistic estimates of the dynamic response of structures. Furthermore, the effectiveness of the LSM model in efficiently predicting uncertainty propagation for structural dynamic problems with correlated damping parameters is demonstrated.
Keywords: correlated random damping, linear statistical model, Monte Carlo simulation, uncertainty of dynamic response
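The effect of random damping on the peak response near resonance can be illustrated with a small Monte Carlo sketch: for a single-degree-of-freedom oscillator at resonance, the dynamic amplification factor is approximately 1/(2ζ). The Gamma shape and scale below are assumed values giving a 5% mean damping ratio; they are not taken from the paper.

```python
import numpy as np

# Monte Carlo sketch of response uncertainty with Gamma-distributed damping.
rng = np.random.default_rng(4)

shape, scale = 4.0, 0.0125  # assumed: mean zeta = shape * scale = 0.05
zeta = rng.gamma(shape, scale, size=100_000)

# SDOF amplification at resonance: |H| ~ 1 / (2 * zeta).
amplification = 1.0 / (2.0 * zeta)

print(zeta.mean())           # ~0.05
print(amplification.mean())  # exceeds 1/(2*0.05) = 10, by Jensen's inequality
print(amplification.std())
```

The mean amplification exceeds the amplification at the mean damping because 1/(2ζ) is convex in ζ, which is one reason damping uncertainty matters for probabilistic peak-response estimates near resonance.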
Procedia PDF Downloads 280
5265 Hydraulic Characteristics of the Tidal River Dongcheon in Busan City
Authors: Young Man Cho, Sang Hyun Kim
Abstract:
Even though various management practices such as sediment dredging were attempted to improve the water quality of Dongcheon, located in Busan, the environmental condition of this stream deteriorated. Therefore, Busan Metropolitan City had pumped and diverted sea water to the upstream reach of Dongcheon for several years. This study explored the hydraulic characteristics of Dongcheon to configure the best management practice for the ecological restoration and water quality improvement of a man-made urban stream. Intensive field investigation indicates that average flow velocities at depths of 20% and 80% from the water surface ranged from 5 to 10 cm/s and from 2 to 5 cm/s, respectively. Concentrations of dissolved oxygen at all depths were less than 0.25 mg/l during the low tidal period. Even though a density difference can be found along the stream depth, a density current seems rarely to be generated in Dongcheon. The short duration of the high tidal portion and the shallow depths are responsible for the well-mixed nature of Dongcheon.
Keywords: hydraulic, tidal river, density current, sea water
Procedia PDF Downloads 225