Search results for: Gaussian Conditional Random Field
10349 Numerical Investigation of Hybrid Ferrofluid Unsteady Flow through Porous Channel
Authors: Wajahat Hussain Khan, M. Zubair Akbar Qureshi
Abstract:
The viscous, two-dimensional, incompressible, and laminar time-dependent heat transfer flow through a ferromagnetic fluid is considered in this paper. Flow takes place in a channel between two porous walls under the influence of the magnetic field located beyond the channel. It is assumed that there are no electric field effects and the variation in the magnetic field vector that could occur within the F
Keywords: hybrid ferrofluid, heat transfer, magnetic field, porous channel
Procedia PDF Downloads 177
10348 Deterministic Random Number Generator Algorithm for Cryptosystem Keys
Authors: Adi A. Maaita, Hamza A. A. Al Sewadi
Abstract:
One of the crucial parameters of digital cryptographic systems is the selection of the keys used and their distribution. The randomness of the keys has a strong impact on the system's security strength, making them difficult to predict, guess, reproduce, or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudorandom number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating information publicly agreed upon between sender and receiver over a public channel. This information is used as a seed for performing mathematical functions in order to generate a sequence of pseudorandom numbers that will be used for encryption/decryption purposes. The manipulation involves permutations and substitutions that fulfill Shannon's principle of "confusion and diffusion". ASCII code characters were utilized in the generation process instead of bit strings, which adds more flexibility in testing different seed values. Finally, the obtained results indicate sound difficulty of guessing the keys by attackers.
Keywords: cryptosystems, information security agreement, key distribution, random numbers
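The abstract sketches the general recipe of expanding a publicly agreed seed into a pseudorandom keystream by repeated mixing. A minimal Python sketch of that general idea follows; the SHA-256-based expansion, the function name, and the seed handling are illustrative assumptions, not the authors' actual permutation/substitution design.

```python
import hashlib

def keystream_from_seed(seed: str, length: int) -> bytes:
    """Expand a publicly agreed seed string into `length` pseudorandom bytes.

    Illustrative stand-in for the paper's permutation/substitution scheme:
    here the mixing step is simply iterated SHA-256 hashing with a counter.
    """
    out = bytearray()
    counter = 0
    state = seed.encode("ascii")           # ASCII characters as the raw seed material
    while len(out) < length:
        block = hashlib.sha256(state + counter.to_bytes(4, "big")).digest()
        out.extend(block)
        state = block                      # feed the output back in (diffusion)
        counter += 1
    return bytes(out[:length])

# Example: derive a 16-byte key from a shared phrase
key = keystream_from_seed("agreed-public-info", 16)
print(key.hex())
```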
Procedia PDF Downloads 268
10347 The Richtmyer-Meshkov Instability Impacted by the Interface with Different Components Distribution
Authors: Sheng-Bo Zhang, Huan-Hao Zhang, Zhi-Hua Chen, Chun Zheng
Abstract:
In this paper, the Richtmyer-Meshkov instability caused by the interaction between a shock wave and a circular helium light-gas cylinder with different component distributions has been studied numerically using a high-resolution Roe scheme based on the two-dimensional unsteady Euler equations. The numerical results describe the deformation process of the gas cylinder and the wave structure of the flow field, and quantitatively analyze the characteristic dimensions (length, height, and central axial width) of the gas cylinder and its volume compression ratio over time. In addition, the flow mechanism of shock-driven interface gas mixing is analyzed from multiple perspectives by combining it with the flow field pressure, velocity, circulation, and gas mixing rate. The effects of different initial component distribution conditions on the interface instability are then investigated. The results show that as the diffuse interface transitions to a sharp interface, the reflection coefficient gradually increases on both sides of the interface. When the incident shock wave interacts with the cylinder, the transmission of the shock wave changes from conventional to unconventional transmission. At the same time, the reflected shock wave is gradually strengthened and the transmitted shock wave is gradually weakened, which leads to an increase in the Richtmyer-Meshkov instability. Moreover, the Atwood number on both sides of the interface also increases as the diffuse interface transitions to a sharp interface, which leads to an increase in the Rayleigh-Taylor and Kelvin-Helmholtz instabilities. Therefore, the increase in instability leads to an increase in circulation, resulting in an increase in the growth rate of the gas mixing rate.
Keywords: shock wave, He light cylinder, Richtmyer-Meshkov instability, Gaussian distribution
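For reference, the Atwood number discussed above is the standard density contrast across the interface:

A = \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1},

where \rho_1 and \rho_2 are the densities of the lighter and heavier fluids. A diffuse interface smears this contrast, while a sharp interface realizes its full value, which is why the instability growth strengthens as the interface sharpens.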
Procedia PDF Downloads 77
10346 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing
Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh
Abstract:
Due to a lack of continual assessment or grade-related data, identifying first-year engineering students in a polytechnic education at risk of failing is challenging. Our experience over the years tells us that there is no strong correlation between having good entry grades in Mathematics and the Sciences and excelling in hardcore engineering subjects. Hence, identifying students at risk of failure cannot be done on the basis of entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model for the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data on student psychological profiles such as interests and study habits, were used. Three classification techniques, namely Logistic Regression, K Nearest Neighbour, and Random Forest, were used in our predictive model. Based on our findings, Random Forest was determined to be the strongest predictor with an Area Under the Curve (AUC) value of 0.994. Correspondingly, its Accuracy, Precision, Recall, and F-Score were also the highest among these three classifiers. Using this Random Forest classification technique, students at risk of failure could be identified at the end of term one and assigned to a Learning Support Programme at the beginning of term two. This paper gathers the results of our findings and proposes further improvements that can be made to the model.
Keywords: continual assessment, predictive analytics, random forest, student psychological profile
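A minimal sketch of the Random Forest step described above, using scikit-learn's classifier and AUC metric; the feature matrix here is a hypothetical stand-in, since the study's actual term-one attributes and data are not given.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical features: term-one assessment scores plus psychological-profile
# indicators, with a binary "at risk" label.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) < -0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"AUC = {auc:.3f}")   # the paper reports 0.994 on its own data
```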
Procedia PDF Downloads 134
10345 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods
Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun
Abstract:
Traditionally, the three important manufacturing functions of process planning, scheduling, and due-date assignment are performed separately and sequentially. For a couple of decades, hundreds of studies have been done on integrated process planning and scheduling problems, and numerous studies have been performed on scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three functions is studied by using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid, and random search techniques. In addition, the importance of integrating these three functions and the power of meta-heuristics and hybrid heuristics are studied.
Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics
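As a concrete illustration of one of the meta-heuristics named above, the sketch below implements a generic simulated-annealing loop over a job permutation; the toy objective and instance are assumptions standing in for the integrated process-planning, scheduling, and due-date cost.

```python
import math
import random

def simulated_annealing(cost, n_jobs, iters=10_000, t0=10.0, alpha=0.999):
    """Generic simulated-annealing search over a job permutation.

    `cost` is a placeholder for the integrated objective (process plan choice,
    weighted tardiness, due-date penalties); the neighbourhood move is a swap.
    """
    current = list(range(n_jobs))
    random.shuffle(current)
    best, best_c, cur_c, t = current[:], cost(current), cost(current), t0
    for _ in range(iters):
        i, j = random.sample(range(n_jobs), 2)
        cand = current[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = cost(cand)
        if c < cur_c or random.random() < math.exp((cur_c - c) / t):
            current, cur_c = cand, c
            if c < best_c:
                best, best_c = cand[:], c
        t *= alpha
    return best, best_c

# Toy objective: weighted completion-time-like cost on a hypothetical 10-job instance
weights = [random.randint(1, 5) for _ in range(10)]
times = [random.randint(1, 9) for _ in range(10)]
def toy_cost(seq):
    done, total = 0, 0
    for job in seq:
        done += times[job]
        total += weights[job] * done
    return total

print(simulated_annealing(toy_cost, n_jobs=10))
```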
Procedia PDF Downloads 469
10344 SFO-ECRSEP: Sensor Field Optimization Based ECRSEP for Heterogeneous WSNs
Authors: Gagandeep Singh
Abstract:
Sensor field optimization is a serious issue in WSNs and has been ignored by many researchers. In numerous real-time sensing fields, the sensor nodes on the corners, i.e., on the segment boundaries, die early because no extra protection is provided for them. Accordingly, the central objective of this research work is segment-based optimization, achieved by separating the sensor field into advanced and normal segments. The motivation behind this sensor field optimization is to extend the time span until the first sensor node dies. Normal sensor nodes that lie on the borders tend to die early because their distance to the base station is larger, so they consume more power and are exhausted sooner.
Keywords: WSNs, ECRSEP, SEP, field optimization, energy
Procedia PDF Downloads 300
10343 Influence of Vibration Amplitude on Reaction Time and Drowsiness Level
Authors: Mohd A. Azizan, Mohd Z. Zali
Abstract:
It is well established that exposure to vibration has an adverse effect on human health, comfort, and performance. However, there is little quantitative knowledge on performance combined with drowsiness level during vibration exposure. This paper reports a study investigating the influence of vibration amplitude on seated occupant reaction time and drowsiness level. Eighteen male volunteers were recruited for this experiment. Before commencing the experiment, the total transmitted acceleration measured at the interfaces between the seat pan, the seatback, and the human body was adjusted to 0.2 m/s² r.m.s. and 0.4 m/s² r.m.s. for each volunteer. Seated volunteers were exposed to Gaussian random vibration with a frequency band of 1-15 Hz at two amplitude levels (low vibration amplitude and medium vibration amplitude) for 20 minutes on separate days. For the purpose of drowsiness measurement, volunteers were asked to complete a 10-minute PVT test before and after vibration exposure and to rate their subjective drowsiness on the Karolinska Sleepiness Scale (KSS) before vibration, at every 5-minute interval, and following 20 minutes of vibration exposure. Strong evidence of drowsiness was found, as there was a significant increase in reaction time and number of lapses following exposure to vibration in both conditions. However, the effect was more apparent at the medium vibration amplitude. A steady increase of drowsiness level could also be observed in the KSS for all volunteers. However, no significant differences were found in the KSS between the low and medium vibration amplitudes. It is concluded that exposure to vibration has an adverse effect on human alertness, and the effect is more pronounced at higher vibration amplitude. Taken together, these findings suggest a role of vibration in promoting drowsiness, especially at higher vibration amplitude.
Keywords: drowsiness, human vibration, karolinska sleepiness scale, psychomotor vigilance test
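A minimal sketch of how such a band-limited Gaussian random vibration stimulus could be generated and scaled to a target r.m.s. level; the sampling rate and filter order are assumptions, not values reported in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Generate a 1-15 Hz band-limited Gaussian random signal and rescale it to a
# target r.m.s. acceleration (the "low amplitude" condition in the study).
fs = 512                     # Hz, assumed sampling rate
duration = 20 * 60           # 20-minute exposure, in seconds
target_rms = 0.2             # m/s^2 r.m.s.

rng = np.random.default_rng(1)
white = rng.normal(size=fs * duration)

b, a = butter(4, [1.0, 15.0], btype="bandpass", fs=fs)
band_limited = filtfilt(b, a, white)

signal = band_limited * (target_rms / np.sqrt(np.mean(band_limited**2)))
print(f"achieved r.m.s. = {np.sqrt(np.mean(signal**2)):.3f} m/s^2")
```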
Procedia PDF Downloads 282
10342 Localization of Near Field Radio Controlled Unintended Emitting Sources
Authors: Nurbanu Guzey, S. Jagannathan
Abstract:
Locating radio controlled (RC) devices using their unintended emissions is of great interest considering security concerns. The weak nature of these emissions requires a near field localization approach, since it is hard to detect these signals in the far field region of an array. Unlike angle-only estimation, near field localization also requires range estimation of the source, which makes this method more complicated than far field models. The challenges of locating such devices in the near field region and in a real-time environment are analyzed in this paper. An ESPRIT-like near field localization scheme is utilized for both angle and range estimation. A 1-D search with symmetric subarrays is provided. Two 7-element uniform linear antenna arrays (ULA) are employed for locating the RC source. Experimental results of location estimation for one unintended emitting walkie-talkie at different positions are given.
Keywords: localization, angle of arrival (AoA), range estimation, array signal processing, ESPRIT, Uniform Linear Array (ULA)
Procedia PDF Downloads 526
10341 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe, there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach to this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics such as accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
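A brief sketch of the kind of classifier comparison described above, using scikit-learn and cross-validated accuracy; the generated data stand in for the study's heart-disease features, which are not listed here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data with 13 features (a common heart-disease feature count).
X, y = make_classification(n_samples=600, n_features=13, n_informative=6, random_state=0)

models = {
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean accuracy = {acc:.3f}")
```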
Procedia PDF Downloads 40
10340 Performance Comparison of Cooperative Banks in the EU, USA and Canada
Authors: Matěj Kuc
Abstract:
This paper compares different profitability measures of cooperative banks from two developed regions: the European Union and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period. We ran a series of tests and performed Random Effects estimation on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE). There is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks adapted better to the current market environment.
Keywords: cooperative banking, panel data, profitability measures, random effects
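A sketch of a random-intercept panel estimation in the spirit of the Random Effects approach mentioned above, using statsmodels' mixed-effects estimator as a stand-in for a dedicated panel estimator; the variable names and simulated bank panel are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: bank-level ROA with a region dummy over 2011-2016.
# A random intercept per bank plays the role of the panel random effect.
rng = np.random.default_rng(0)
banks = np.repeat(np.arange(200), 6)                       # 200 banks x 6 years
region = np.repeat(rng.integers(0, 2, 200), 6)             # 0 = EU, 1 = North America
roa = 0.4 + 0.3 * region + rng.normal(0, 0.2, 200)[banks] + rng.normal(0, 0.1, 1200)
df = pd.DataFrame({"bank": banks, "north_america": region, "roa": roa})

model = smf.mixedlm("roa ~ north_america", df, groups=df["bank"]).fit()
print(model.summary())
```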
Procedia PDF Downloads 113
10339 The Effect of the Crystal Field Interaction on the Critical Temperatures and the Sublattice Magnetizations of a Mixed Spin-3/2 and Spin-5/2 Ferromagnetic System
Authors: Fathi Abubrig, Mohamed Delfag, Suad Abuzariba
Abstract:
The influence of the crystal field interactions on the mixed spin-3/2 and spin-5/2 ferromagnetic Ising system is considered by using mean field theory based on the Bogoliubov inequality for the Gibbs free energy. The ground-state phase diagram is constructed, the phase diagrams of the second-order critical temperatures are obtained, and the thermal variation of the sublattice magnetizations is investigated in detail. We find some interesting phenomena for the sublattice magnetizations at particular values of the crystal field interactions.
Keywords: crystal field, Ising system, ferromagnetic, magnetization, phase diagrams
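The mean field treatment referred to above rests on the Gibbs-Bogoliubov variational inequality for the free energy:

F \le \Phi \equiv F_0 + \langle H - H_0 \rangle_0 ,

where H is the full Hamiltonian, H_0 a solvable trial (single-site) Hamiltonian with free energy F_0, and \langle \cdot \rangle_0 the thermal average in the trial ensemble; minimizing \Phi with respect to the trial parameters yields the self-consistent sublattice magnetizations.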
Procedia PDF Downloads 486
10338 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This improved dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
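A minimal sketch of the feature-merging step described above: text features extracted from report text are concatenated with image-derived features and fed to a Random Forest. The tiny example reports, labels, and "image embeddings" are hypothetical, and TF-IDF stands in for the study's NLP/LLM feature extraction.

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical mini-example: report text features merged with image-derived features.
reports = ["no acute cardiopulmonary abnormality",
           "right lower lobe opacity concerning for pneumonia",
           "clear lungs, normal heart size",
           "diffuse bilateral infiltrates, urgent review advised"]
labels = np.array([0, 1, 0, 1])                                   # 0 = normal, 1 = abnormal
image_features = np.random.default_rng(0).normal(size=(4, 16))    # e.g. CNN embeddings

text_features = TfidfVectorizer().fit_transform(reports)
X = hstack([text_features, csr_matrix(image_features)]).tocsr()   # merge text and image data

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X))
```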
Procedia PDF Downloads 44
10337 A Sequential Approach for Random-Effects Meta-Analysis
Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya
Abstract:
The objective of meta-analysis is to combine results from several independent studies in order to create generalization and provide an evidence base for decision making. However, recent studies show that the magnitude of effect size estimates reported in many areas of research changed with year of publication, and this can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring the effect size estimates in meta-analysis. However, they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, and its estimation creates complications. This paper proposes the use of the Gombay and Serbian (2005) truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring of the REM. Simulation results show that the test does not control the Type I error well and is not recommended. Further work is required to derive an appropriate test in this important area of application.
Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes
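For context, the random-effects model being monitored can be written as:

y_i = \theta + u_i + \varepsilon_i , \qquad u_i \sim N(0, \tau^2), \qquad \varepsilon_i \sim N(0, v_i),

where y_i is the effect estimate from study i, v_i its within-study variance, and \tau^2 the between-study (heterogeneity) variance whose estimation is what complicates sequential monitoring relative to the fixed-effect case.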
Procedia PDF Downloads 467
10336 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation
Authors: Constantin Z. Leshan
Abstract:
Hole Vacuum theory is based on discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics and allow teleportation of matter. All massive bodies emit a flux of holes which curve the spacetime; if we increase the concentration of holes, it leads to length contraction and time dilation because the holes do not have the properties of extension and duration. In the limiting case when space consists of holes only, the distance between every two points is equal to zero and time stops - outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties only. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of the quantum behavior. Vacuum holes can explain entanglement, non-locality, wave properties of matter, tunneling, the uncertainty principle and so on. Particles do not have trajectories because spacetime is discontinuous and has impassable microscopic 'walls'; simple mechanical motion is impossible at small-scale distances, and it is impossible to 'trace' a straight line in the discontinuous spacetime because it contains the impassable holes. Spacetime 'boils' continually due to the appearance of the vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Since a body disappears in one volume and reappears in another random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation). It is shown that Hole Teleportation does not violate causality and special relativity due to its random nature and other properties. Although Hole Teleportation has a random nature, it can be used for colonization of extrasolar planets with the help of the method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from the vessel without permitting another body to occupy this volume.
Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps
Procedia PDF Downloads 425
10335 Development of a Very High Sensitivity Magnetic Field Sensor Based on Planar Hall Effect
Authors: Arnab Roy, P. S. Anil Kumar
Abstract:
Hall bar magnetic field sensors based on the planar Hall effect were fabricated from permalloy (Ni80Fe20) thin films grown by pulsed laser ablation. A planar Hall voltage change as large as 400% was observed for a magnetic field sweep within ±4 Oe, a value comparable with present-day TMR sensors at room temperature. A very large planar Hall sensitivity of 1200 Ω/T was measured close to the switching fields, a value not obtained so far except with 2DEG Hall sensors. In summary, a highly sensitive low magnetic field sensor has been constructed which has the added advantages of simple architecture, good signal-to-noise ratio, and robustness.
Keywords: planar hall effect, permalloy, NiFe, pulsed laser ablation, low magnetic field sensor, high sensitivity magnetic field sensor
Procedia PDF Downloads 515
10334 Analysis of Process Methane Hydrate Formation That Include the Important Role of Deep-Sea Sediments with Analogy in Kerek Formation, Sub-Basin Kendeng, Central Java, Indonesia
Authors: Yan Bachtiar Muslih, Hangga Wijaya, Trio Fani, Putri Agustin
Abstract:
Energy demand in Indonesia increases by 5-6% a year, while production of conventional energy decreases by 3-5% a year, which means that in 20-40 years conventional energy will not be able to meet all energy demand in Indonesia. One way to address this is to use an unconventional energy source, gas hydrate. Gas hydrate forms by a biogenic process and is stable under conditions of extreme depth and low temperature; it can form in two settings, polar conditions and deep-sea conditions. This research focuses on gas hydrate associated with methane, forming methane hydrate in deep-sea conditions, usually at depths between 150-2000 m. The research focuses on the process of methane hydrate formation, namely the biogenic process and the important role of deep-sea sediments in producing an accumulation of methane hydrate. Methane hydrate is usually accumulated in fine sediment in a deep-sea environment under high-pressure and low-temperature conditions, which also commonly turn the methane hydrate into white nodules. The methodology of this research consists of geological field work and laboratory analysis. The field work provides sample data consisting of 10-15 samples taken at random from Kerek Formation outcrops, used to picture the deep-sea environment conditions that influence methane hydrate formation, together with measured stratigraphy of the Kerek Formation outcrops, which helps to reconstruct processes in the deep-sea sediment such as energy flow and sediment supply. The laboratory analysis covers all data obtained from the field work. The results of this research can be used for exploration of methane hydrate in other prospective deep-sea environments in Indonesia.
Keywords: methane hydrate, deep-sea sediment, kerek formation, sub-basin of kendeng, central java, Indonesia
Procedia PDF Downloads 462
10333 Application of Random Forest Model in The Prediction of River Water Quality
Authors: Turuganti Venkateswarlu, Jagadeesh Anmala
Abstract:
Excessive runoff from various non-point source land uses and other point sources is rapidly contaminating the water quality of streams in the Upper Green River watershed, Kentucky, USA. It is essential to maintain stream water quality, as the river basin is one of the major freshwater sources in this province. It is also important to understand the water quality parameters (WQPs) quantitatively and qualitatively, along with their important features, as stream water is sensitive to climatic events and land-use practices. In this paper, a model was developed for predicting one of the significant WQPs, Fecal Coliform (FC), from precipitation, temperature, urban land use factor (ULUF), agricultural land use factor (ALUF), and forest land-use factor (FLUF) using the Random Forest (RF) algorithm. The RF model, a novel ensemble learning algorithm, can even find out advanced feature importance characteristics from the given model inputs for different combinations. The model's outcomes showed a good correlation between FC and climate events and land use factors (R² = 0.94), and precipitation and temperature are the primary influencing factors for FC.
Keywords: water quality, land use factors, random forest, fecal coliform
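A short sketch, under assumed synthetic inputs, of fitting a Random Forest regressor to predict FC and ranking the feature importances as described above:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical inputs standing in for the watershed data described above.
rng = np.random.default_rng(0)
names = ["precipitation", "temperature", "ULUF", "ALUF", "FLUF"]
X = rng.normal(size=(300, 5))
fc = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(scale=0.3, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, fc, test_size=0.25, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

print("R^2 =", round(r2_score(y_te, rf.predict(X_te)), 3))
for name, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: importance = {imp:.3f}")
```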
Procedia PDF Downloads 197
10332 Polypropylene Matrix Enriched With Silver Nanoparticles From Banana Peel Extract For Antimicrobial Control Of E. coli and S. epidermidis To Maintain Fresh Food
Authors: Michail Milas, Aikaterini Dafni Tegiou, Nickolas Rigopoulos, Eustathios Giaouris, Zaharias Loannou
Abstract:
Nanotechnology, a relatively new scientific field, addresses the manipulation of nanoscale materials and devices, which are governed by unique properties, and is applied in a wide range of industries, including food packaging. The incorporation of nanoparticles into polymer matrices used for food packaging is a field that is highly researched today. One such combination is silver nanoparticles with polypropylene. In the present study, the synthesis of the silver nanoparticles was carried out by a natural method; in particular, a ripe banana peel extract was used. This method is superior to others as it stands out for its environmental friendliness, high efficiency, and low cost. Specifically, a 1.75 mM AgNO₃ silver nitrate solution was used, with a BPE concentration of 1.7% v/v, an incubation period of 48 hours at 70°C, and a pH of 4.3, and after its preparation, the polypropylene films were soaked in it. For the PP films, random PP spheres were melted at 170-190°C into molds with a 0.8 cm diameter. This polymer was chosen as it is suitable for plastic parts and reusable plastic containers of various types that are intended to come into contact with food without compromising its quality and safety. The antimicrobial test against Escherichia coli DFSNB1 and Staphylococcus epidermidis DFSNB4 was performed on the films. The films with silver nanoparticles showed a reduction of at least 100 times compared to those without silver nanoparticles, in both strains. The limit of detection is the lower limit of the vertical error lines in the presence of nanoparticles, which is 3.11. The main reasons that led to the adsorption of nanoparticles are the porous nature of polypropylene and the adsorption capacity of nanoparticles on the surface of the films due to hydrophobic-hydrophilic forces. The most significant parameters that contributed to the results of the experiment include the following: the stage of ripening of the banana during the preparation of the plant extract, the temperature and residence time of the nanoparticle solution in the oven, the residence time of the polypropylene films in the nanoparticle solution, the number of nanoparticles inoculated on the films, and finally, the time these stayed in the refrigerator so that they could dry and be ready for antimicrobial treatment.
Keywords: antimicrobial control, banana peel extract, E. coli, natural synthesis, microbe, plant extract, polypropylene films, S.epidermidis, silver nano, random pp
Procedia PDF Downloads 176
10331 Secure Watermarking not at the Cost of Low Robustness
Authors: Jian Cao
Abstract:
This paper describes a novel watermarking technique which we call random direction embedding (RDE) watermarking. Unlike traditional watermarking techniques, the watermark energy after RDE embedding does not focus on a fixed direction, leading to security against the traditional unauthorized watermark removal attack. In addition, the experimental results show that when compared with the existing secure watermarking, namely natural watermarking (NW), RDE watermarking gains a significant improvement in terms of robustness. In fact, the security of RDE watermarking is not at the cost of low robustness, and it can even be more robust than traditional spread spectrum watermarking, which has been shown to be very insecure.
Keywords: robustness, spread spectrum watermarking, watermarking security, random direction embedding (RDE)
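The sketch below illustrates the general idea of spreading watermark energy along a direction drawn at random for each embedding rather than along a fixed spreading direction; it is a conceptual illustration only, not the authors' exact RDE construction, and the host signal and strength parameter are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

def embed(host: np.ndarray, bit: int, strength: float = 1.0):
    """Additively embed one bit along a freshly drawn random unit direction."""
    direction = rng.normal(size=host.shape)
    direction /= np.linalg.norm(direction)          # fresh unit direction each time
    sign = 1.0 if bit else -1.0
    return host + sign * strength * direction, direction

def detect(marked: np.ndarray, direction: np.ndarray) -> int:
    """Correlate the marked signal with the known embedding direction."""
    return int(np.dot(marked, direction) > 0)

host = rng.normal(size=1024)                        # e.g. a block of transform coefficients
marked, key_dir = embed(host, bit=1, strength=4.0)
print("decoded bit:", detect(marked, key_dir))
```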
Procedia PDF Downloads 385
10330 A Study of Non Linear Partial Differential Equation with Random Initial Condition
Authors: Ayaz Ahmad
Abstract:
In this work, we present the effect of noise on the solution of a partial differential equation (PDE) in three different settings. We first consider random initial conditions for two nonlinear dispersive PDEs, the nonlinear Schrödinger equation and the Korteweg-de Vries equation, and analyse their effect on some special solutions, the soliton solutions. The second case considered is a linear partial differential equation, the wave equation, where random initial conditions allow us to substantially decrease the computational and data storage costs of an algorithm to solve the inverse problem based on boundary measurements of the solution of this equation. Finally, the third example considered is that of the linear transport equation with a singular drift term, where we show that the addition of a multiplicative noise term forbids the blow-up of solutions under a very weak hypothesis for which we have finite-time blow-up of a solution in the deterministic case. Here we consider the problem of wave propagation, which is modelled by a nonlinear dispersive equation with a noisy initial condition. As observed, noise can also be introduced directly in the equations.
Keywords: drift term, finite time blow up, inverse problem, soliton solution
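The two dispersive equations referred to above are, in their standard forms:

i\,u_t + u_{xx} + |u|^2 u = 0 \qquad \text{(nonlinear Schrödinger)},

u_t + 6\,u\,u_x + u_{xxx} = 0 \qquad \text{(Korteweg-de Vries)},

both of which admit soliton solutions whose behaviour under random (noisy) initial data is the object of the first part of the study.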
Procedia PDF Downloads 215
10329 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study
Authors: Atif Zafar, Fan Haijun
Abstract:
A sequence of different reservoir engineering methods and tools for reservoir characterization and field development is presented in this paper. Real data from the Jin Gas Field of the L-Basin of Pakistan are used. The basic concept behind this work is to highlight the importance of well test analysis in a broader sense (i.e., reservoir characterization and field development), rather than merely determining the permeability and skin parameters. Normally, for reservoir characterization we rely on well test analysis to some extent, but for the field development plan, well test analysis has become a forgotten tool, specifically for locating new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development, when the location of a new development well was marked only on the basis of G&G (Geologic and Geophysical) data. The seismic interpretation could not detect one of the boundaries (fault, sub-seismic fault, heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the model from the well test analysis played a very crucial role in proposing the location of the second well of the newly discovered field. The results from different methods of well test analysis of the Jin Gas Field are also integrated with and supported by other reservoir engineering tools, i.e., the Material Balance Method and the Volumetric Method. In this way, a comprehensive workflow and algorithm are obtained for integrating well test analyses with geological and geophysical analyses for reservoir characterization and field development. On this strong basis, it was successfully evaluated that the proposed location of the new development well was not justified and that it must be somewhere else other than in the South direction.
Keywords: field development plan, reservoir characterization, reservoir engineering, well test analysis
Procedia PDF Downloads 364
10328 Efficient Signcryption Scheme with Provable Security for Smart Card
Authors: Jayaprakash Kar, Daniyal M. Alghazzawi
Abstract:
This article proposes a novel construction of a signcryption scheme with provable security that is well suited for implementation on a smart card. It is secure in the random oracle model, and the security relies on the Decisional Bilinear Diffie-Hellman Problem. The proposed scheme is secure against adaptive chosen ciphertext attack (indistinguishability) and adaptive chosen message attack (unforgeability). It is also inspired by zero-knowledge proofs. The two most important security goals for a smart card are confidentiality and authenticity. These functions are performed in one logical step at low computational cost.
Keywords: random oracle, provable security, unforgeability, smart card
Procedia PDF Downloads 593
10327 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks
Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang
Abstract:
Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like Convolutional Neural Networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative Adversarial Networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a distinctive technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved impressive performance metrics with an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate promising potential for the early detection of diseases in CXR images using this approach.
Keywords: CNN, classification, deep learning, GAN, Resnet50
Procedia PDF Downloads 88
10326 Comparison of FASTMAP and B0 Field Map Shimming for 4T MRI
Authors: Mohan L. Jayatiake, Judd Storrs, Jing-Huei Lee
Abstract:
The optimal MRI resolution relies on a homogeneous magnetic field. However, local susceptibility variations can lead to field inhomogeneities that cause artifacts such as image distortion and signal loss. The effects of local susceptibility variation notoriously increase with magnetic field strength. Active shimming improves homogeneity by applying corrective fields generated from shim coils, but requires calculation of the optimal current for each shim coil. FASTMAP (fast automatic shimming technique by mapping along projections) is an effective technique for finding optimal currents that works well at high field, but is restricted to shimming spherical regions of interest. The 3D gradient-echo pulse sequence was modified to reduce sensitivity to eddy currents and used to obtain susceptibility field maps at 4T. Measured fields were projected onto first- and second-order spherical harmonic functions corresponding to the shim hardware. A spherical phantom was used to calibrate the shim currents. Susceptibility maps of a volunteer's brain with and without FASTMAP shimming were obtained. Simulations indicate that optimal shim currents derived from the field map may provide better overall shimming of the human brain.
Keywords: shimming, high-field, active, passive
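A minimal sketch of the projection step described above: a measured B0 field map is fitted by least squares to the first- and second-order spherical-harmonic shim terms. The coordinates, synthetic field map, and any scaling to hardware shim currents are illustrative assumptions.

```python
import numpy as np

def shim_basis(x, y, z):
    """Real first- and second-order spherical-harmonic shim terms plus a constant."""
    return np.column_stack([
        np.ones_like(x),            # B0 offset
        x, y, z,                    # first order (X, Y, Z shims)
        x * y, z * x, z * y,        # second order XY, ZX, ZY
        x**2 - y**2,                # X2-Y2
        2 * z**2 - x**2 - y**2,     # Z2
    ])

rng = np.random.default_rng(0)
pts = rng.uniform(-0.05, 0.05, size=(2000, 3))              # voxel positions in metres
x, y, z = pts.T
field = 3.0 * x - 1.2 * z * y + rng.normal(0, 0.05, 2000)   # synthetic field map (Hz)

A = shim_basis(x, y, z)
coeffs, *_ = np.linalg.lstsq(A, field, rcond=None)          # least-squares projection
residual = field - A @ coeffs
print("fitted shim coefficients:", np.round(coeffs, 2))
print("r.m.s. inhomogeneity before/after:",
      round(field.std(), 3), "/", round(residual.std(), 3))
```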
Procedia PDF Downloads 509
10325 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation
Authors: Sopheak Sorn, Kwok Yip Szeto
Abstract:
Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem that is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction with each cell containing exactly one point of the point pattern and compute the reliability of the Voronoi's dual, i.e., the Delaunay graph. We further investigate the communicability of the Delaunay network. We find a positive correlation between the homogeneity of a Delaunay graph's degree distribution and its reliability, and a negative correlation with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network to have the optimum geometric mean between communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi
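A short sketch of constructing the Delaunay graph of a random two-dimensional point pattern and computing its communicability; the reliability calculation and the edge-flip move are not shown, and the point count is an arbitrary choice.

```python
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay

# Build the Delaunay graph of a random 2-D point pattern (N nodes, L links).
rng = np.random.default_rng(0)
points = rng.uniform(size=(50, 2))

tri = Delaunay(points)
G = nx.Graph()
for simplex in tri.simplices:                 # each triangle contributes three edges
    for i in range(3):
        G.add_edge(int(simplex[i]), int(simplex[(i + 1) % 3]))

comm = nx.communicability(G)                  # dict of dicts of pairwise communicabilities
total_comm = sum(sum(row.values()) for row in comm.values())
degrees = [d for _, d in G.degree()]
print("nodes:", G.number_of_nodes(), "links:", G.number_of_edges())
print("degree std (homogeneity proxy):", round(float(np.std(degrees)), 3))
print("total communicability:", round(total_comm, 1))
```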
Procedia PDF Downloads 419
10324 Prediction Fluid Properties of Iranian Oil Field with Using of Radial Based Neural Network
Authors: Abdolreza Memari
Abstract:
In this article, a numerical method has been used in order to estimate the viscosity of crude oil. We use this method to measure the crude oil's viscosity in three states: saturated oil viscosity, viscosity above the bubble point, and viscosity under the saturation pressure. The crude oil's viscosity is then estimated by using the KHAN model and the roller ball method. After that, using these data, which include the conditions under which the viscosity was measured and the viscosity estimated by the presented method, a radial basis neural network is trained. This network is a kind of two-layered artificial neural network whose hidden-layer activation function is a Gaussian function, and training algorithms are used to train it. After training the radial basis neural network, the results of the experimental method and of the artificial intelligence approach are compared. Having trained this network, we are able to estimate crude oil viscosity without using the KHAN model and experimental conditions, and under any other condition, with acceptable accuracy. The results show that the radial basis neural network has a high capability for estimating crude oil viscosity; saving time and cost is another advantage of this investigation.
Keywords: viscosity, Iranian crude oil, radial based, neural network, roller ball method, KHAN model
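A minimal sketch of such a two-layer radial basis network, with a Gaussian hidden layer and a linear output layer; the synthetic "viscosity" data, the k-means choice of centres, and the Gaussian width are assumptions for illustration and do not reproduce the paper's measurements.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))                      # e.g. pressure, temperature, API gravity
y = 5.0 * np.exp(-2.0 * X[:, 0]) + X[:, 1] + rng.normal(0, 0.05, 200)   # toy "viscosity"

centres = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
sigma = 0.3                                          # Gaussian width (assumed)

def hidden(X):
    """Gaussian hidden-layer activations for each input/centre pair."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma**2))

H = np.hstack([hidden(X), np.ones((len(X), 1))])     # add a bias unit
w, *_ = np.linalg.lstsq(H, y, rcond=None)            # linear output weights

pred = H @ w
print("training RMSE:", round(float(np.sqrt(np.mean((pred - y) ** 2))), 4))
```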
Procedia PDF Downloads 501
10323 Experimental Study on Improving the Engineering Properties of Sand Dunes Using Random Fibers-Geogrid Reinforcement
Authors: Adel M. Belal, Sameh Abu El-Soud, Mariam Farid
Abstract:
This study presents the effect of reinforcement inclusions (fibers-geogrids) on the bearing capacity of fine sand under strip footings. Experimental model tests were carried out using rectangular plates [(10 cm x 38 cm), (7.5 cm x 38 cm), and (12.5 cm x 38 cm)] with geogrids and randomly distributed reinforcing fibers. The width and depth of the geogrid were varied to determine their effects on the engineering properties of treated, poorly graded fine sand. Laboratory model test results for the ultimate stresses and the settlement of a rigid strip foundation supported by single- and multi-layered fiber-geogrid-reinforced sand are presented. The number of geogrid layers was varied between 1 and 4. The effects of the depth of the first geogrid reinforcement, the spacing between the reinforcements, and their length on the bearing capacity are investigated in the experimental program. Results show that the use of flexible random fibers with a content of 0.125% by weight of the treated sand dunes, with 3 geogrid reinforcement layers, u/B = 0.25, and L/B = 7.5, gives a significant increase in the bearing capacity of the proposed system.
Keywords: earth reinforcement, geogrid, random fiber, reinforced soil
Procedia PDF Downloads 312
10322 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second
Authors: P. V. Pramila , V. Mahesh
Abstract:
Pulmonary Function Tests are important non-invasive diagnostic tests to assess respiratory impairments and provide quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, markedly related to the age of patients, resulting in incomplete data sets. This paper presents a nonlinear model built using Multivariate Adaptive Regression Splines and a Random Forest regression model to predict the missing spirometric features. Random-forest-based feature selection is used to enhance both the generalization capability and the interpretability of the model. In the present study, flow-volume data are recorded for N = 198 subjects. The ranked order of feature importance calculated by the random forest model shows that the spirometric features FVC, FEF25, PEF, FEF25-75, FEF50, and the demographic parameter height are the important descriptors. A comparison of the performance assessment of both models proves that the prediction ability of MARS with the top two ranked features, namely FVC and FEF25, is higher, yielding a model fit of R² = 0.96 and R² = 0.99 for normal and abnormal subjects, respectively. The Root Mean Square Error analysis of the RF model and the MARS model also shows that the latter is capable of predicting the missing values of FEV1 with a notably lower error of 0.0191 (normal subjects) and 0.0106 (abnormal subjects). It is concluded that combining feature selection with a prediction model provides a minimum subset of predominant features to train the model, yielding better prediction performance. This analysis can assist clinicians, through an intelligent decision support system, in medical diagnosis and the improvement of clinical care.
Keywords: FEV, multivariate adaptive regression splines, pulmonary function test, random forest
Procedia PDF Downloads 310
10321 Magnetic Navigation of Nanoparticles inside a 3D Carotid Model
Authors: E. G. Karvelas, C. Liosis, A. Theodorakakos, T. E. Karakasidis
Abstract:
Magnetic navigation of a drug inside the human vessels is a very important concept, since the drug is delivered to the desired area. Consequently, the quantity of the drug required to reach therapeutic levels is reduced while the drug concentration at targeted sites is increased. Magnetic navigation of drug agents can be achieved with the use of magnetic nanoparticles, where anti-tumor agents are loaded on the surface of the nanoparticles. The magnetic field that is required to navigate the particles inside the human arteries is produced by a magnetic resonance imaging (MRI) device. The main factors which influence the efficiency of magnetic nanoparticles for biomedical applications in magnetic driving are the size and the magnetization of the biocompatible nanoparticles. In this study, a computational platform for the simulation of the optimal gradient magnetic fields for the navigation of magnetic nanoparticles inside a carotid artery is presented. For the propulsion model of the particles, seven major forces are considered, i.e., the magnetic force from the MRI's main static field as well as the magnetic field gradient force from the special propulsion gradient coils. The static field is responsible for the aggregation of nanoparticles, while the magnetic gradient contributes to the navigation of the agglomerates that are formed. Moreover, the contact forces between the aggregated nanoparticles and the wall and the Stokes drag force on each particle are considered, while only spherical particles are used in this study. In addition, the gravitational force and the buoyancy force are included. Finally, the van der Waals force and Brownian motion are taken into account in the simulation. The OpenFOAM platform is used for the calculation of the flow field and the uncoupled equations of the particles' motion. To determine the optimal gradient magnetic fields, a covariance matrix adaptation evolution strategy (CMA-ES) is used in order to navigate the particles into the desired area. A desired trajectory is inserted into the computational geometry, along which the particles are to be navigated. Initially, the CMA-ES optimization strategy provides the OpenFOAM program with random values of the gradient magnetic field. At the end of each simulation, the computational platform evaluates the distance between the particles and the desired trajectory. The present model can simulate the motion of particles when they are navigated by the magnetic field produced by the MRI device. Under the influence of fluid flow, the model investigates the effect of different gradient magnetic fields in order to minimize the distance of the particles from the desired trajectory. The platform can navigate the particles onto the desired trajectory with an efficiency between 80-90%. On the other hand, a small number of particles stick to the walls and remain there for the rest of the simulation.
Keywords: artery, drug, nanoparticles, navigation
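The optimization loop described above can be sketched as follows; a simple (mu, lambda) evolution strategy stands in for CMA-ES, and the toy objective replaces the OpenFOAM particle-tracking run that would score each candidate gradient field. The three-component "gradient field" parameterization is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_distance_to_trajectory(gradient_field: np.ndarray) -> float:
    # Placeholder for one particle-tracking simulation; an unknown optimum is assumed.
    target = np.array([0.8, -0.3, 0.5])
    return float(np.linalg.norm(gradient_field - target))

mu, lam, sigma = 5, 20, 0.5
mean = np.zeros(3)                                   # initial guess for [Gx, Gy, Gz]
for generation in range(60):
    offspring = mean + sigma * rng.normal(size=(lam, 3))
    scores = np.array([mean_distance_to_trajectory(o) for o in offspring])
    elite = offspring[np.argsort(scores)[:mu]]       # keep the mu best candidates
    mean = elite.mean(axis=0)                        # recombine into the next search centre
    sigma *= 0.95                                    # simple step-size decay
print("best gradient field:", np.round(mean, 3),
      "distance:", round(mean_distance_to_trajectory(mean), 4))
```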
Procedia PDF Downloads 107
10320 An Investigation on Electric Field Distribution around 380 kV Transmission Line for Various Pylon Models
Authors: C. F. Kumru, C. Kocatepe, O. Arikan
Abstract:
In this study, electric field distribution analyses for three pylon models are carried out with Finite Element Method (FEM) based software. Analyses are performed in both the stationary and the time domain to observe instantaneous values along with the effective ones. Considering the results of the study, different line geometries considerably affect the magnitude and distribution of the electric field, although the line voltages are the same. Furthermore, it is observed that the maximum instantaneous electric field values obtained in the time domain analysis are considerably higher than the effective values in the stationary mode. Consequently, electric field distribution analyses should be made individually for each line model, and the limit exposure values or distances to residential buildings should be defined according to the results obtained.
Keywords: electric field, energy transmission line, finite element method, pylon
Procedia PDF Downloads 728