Search results for: random flows

2625 Designing an Inventory System with Constraints by Reducing Ordering Cost, Lead Time and Lost Sale Rate and Considering Random Disturbance in Ordering Quantity

Authors: Arezoo Heidary, Abolfazl Mirzazadeh, Aref Gholami-Qadikolaei

Abstract:

In the business environment, it is very common that a lot received may not be equal to the quantity ordered. In this work, a random disturbance in the received quantity is considered. A maximum allowable limit for storage space and inventory investment is assumed. The impact of lead time and ordering cost reductions when they act dependently is also investigated. Further, considering a mixture of backorders and lost sales for the allowable shortage system, the effect of investment on reducing the lost sale rate is analyzed. For the proposed control system, a Lagrangian method is applied in order to solve the problem, and an algorithmic procedure is utilized to achieve the optimal solution with the global minimum expected cost. Finally, proofs of concavity and convexity of the model in the decision variables are given.

Keywords: stochastic inventory system, lead time, ordering cost, lost sale rate, inventory constraints, random disturbance

Procedia PDF Downloads 390
2624 Relevant LMA Features for Human Motion Recognition

Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier

Abstract:

Motion recognition from videos is a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially the motion representation step with relevant features. Our descriptor vector is inspired by the Laban Movement Analysis method. We select discriminative features using the Random Forest algorithm in order to remove redundant features and make learning algorithms operate faster and more effectively. We validate our method on the MSRC-12 and UTKinect datasets.
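A minimal sketch of the kind of Random-Forest-based feature reduction described above, assuming a generic feature matrix in place of the authors' LMA descriptor; the data, class count, and importance threshold are illustrative only.

```python
# Random-Forest feature reduction sketch: keep features whose importance
# exceeds the mean importance, then re-score the classifier on the reduced set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 40))          # 300 motion samples, 40 candidate features
y = rng.integers(0, 12, size=300)       # 12 gesture classes (as in MSRC-12)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
selector = SelectFromModel(forest, threshold="mean", prefit=True)
X_reduced = selector.transform(X)

print("kept features:", X_reduced.shape[1], "of", X.shape[1])
print("CV accuracy (reduced):",
      cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                      X_reduced, y, cv=5).mean())
```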

Keywords: discriminative LMA features, features reduction, human motion recognition, random forest

Procedia PDF Downloads 162
2623 The Use of Random Set Method in Reliability Analysis of Deep Excavations

Authors: Arefeh Arabaninezhad, Ali Fakher

Abstract:

Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods are suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables which depend on ground behavior are required for geotechnical analyses. The random set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the random set method, with a relatively small number of simulations compared to fully probabilistic methods, bounds on the extremes of the system responses are obtained. Therefore, the random set approach has been proposed for reliability analysis in geotechnical problems. In the present study, the application of the random set method in reliability analysis of deep excavations is investigated through three deep excavation projects which were monitored during the excavating process. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The relevant probability share of each finite element calculation is determined considering the probability assigned to the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered as the main response of the system. The result of the reliability analysis for each intended deep excavation is presented by constructing the belief and plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations. To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models has been compared to the in situ measurements, and good agreement is observed. The comparison also showed that the random set finite element method is applicable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
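The bookkeeping behind the random set approach can be illustrated with a toy calculation, assuming two focal intervals with probability masses per input variable and a cheap analytical response standing in for each deterministic finite element run; all intervals, masses, and the response function below are invented for illustration.

```python
# Toy random set combination: every combination of focal intervals would be
# one deterministic (FE) calculation; corner evaluations give the bounds that
# feed the belief and plausibility functions of the system response.
from itertools import product

# variable -> list of (interval, basic probability assignment)
focal = {
    "E_soil": [((20e3, 40e3), 0.6), ((30e3, 50e3), 0.4)],   # kPa, two sources
    "phi":    [((28.0, 34.0), 0.5), ((30.0, 36.0), 0.5)],   # deg, two sources
}

def response(E_soil, phi):
    # stand-in for the horizontal wall displacement from an FE run (mm)
    return 5e5 / E_soil + (40.0 - phi)

lower_bounds, upper_bounds = [], []
for combo in product(*focal.values()):
    mass = 1.0
    for _, m in combo:
        mass *= m
    corners = [response(*pt) for pt in product(*(interval for interval, _ in combo))]
    lower_bounds.append((min(corners), mass))   # feeds the plausibility bound
    upper_bounds.append((max(corners), mass))   # feeds the belief bound

threshold = 25.0   # threshold displacement (mm)
belief = sum(m for hi, m in upper_bounds if hi <= threshold)
plaus = sum(m for lo, m in lower_bounds if lo <= threshold)
print(f"Belief       P(d <= {threshold} mm) >= {belief:.2f}")
print(f"Plausibility P(d <= {threshold} mm) <= {plaus:.2f}")
```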

Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty

Procedia PDF Downloads 243
2622 A Comprehensive Analysis of the Phylogenetic Signal in Ramp Sequences in 211 Vertebrates

Authors: Lauren M. McKinnon, Justin B. Miller, Michael F. Whiting, John S. K. Kauwe, Perry G. Ridge

Abstract:

Background: Ramp sequences increase translational speed and accuracy when rare, slowly-translated codons are found at the beginnings of genes. Here, the results of the first analysis of ramp sequences in a phylogenetic construct are presented. Methods: Ramp sequences were compared from 211 vertebrates (110 Mammalian and 101 non-mammalian). The presence and absence of ramp sequences were analyzed as a binary character in a parsimony and maximum likelihood framework. Additionally, ramp sequences were mapped to the Open Tree of Life taxonomy to determine the number of parallelisms and reversals that occurred, and these results were compared to what would be expected due to random chance. Lastly, aligned nucleotides in ramp sequences were compared to the rest of the sequence in order to examine possible differences in phylogenetic signal between these regions of the gene. Results: Parsimony and maximum likelihood analyses of the presence/absence of ramp sequences recovered phylogenies that are highly congruent with established phylogenies. Additionally, the retention index of ramp sequences is significantly higher than would be expected due to random chance (p-value = 0). A chi-square analysis of completely orthologous ramp sequences resulted in a p-value of approximately zero as compared to random chance. Discussion: Ramp sequences recover comparable phylogenies as other phylogenomic methods. Although not all ramp sequences appear to have a phylogenetic signal, more ramp sequences track speciation than expected by random chance. Therefore, ramp sequences may be used in conjunction with other phylogenomic approaches.
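The chi-square comparison against random chance mentioned in the Results could be set up as below; the counts are made-up placeholders, since the study's actual tallies of conserved clades, parallelisms, and reversals are not reproduced here.

```python
# Illustrative chi-square test of observed character-change counts for a ramp
# presence/absence character against counts expected under random shuffling
# of tip states (e.g., means over many shuffles). All counts are invented.
import numpy as np
from scipy.stats import chisquare

observed = np.array([34, 9, 7])     # conserved clades, parallelisms, reversals
expected = np.array([18, 20, 12])   # mean counts from random shuffles (assumed)

expected = expected / expected.sum() * observed.sum()   # match totals for the test
stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.3g}")
```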

Keywords: codon usage bias, phylogenetics, phylogenomics, ramp sequence

Procedia PDF Downloads 129
2621 Simulations of Laminar Liquid Flows through Superhydrophobic Micro-Pipes

Authors: Mohamed E. Eleshaky

Abstract:

This paper investigates the dynamic behavior of laminar water flows inside superhydrophobic micro-pipes patterned with square micro-post features under different operating conditions. It also investigates the effects of air fraction and Reynolds number on the frictional performance of these pipes. Rather than modeling the air-water interfaces of the superhydrophobic surface as flat, inflexible surfaces, a transient, incompressible, three-dimensional, volume-of-fluid (VOF) methodology has been employed to continuously track the air-water interface shape inside the micro-pipes. Also, the entrance effects on the flow field have been taken into consideration. The results revealed the strong dependency of the frictional performance on the air fraction and Reynolds number. The frictional resistance reduction becomes increasingly more significant at large air fractions and low Reynolds numbers. Increasing the Reynolds number has an adverse effect on the frictional resistance reduction.

Keywords: drag reduction, laminar flow in micropipes, numerical simulation, superhydrophobic surfaces, microposts

Procedia PDF Downloads 301
2620 Geo-Additive Modeling of Family Size in Nigeria

Authors: Oluwayemisi O. Alaba, John O. Olaomi

Abstract:

The 2013 Nigerian Demographic Health Survey (NDHS) data were used to investigate the determinants of family size in Nigeria using a geo-additive model. The fixed effects of categorical covariates were modelled using diffuse priors, P-splines with second-order random walk priors were used for the nonlinear effects of continuous variables, spatial effects followed Markov random field priors, and exchangeable normal priors were used for the random effects of community and household. The negative binomial distribution was used to handle overdispersion of the dependent variable. Inference was fully Bayesian. Results showed a declining effect on family size of secondary and higher education of the mother, Yoruba tribe, Christianity, family planning, giving birth by caesarean section, and having a partner with secondary education. Big family size is positively associated with age at first birth, number of daughters in a household, being gainfully employed, being married and living with a partner, and the community and household effects.
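A minimal sketch of only the count-data core of such a model: a negative binomial regression of family size on a few covariates. The full geo-additive specification (P-splines, Markov random field spatial effects, fully Bayesian inference) is not reproduced; the data and variable names below are synthetic stand-ins.

```python
# Negative binomial regression sketch for overdispersed family-size counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age_first_birth": rng.integers(15, 35, n),
    "secondary_educ":  rng.integers(0, 2, n),
    "urban":           rng.integers(0, 2, n),
})
mu = np.exp(1.2 + 0.03 * df["age_first_birth"] - 0.25 * df["secondary_educ"])
df["family_size"] = rng.poisson(mu * rng.gamma(2.0, 0.5, n))  # overdispersed counts

X = sm.add_constant(df[["age_first_birth", "secondary_educ", "urban"]])
model = sm.GLM(df["family_size"], X, family=sm.families.NegativeBinomial(alpha=0.5))
print(model.fit().summary())
```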

Keywords: Bayesian analysis, family size, geo-additive model, negative binomial

Procedia PDF Downloads 512
2619 Regularized Euler Equations for Incompressible Two-Phase Flow Simulations

Authors: Teng Li, Kamran Mohseni

Abstract:

This paper presents an inviscid regularization technique for incompressible two-phase flow simulations. The technique is known as the observable method, based on the notion of observability: any feature smaller than the actual resolution (physical or numerical), i.e., the size of the wire in hotwire anemometry or the grid size in numerical simulations, cannot be captured or observed. Unlike most regularization techniques, which are applied to the numerical discretization, the observable method is employed at the PDE level during the derivation of the equations. Difficulties in the simulation and analysis of realistic fluid flow often result from discontinuities (or near-discontinuities) in the calculated fluid properties or state. Accurately capturing these discontinuities is especially crucial when simulating flows involving shocks, turbulence, or sharp interfaces. Over the past several years, the properties of this regularization technique have been investigated and show the capability of simultaneously regularizing shocks and turbulence. The observable method has been applied to direct numerical simulations of shocks and turbulence, where the discontinuities are successfully regularized and flow features are well captured. In the current paper, the observable method is extended to two-phase interfacial flows. Multiphase flows share a similar feature with shocks and turbulence, namely the nonlinear irregularity caused by the nonlinear terms in the governing equations, here the Euler equations. In direct numerical simulations of two-phase flows, the interfaces are usually treated as a smooth transition of the properties from one fluid phase to the other. However, in high Reynolds number or low viscosity flows, the nonlinear terms generate smaller scales which sharpen the interface, causing discontinuities. Many numerical methods for two-phase flows fail in the high Reynolds number case, while some others depend on the numerical diffusion from the spatial discretization. The observable method regularizes this nonlinear mechanism by filtering the convective terms, and this process is inviscid. The filtering effect is controlled by an observable scale, which is usually about a grid length. A single rising bubble and the Rayleigh-Taylor instability are studied, in particular, to examine the performance of the observable method. A pseudo-spectral method, which does not introduce numerical diffusion, is used for spatial discretization, and a Total Variation Diminishing (TVD) Runge-Kutta method is applied for time integration. The observable incompressible Euler equations are solved for these two problems. In the rising bubble problem, the terminal velocity and shape of the bubble are examined and compared with experiments and other numerical results. In the Rayleigh-Taylor instability, the shape of the interface is studied for different observable scales, and the spike and bubble velocities, as well as positions (under a proper observable scale), are compared with other simulation results. The results indicate that this regularization technique can potentially regularize the sharp interface in two-phase flow simulations.
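To illustrate only the core idea of filtering the convective term at an observable scale, here is a sketch for 1D inviscid Burgers with a Helmholtz (Leray-type) filtered advecting velocity and a pseudo-spectral discretization. This is a simplified stand-in under those assumptions, not the paper's two-phase observable Euler formulation.

```python
# Regularized 1D Burgers sketch: u_t = -(u_bar) u_x, where u_bar is u filtered
# at the observable scale alpha; spectral derivatives, classical RK4 in time.
import numpy as np

N, L, alpha, dt, steps = 256, 2 * np.pi, 0.1, 1e-3, 1000
x = np.linspace(0.0, L, N, endpoint=False)
k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi

def rhs(u):
    u_hat = np.fft.fft(u)
    u_filt = np.real(np.fft.ifft(u_hat / (1.0 + (alpha * k) ** 2)))  # filtered velocity
    u_x = np.real(np.fft.ifft(1j * k * u_hat))
    return -u_filt * u_x

u = np.sin(x)                     # steepening initial condition
for _ in range(steps):
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    u += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

print("max |u| after regularized evolution:", np.abs(u).max())
```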

Keywords: Euler equations, incompressible flow simulation, inviscid regularization technique, two-phase flow

Procedia PDF Downloads 472
2618 Test-Retest Agreement, Random Measurement Error and Practice Effect of the Continuous Performance Test-Identical Pairs for Patients with Schizophrenia

Authors: Kuan-Wei Chen, Chien-Wei Chen, Tai-Ling Chang, Nan-Cheng Chen, Ching-Lin Hsieh, Gong-Hong Lin

Abstract:

Background and Purposes: Deficits in sustained attention are common in patients with schizophrenia. Such impairment can limit patients' ability to effectively execute daily activities and affect the efficacy of rehabilitation. The aims of this study were to examine the test-retest agreement, random measurement error, and practice effect of the Continuous Performance Test-Identical Pairs (CPT-IP), a commonly used sustained attention test, in patients with schizophrenia. The results can provide empirical evidence for clinicians and researchers to apply a sustained attention test with sound psychometric properties in schizophrenia patients. Methods: We recruited patients with chronic schizophrenia to be assessed twice, with a 1-week interval, using the CPT-IP. The intra-class correlation coefficient (ICC) was used to examine the test-retest agreement. The percentage of minimal detectable change (MDC%) was used to examine the random measurement error. Moreover, the standardized response mean (SRM) was used to examine the practice effect. Results: A total of 56 patients participated in this study. Our results showed that the ICC was 0.82, MDC% was 47.4%, and the SRM was 0.36 for the CPT-IP. Conclusion: Our results indicate that the CPT-IP has acceptable test-retest agreement, substantial random measurement error, and a small practice effect in patients with schizophrenia. Therefore, to avoid overestimating patients' changes in sustained attention, we suggest that clinicians interpret the change scores of the CPT-IP conservatively in their routine repeated assessments.
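A sketch of how the three indices could be computed from two testing sessions, assuming the agreement-type ICC(2,1) (two-way random effects, absolute agreement); the scores below are simulated, not the study's data.

```python
# ICC(2,1), MDC%, and SRM from a subjects x sessions score matrix.
import numpy as np

rng = np.random.default_rng(2)
n = 56
true = rng.normal(1.5, 0.6, n)                      # latent CPT-IP scores
test1 = true + rng.normal(0, 0.25, n)
test2 = true + rng.normal(0, 0.25, n) + 0.08        # small practice effect

scores = np.column_stack([test1, test2])            # n subjects x k sessions
k = scores.shape[1]
grand = scores.mean()
ms_rows = k * np.var(scores.mean(axis=1), ddof=1)   # between-subjects mean square
ms_cols = n * np.var(scores.mean(axis=0), ddof=1)   # between-sessions mean square
ss_err = ((scores - grand) ** 2).sum() - ms_rows * (n - 1) - ms_cols * (k - 1)
ms_err = ss_err / ((n - 1) * (k - 1))

icc21 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                              + k * (ms_cols - ms_err) / n)
sem = scores.std(ddof=1) * np.sqrt(1.0 - icc21)     # standard error of measurement
mdc_pct = 100.0 * 1.96 * np.sqrt(2.0) * sem / grand
diff = test2 - test1
srm = diff.mean() / diff.std(ddof=1)                # standardized response mean

print(f"ICC(2,1) = {icc21:.2f}, MDC% = {mdc_pct:.1f}%, SRM = {srm:.2f}")
```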

Keywords: schizophrenia, sustained attention, CPT-IP, reliability

Procedia PDF Downloads 272
2617 Particle Jetting Induced by the Explosive Dispersal

Authors: Kun Xue, Lvlan Miu, Jiarui Li

Abstract:

Jetting structures are widely found in particle rings or shells dispersed by a central explosion. In contrast, some explosive dispersals of particles only result in a dispersed cloud without distinctive structures. Employing a method coupling compressible computational fluid dynamics and the discrete element method (CCFD-DEM), we reveal the underlying physics governing the formation of the jetting structure, which is related to the competition between shock compaction and gas infiltration, the two major processes during the shock interaction with the granular media. If the shock compaction exceeds the gas infiltration, discernable jetting structures are expected, precipitated by agglomerates of fast-moving particles induced by the heterogeneous network of force chains. Otherwise, particles are uniformly accelerated by the interstitial flows, and no distinguishable jetting structures are formed. We proceed to devise a phase map of the jetting formation in the space defined by two dimensionless parameters which characterize the timescales of the shock compaction and the gas infiltration, respectively.

Keywords: compressible multiphase flows, DEM, granular jetting, pattern formation

Procedia PDF Downloads 46
2616 Studying Roughness Effects on Flow Regimes in Offshore Pipelines

Authors: Mohammad Sadegh Narges, Zahra Ghadampour

Abstract:

Due to their specific operating conditions, offshore pipelines are given careful consideration and care in both design and operation. Most offshore pipeline flows are multi-phase. Multi-phase flows form different patterns or flow regimes (in simultaneous gas-liquid flow, flow regimes like slug flow, wave and …) under different circumstances. One of the factors influencing the flow regime is the pipeline roughness value. So far, the influence of roughness value and the sensitivity of the present models to this parameter have not been taken into consideration. Therefore, the influence of roughness value on the flow regimes in offshore pipelines is discussed in this paper. Results showed that geometry, absolute pipeline roughness value (the material the pipeline is made of) and the flow phases prevailing in the system are influential parameters on the flow regimes prevailing in multi-phase pipelines, in such a way that a change in any of these parameters results in a change in flow regimes in all or part of the pipeline system.

Keywords: absolute roughness, flow regime, multi-phase flow, offshore pipelines

Procedia PDF Downloads 338
2615 Deterministic Random Number Generator Algorithm for Cryptosystem Keys

Authors: Adi A. Maaita, Hamza A. A. Al Sewadi

Abstract:

One of the crucial parameters of digital cryptographic systems is the selection of the keys used and their distribution. The randomness of the keys has a strong impact on the system's security strength, as they should be difficult to predict, guess, reproduce, or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudorandom number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating information publicly agreed upon between sender and receiver over a public channel. This information is used as a seed for performing some mathematical functions in order to generate a sequence of pseudorandom numbers that will be used for encryption/decryption purposes. The manipulation involves permutations and substitutions that fulfill Shannon's principle of 'confusion and diffusion'. ASCII code characters were utilized in the generation process instead of using bit strings initially, which adds more flexibility in testing different seed values. Finally, the obtained results indicate that it is suitably difficult for attackers to guess the keys.
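A toy illustration of the general approach described above, not the authors' algorithm and not a cryptographically secure generator: derive a seed from a publicly shared ASCII string, then apply simple substitution and permutation rounds to produce a pseudorandom byte stream.

```python
# Didactic sketch only: seeded keystream from shared ASCII information via
# substitution (XOR mixing) and permutation (rotation) of an internal state.
import hashlib

def keystream(shared_info: str, n_bytes: int) -> bytes:
    state = bytearray(hashlib.sha256(shared_info.encode("ascii")).digest())
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        # substitution: mix each state byte with a counter-dependent value
        state = bytearray(b ^ ((counter + i * 31) & 0xFF) for i, b in enumerate(state))
        # permutation: rotate the state by a data-dependent amount
        r = state[0] % len(state)
        state = state[r:] + state[:r]
        out += hashlib.sha256(bytes(state) + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(out[:n_bytes])

print(keystream("publicly agreed seed value", 32).hex())
```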

Keywords: cryptosystems, information security agreement, key distribution, random numbers

Procedia PDF Downloads 238
2614 Computational Fluid Dynamics Simulations and Analysis of Air Bubble Rising in a Column of Liquid

Authors: Baha-Aldeen S. Algmati, Ahmed R. Ballil

Abstract:

Multiphase flows occur widely in many engineering and industrial processes as well as in the environment we live in. In particular, bubbly flows are considered to be crucial phenomena in fluid flow applications and can be studied and analyzed experimentally, analytically, and computationally. In the present paper, the dynamic motion of an air bubble rising within a column of liquid is numerically simulated using the open-source CFD modeling tool OpenFOAM. An interface-tracking numerical algorithm called the MULES algorithm, which is built into OpenFOAM, is chosen to solve an appropriate mathematical model based on the volume of fluid (VOF) numerical method. The bubbles initially have a spherical shape and start from rest in the stagnant column of liquid. The algorithm is first verified against numerical results and is also validated against available experimental data. The comparison revealed that this algorithm provides results that are in very good agreement with the 2D numerical data of other CFD codes. Also, the results of the bubble shape and terminal velocity obtained from the 3D numerical simulation showed very good qualitative and quantitative agreement with the experimental data. The simulated rising bubbles yield a very small percentage of error in the bubble terminal velocity compared with the experimental data. The obtained results prove the capability of OpenFOAM as a powerful tool to predict the behavior of the rising characteristics of spherical bubbles in a stagnant column of liquid. This will pave the way for a deeper understanding of the phenomenon of the rise of bubbles in liquids.

Keywords: CFD simulations, multiphase flows, OpenFOAM, rise of bubble, volume of fluid method, VOF

Procedia PDF Downloads 97
2613 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing

Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh

Abstract:

Due to a lack of continual assessment or grade related data, identifying first-year engineering students in a polytechnic education at risk of failing is challenging. Our experience over the years tells us that there is no strong correlation between having good entry grades in Mathematics and the Sciences and excelling in hardcore engineering subjects. Hence, identifying students at risk of failure cannot be on the basis of entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model in the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data of student psychological profiles such as interests and study habits, were used. Three classification techniques, namely Logistic Regression, K Nearest Neighbour, and Random Forest, were used in our predictive model. Based on our findings, Random Forest was determined to be the strongest predictor with an Area Under the Curve (AUC) value of 0.994. Correspondingly, the Accuracy, Precision, Recall, and F-Score were also highest among these three classifiers. Using this Random Forest Classification technique, students at risk of failure could be identified at the end of term one. They could then be assigned to a Learning Support Programme at the beginning of term two. This paper gathers the results of our findings. It also proposes further improvements that can be made to the model.
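A sketch of the classifier comparison described above (Logistic Regression, k-Nearest Neighbour, Random Forest scored by AUC, accuracy, precision, recall, and F-score), using synthetic stand-in data since the study's continual-assessment and psychological-profile features are not available here.

```python
# Compare three classifiers for an imbalanced "at risk of failing" problem.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=800, n_features=12, weights=[0.8, 0.2],
                           random_state=0)         # minority class = "at risk"
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "LogReg": LogisticRegression(max_iter=1000),
    "kNN": KNeighborsClassifier(n_neighbors=7),
    "RandomForest": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]
    pred = model.predict(X_te)
    print(f"{name:12s} AUC={roc_auc_score(y_te, prob):.3f} "
          f"Acc={accuracy_score(y_te, pred):.3f} "
          f"P={precision_score(y_te, pred):.3f} "
          f"R={recall_score(y_te, pred):.3f} "
          f"F1={f1_score(y_te, pred):.3f}")
```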

Keywords: continual assessment, predictive analytics, random forest, student psychological profile

Procedia PDF Downloads 102
2612 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods

Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun

Abstract:

Traditionally, the three important manufacturing functions of process planning, scheduling, and due-date assignment are performed separately and sequentially. For a couple of decades, hundreds of studies have been done on integrated process planning and scheduling problems, and numerous works have been performed on scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three important functions is studied by using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid, and random search techniques. In addition, the importance of the integration of these three functions and the power of meta-heuristics and of hybrid heuristics are studied.
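A compact simulated annealing sketch for a toy version of the integrated problem: jobs are sequenced while due dates are assigned as a multiple of (number of operations plus processing time), and weighted earliness/tardiness plus a due-date length cost is minimized. The cost function, parameters, and neighbourhood move are illustrative, not the paper's exact formulation.

```python
# Simulated annealing over job permutations with rule-based due-date assignment.
import math
import random

random.seed(0)
jobs = [{"p": random.randint(2, 9), "ops": random.randint(1, 4),
         "w": random.randint(1, 3)} for _ in range(12)]
K = 2.0                                           # due-date tightness multiplier

def cost(seq):
    t, total = 0, 0.0
    for j in (jobs[i] for i in seq):
        t += j["p"]
        due = K * (j["ops"] + j["p"])             # assigned due date
        total += j["w"] * abs(t - due) + 0.1 * due
    return total

current = list(range(len(jobs)))
cur_cost = best_cost = cost(current)
best, T = current[:], 50.0
for _ in range(5000):
    a, b = random.sample(range(len(jobs)), 2)
    cand = current[:]
    cand[a], cand[b] = cand[b], cand[a]           # swap-neighbourhood move
    c = cost(cand)
    if c < cur_cost or random.random() < math.exp((cur_cost - c) / T):
        current, cur_cost = cand, c
        if c < best_cost:
            best, best_cost = cand[:], c
    T *= 0.999                                    # geometric cooling
print("best sequence:", best, "cost:", round(best_cost, 1))
```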

Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics

Procedia PDF Downloads 452
2611 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry

Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood

Abstract:

The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment, for a simple geometry. This type of fluid behaviour takes place in many practical engineering applications, hence the reason for being investigated. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry or hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to such flows is investigated, at various Reynolds numbers corresponding to different flow regimes. The use of this measuring technique in separated flows is very difficult to find in the literature. Besides, most of the situations where the Reynolds number effect is evaluated in separated flows are in numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. The ADV Nortek Vectrino+ was used to characterize the flow, in a recirculating laboratory flume, at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared to those obtained using other measuring techniques. To compare results with other researchers, the step height, expansion ratio and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density of the stream-wise horizontal velocity component. The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness. Besides, the errors obtained in the uncertainty analysis were relatively low in general. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and a good agreement was found. The ADV technique proved to be able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flows. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels and thus decrease the uncertainty.
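A sketch of two of the post-processing steps mentioned above: checking the noise level via the power spectral density (Welch estimate) of the stream-wise velocity, and a moving block bootstrap for the uncertainty of the mean velocity. The velocity record is synthetic, and the despiking/filtering of raw ADV data is omitted.

```python
# Welch PSD for a noise-level check and moving block bootstrap for the mean.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
fs = 200.0                                       # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
u = 0.25 + 0.03 * np.sin(2 * np.pi * 1.5 * t) + rng.normal(0, 0.01, t.size)

f, Puu = welch(u - u.mean(), fs=fs, nperseg=2048)
noise_floor = Puu[f > 50].mean()                 # flat high-frequency level

def moving_block_bootstrap(x, block=400, n_boot=1000):
    starts = np.arange(x.size - block)
    means = np.empty(n_boot)
    for i in range(n_boot):
        picks = rng.choice(starts, size=x.size // block, replace=True)
        means[i] = np.concatenate([x[s:s + block] for s in picks]).mean()
    return means.std(ddof=1)

print(f"mean u = {u.mean():.4f} m/s +/- {moving_block_bootstrap(u):.4f} (MBB)")
print(f"approx. noise floor = {noise_floor:.2e} (m/s)^2/Hz")
```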

Keywords: ADV, experimental data, multiple Reynolds number, post-processing

Procedia PDF Downloads 109
2610 An Adaptive Conversational AI Approach for Self-Learning

Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo

Abstract:

In recent years, the focus of Natural Language Processing (NLP) development has been gradually shifting from the semantics-based approach to a deep learning one, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, due to the lack of semantic understanding, has difficulties in noticing and expressing a novel business case with a pre-defined scope. In order to meet the requirements of specific robotic services, the deep learning approach is very labor-intensive and time consuming. It is very difficult to improve the capabilities of conversational AI in a short time, and it is even more difficult to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines both semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic and fast-training manner with self-configured programs and constructed dialog flows. For every cycle in which a chat bot (conversational AI) delivers a given set of business cases, it self-measures its performance and rethinks every unknown dialog flow to improve the service by retraining with those new business cases. If the training process reaches a bottleneck and incurs some difficulties, human personnel will be informed for further instructions. He or she may retrain the chat bot with newly configured programs, or new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establish the necessary ontology for the new service. With the newly learned programs, it completes the understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We have developed a chat bot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture. The chat bot serves as a concierge with polite conversation for visitors. As a proof of concept, we have demonstrated completion of 90% of reception services with limited self-learning capability.

Keywords: conversational AI, chatbot, dialog management, semantic analysis

Procedia PDF Downloads 104
2609 Waste Heat Recovery Using Spiral Heat Exchanger

Authors: Parthiban S. R.

Abstract:

Spiral heat exchangers are known as excellent heat exchangers because of their compactness and high heat transfer efficiency. An innovative spiral heat exchanger based on polymer materials is designed for a waste heat recovery process. Such a design based on polymer film technology provides better corrosion and chemical resistance compared to conventional metal heat exchangers. Due to the smooth surface of the polymer film, fouling is reduced. A new arrangement for the flow of hot flue gas and cold fluid is employed in the design: the flue gas flows in an axial path while the cold fluid flows in a spiral path. The heat load recovery achieved with the presented heat exchanger is in the range of 1.5 kW thermal, but a potential heat recovery of about 3.5 kW might be achievable. To measure the performance of the spiral tube heat exchanger, its model is suitably designed and fabricated so as to perform experimental tests. The paper gives an analysis of the spiral tube heat exchanger.
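A back-of-the-envelope check of the quoted heat duty from the standard sensible-heat balance Q = m_dot * cp * dT, assuming a small flue gas mass flow and a moderate temperature drop; the numbers are assumptions for illustration, not measured values from the study.

```python
# Sensible heat duty estimate for the recovered flue-gas heat.
m_dot = 0.02        # kg/s of flue gas (assumed)
cp = 1.05e3         # J/(kg K), approximate specific heat of flue gas
dT = 70.0           # K temperature drop across the exchanger (assumed)

Q = m_dot * cp * dT
print(f"recovered heat duty ~ {Q/1e3:.2f} kW")   # same order as the reported 1.5 kW
```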

Keywords: spiral heat exchanger, polymer based materials, fouling factor, heat load

Procedia PDF Downloads 364
2608 Seismic Response Mitigation of Structures Using Base Isolation System Considering Uncertain Parameters

Authors: Rama Debbarma

Abstract:

The present study deals with the performance of a linear base isolation system to mitigate the seismic response of structures characterized by random system parameters. This involves optimization of the tuning ratio and damping properties of the base isolation system considering uncertain system parameters. The efficiency of a base isolator may reduce if it is not tuned to the vibrating mode it is designed to suppress, due to the unavoidable presence of system parameter uncertainty. With the aid of matrix perturbation theory and a first-order Taylor series expansion, the total probability concept is used to evaluate the unconditional response of the primary structures considering random system parameters. For this, the conditional second-order information of the response quantities is obtained in a random vibration framework using a state space formulation. Subsequently, the maximum unconditional root mean square displacement of the primary structures is used as the objective function to obtain the optimum damping parameters. A numerical study is performed to elucidate the effect of parameter uncertainties on the optimization of the linear base isolator parameters and the system performance.
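A heavily simplified stand-in for this kind of optimization: the isolated structure is idealized as a single degree of freedom under stationary white-noise base excitation, the uncertain isolation frequency is sampled so the unconditional RMS responses follow from total-probability averaging, and a grid search picks the isolation period and damping. The closed-form variances are the standard stationary white-noise results; this is not the paper's matrix-perturbation, state-space formulation, and all numbers are illustrative.

```python
# Grid search for isolator period/damping using unconditional RMS responses.
import numpy as np

rng = np.random.default_rng(4)
S0 = 0.02                       # white-noise base-excitation intensity (m^2/s^3)
d_max = 0.15                    # allowable RMS isolator displacement (m)
uncertainty = rng.normal(1.0, 0.05, 500)   # +/-5% uncertainty on isolation frequency

def unconditional_rms(omega_nom, zeta):
    omega = omega_nom * uncertainty        # sampled uncertain parameter
    rms_d = np.sqrt(np.pi * S0 / (2.0 * zeta * omega ** 3)).mean()
    rms_a = np.sqrt(np.pi * S0 * omega * (1.0 + 4.0 * zeta ** 2) / (2.0 * zeta)).mean()
    return rms_d, rms_a

best = None
for period in np.linspace(2.0, 4.0, 41):               # isolation period grid (s)
    for zeta in np.linspace(0.10, 0.30, 21):           # damping ratio grid
        rms_d, rms_a = unconditional_rms(2 * np.pi / period, zeta)
        if rms_d <= d_max and (best is None or rms_a < best[0]):
            best = (rms_a, period, zeta, rms_d)

print(f"best: T = {best[1]:.2f} s, zeta = {best[2]:.2f}, "
      f"RMS acc = {best[0]:.2f} m/s^2, RMS disp = {best[3]*100:.1f} cm")
```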

Keywords: linear base isolator, earthquake, optimization, uncertain parameters

Procedia PDF Downloads 399
2607 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Across the globe, there are a lot of chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach towards this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 13
2606 Performance Comparison of Cooperative Banks in the EU, USA and Canada

Authors: Matěj Kuc

Abstract:

This paper compares different profitability measures of cooperative banks from two developed regions: the European Union and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period. We ran a series of tests and a random effects estimation on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE). There is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks accommodated better to the current market environment.
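A sketch of a random effects panel estimation for a profitability measure (here ROA) with a region dummy, assuming the `linearmodels` package and a synthetic bank-year panel; variable names and coefficients are illustrative.

```python
# Random effects estimation on a synthetic balanced bank-year panel.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from linearmodels.panel import RandomEffects

rng = np.random.default_rng(5)
rows = []
for bank in range(50):
    north_america = int(bank < 20)
    bank_effect = rng.normal(0, 0.2)               # unobserved bank heterogeneity
    for year in range(2011, 2017):
        size = rng.normal(8, 1)
        rows.append({"bank": bank, "year": year,
                     "north_america": north_america, "size": size,
                     "roa": 0.2 + 0.3 * north_america + 0.05 * size
                            + bank_effect + rng.normal(0, 0.1)})
df = pd.DataFrame(rows).set_index(["bank", "year"])

exog = sm.add_constant(df[["north_america", "size"]])
print(RandomEffects(df["roa"], exog).fit())
```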

Keywords: cooperative banking, panel data, profitability measures, random effects

Procedia PDF Downloads 94
2605 A Sequential Approach for Random-Effects Meta-Analysis

Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya

Abstract:

The objective of meta-analysis is to combine results from several independent studies in order to create a generalization and provide an evidence base for decision making. However, recent studies show that the magnitude of effect size estimates reported in many areas of research changed with publication year, and this can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring the effect size estimates in meta-analysis. However, they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, and its estimation creates complications. This paper proposes the use of the Gombay and Serbian (2005) truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring of the REM. Simulation results show that the test does not control the Type I error well and is not recommended. Further work is required to derive an appropriate test in this important area of application.
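A sketch of cumulative random-effects meta-analysis with a simple CUSUM-type monitoring statistic on the standardized contribution of each newly added study. The DerSimonian-Laird estimator is the standard one; the monitoring statistic here is only an illustration, not the truncated CUSUM test with its asymptotic critical values studied in the paper.

```python
# Cumulative DerSimonian-Laird random-effects estimates with a toy CUSUM monitor.
import numpy as np

rng = np.random.default_rng(6)
k = 25
true_effects = rng.normal(0.3, 0.15, k)            # heterogeneous true effects
v = rng.uniform(0.01, 0.05, k)                     # within-study variances
y = true_effects + rng.normal(0, np.sqrt(v))       # observed study effects

def dersimonian_laird(y, v):
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)         # DL heterogeneity estimate
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)
    return mu, np.sqrt(1.0 / np.sum(w_star)), tau2

cusum = 0.0
for i in range(3, k + 1):                           # add studies sequentially
    mu, se, tau2 = dersimonian_laird(y[:i], v[:i])
    z = (y[i - 1] - mu) / np.sqrt(v[i - 1] + tau2)  # standardized newest study
    cusum = max(0.0, cusum + z - 0.5)               # one-sided CUSUM with drift
    print(f"k={i:2d}  mu={mu:+.3f}  se={se:.3f}  tau2={tau2:.3f}  CUSUM={cusum:.2f}")
```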

Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes

Procedia PDF Downloads 438
2604 Wall Pressure Fluctuations in Naturally Developing Boundary Layer Flows on Axisymmetric Bodies

Authors: Chinsuk Hong

Abstract:

This paper experimentally investigates the characteristics of wall pressure fluctuations in naturally developing boundary layer flows on axisymmetric bodies. The axisymmetric body has a modified ellipsoidal blunt nose. Flush-mounted microphones are used to measure the wall pressure fluctuations in the boundary layer flow over the body. The measurements are performed in a low-noise wind tunnel. It is found that the correlation between the flow regime and the characteristics of the pressure fluctuations is distinct. The process from small fluctuations in laminar flow to large fluctuations in turbulent flow is investigated. Tollmien-Schlichting waves (T-S waves) are found to be generated and to develop during transition. Because of the T-S waves, the wall pressure fluctuations in the transition region are higher than those in the turbulent boundary layer.

Keywords: wall pressure fluctuation, boundary layer flow, transition, turbulent flow, axisymmetric body, flow noise

Procedia PDF Downloads 323
2603 The Superhydrophobic Surface Effect on Laminar Boundary Layer Flows

Authors: Chia-Yung Chou, Che-Chuan Cheng, Chin Chi Hsu, Chun-Hui Wu

Abstract:

This study investigates boundary layer flow over a superhydrophobic surface. The superhydrophobic surface is assembled into an observation channel for fluid experiments. The fluid in the channel is doped with flow visualization particles and is then pumped by a syringe pump and introduced into the observation channel through the pipeline. Under polarized light irradiation, the movement of the particles in the channel is captured by a high-speed camera, and the particle velocities are analyzed in MATLAB to determine the changes the surface causes in the velocity field of the boundary layer. This study found that the superhydrophobic surface can effectively increase the velocity near the wall, and the effect becomes stronger as the flow rate increases. The superhydrophobic surface also had a longer slip length compared with the plain surface. In the calculation of the drag coefficient, the superhydrophobic surface produces a lower drag coefficient, and the difference becomes more significant as Re decreases in the flow field.
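A sketch of how a slip length could be estimated from near-wall particle velocity data: fit the near-wall velocity profile linearly and extrapolate to where the velocity would vanish below the wall. The data here are synthetic stand-ins for the measured particle velocities.

```python
# Slip length from a linear fit of the near-wall velocity profile u(y).
import numpy as np

rng = np.random.default_rng(7)
shear_rate, b_true = 80.0, 25e-6                   # 1/s, 25 micron slip length
y = np.linspace(10e-6, 200e-6, 25)                 # wall-normal positions (m)
u = shear_rate * (y + b_true) + rng.normal(0, 2e-4, y.size)   # measured u (m/s)

slope, intercept = np.polyfit(y, u, 1)
slip_length = intercept / slope                    # depth below wall where u = 0
u_wall = intercept                                 # apparent slip velocity at y = 0

print(f"fitted slip length = {slip_length*1e6:.1f} um, "
      f"slip velocity at wall = {u_wall*1e3:.2f} mm/s")
```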

Keywords: hydrophobic, boundary layer, slip length, friction

Procedia PDF Downloads 119
2602 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation

Authors: Constantin Z. Leshan

Abstract:

Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics and allow teleportation of matter. All massive bodies emit a flux of holes which curve the spacetime; if we increase the concentration of holes, it leads to length contraction and time dilation because the holes do not have the properties of extension and duration. In the limiting case when space consists of holes only, the distance between every two points is equal to zero and time stops; outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties only. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle and so on. Particles do not have trajectories because spacetime is discontinuous and has impassable microscopic 'walls'; simple mechanical motion is impossible at small-scale distances, and it is impossible to 'trace' a straight line in the discontinuous spacetime because it contains the impassable holes. Spacetime 'boils' continually due to the appearance of the vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously in a random point of the Universe. Since a body disappears in one volume and reappears in another random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation). It is shown that Hole Teleportation does not violate causality and special relativity due to its random nature and other properties. Although Hole Teleportation has a random nature, it can be used for colonization of extrasolar planets with the help of the method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from the vessel without permitting another body to occupy this volume.

Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps

Procedia PDF Downloads 394
2601 Application of Random Forest Model in The Prediction of River Water Quality

Authors: Turuganti Venkateswarlu, Jagadeesh Anmala

Abstract:

Excessive runoffs from various non-point source land uses and other point sources are rapidly contaminating the water quality of streams in the Upper Green River watershed, Kentucky, USA. It is essential to maintain the stream water quality as the river basin is one of the major freshwater sources in this region. It is also important to understand the water quality parameters (WQPs) quantitatively and qualitatively, along with their important features, as stream water is sensitive to climatic events and land-use practices. In this paper, a model was developed for predicting one of the significant WQPs, fecal coliform (FC), from precipitation, temperature, urban land use factor (ULUF), agricultural land use factor (ALUF), and forest land-use factor (FLUF) using the Random Forest (RF) algorithm. The RF model, an ensemble learning algorithm, can also extract feature importance characteristics from the given model inputs for different combinations. The model's outcomes showed a good correlation between FC and climate events and land use factors (R2 = 0.94), and that precipitation and temperature are the primary influencing factors for FC.
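A sketch of the regression setup described above: a Random Forest predicting (log) fecal coliform from precipitation, temperature, and the three land-use factors, with feature importances reported. The data are synthetic stand-ins for the watershed observations, and the coefficients used to generate them are invented.

```python
# Random Forest regression for fecal coliform with feature importances.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n = 400
precip = rng.gamma(2.0, 10.0, n)          # mm
temp = rng.normal(15.0, 8.0, n)           # deg C
uluf, aluf, fluf = rng.random(n), rng.random(n), rng.random(n)
log_fc = (1.0 + 0.04 * precip + 0.05 * temp
          + 1.5 * uluf + 0.8 * aluf - 0.6 * fluf + rng.normal(0, 0.3, n))

X = np.column_stack([precip, temp, uluf, aluf, fluf])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_fc, random_state=0)
rf = RandomForestRegressor(n_estimators=400, random_state=0).fit(X_tr, y_tr)

print("R2 on held-out data:", round(r2_score(y_te, rf.predict(X_te)), 2))
for name, imp in zip(["precip", "temp", "ULUF", "ALUF", "FLUF"],
                     rf.feature_importances_):
    print(f"  {name:6s} importance = {imp:.2f}")
```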

Keywords: water quality, land use factors, random forest, fecal coliform

Procedia PDF Downloads 168
2600 Secure Watermarking not at the Cost of Low Robustness

Authors: Jian Cao

Abstract:

This paper describes a novel watermarking technique which we call random direction embedding (RDE) watermarking. Unlike traditional watermarking techniques, the watermark energy after RDE embedding does not concentrate along a fixed direction, leading to security against the traditional unauthorized watermark removal attack. In addition, the experimental results show that, when compared with the existing secure watermarking scheme known as natural watermarking (NW), RDE watermarking gains a significant improvement in terms of robustness. In fact, the security of RDE watermarking does not come at the cost of low robustness, and it can even be more robust than traditional spread spectrum watermarking, which has been shown to be very insecure.
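A minimal numerical illustration of the contrast drawn above, not the paper's RDE construction: classical spread spectrum reuses one fixed carrier for every host, so the watermark energy accumulates along that direction, whereas embedding along a keyed, per-work random direction avoids a single estimable direction. Detection in both cases is by correlation with the known embedding direction; all signals and parameters are synthetic.

```python
# Fixed-carrier spread spectrum vs. per-work random-direction embedding.
import numpy as np

rng = np.random.default_rng(9)
n, strength = 1024, 5.0
host = rng.normal(0, 1, n)                          # host signal coefficients

# spread spectrum: one fixed secret carrier reused for every host
carrier = rng.normal(0, 1, n)
carrier /= np.linalg.norm(carrier)
ss_marked = host + strength * carrier

# random-direction embedding (sketch): a fresh keyed direction per work
key_rng = np.random.default_rng(2025)               # secret per-work key/seed
direction = key_rng.normal(0, 1, n)
direction /= np.linalg.norm(direction)
rde_marked = host + strength * direction

print("SS  detector correlation :", round(float(ss_marked @ carrier), 2))
print("RDE detector correlation :", round(float(rde_marked @ direction), 2))
print("RDE-marked work vs. SS carrier (attacker's view):",
      round(float(rde_marked @ carrier), 2))
```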

Keywords: robustness, spread spectrum watermarking, watermarking security, random direction embedding (RDE)

Procedia PDF Downloads 360
2599 A Study of Non Linear Partial Differential Equation with Random Initial Condition

Authors: Ayaz Ahmad

Abstract:

In this work, we present the effect of noise on the solution of a partial differential equation (PDE) in three different settings. We first consider random initial conditions for two nonlinear dispersive PDEs, the nonlinear Schrodinger equation and the Korteweg-de Vries equation, and analyse their effect on some special solutions, the soliton solutions. The second case considers a linear partial differential equation, the wave equation, where random initial conditions allow us to substantially decrease the computational and data storage costs of an algorithm to solve the inverse problem based on boundary measurements of the solution of this equation. Finally, the third example considered is the linear transport equation with a singular drift term, where we show that the addition of a multiplicative noise term forbids the blow-up of solutions under a very weak hypothesis for which we have finite-time blow-up of a solution in the deterministic case. Here we consider the problem of wave propagation, which is modelled by a nonlinear dispersive equation with a noisy initial condition. As observed, noise can also be introduced directly in the equations.
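A sketch of the first setting only: a split-step Fourier solver for the focusing nonlinear Schrodinger equation i u_t + 0.5 u_xx + |u|^2 u = 0 with a soliton initial condition perturbed by small random noise. Parameters are illustrative, and this is not the paper's analysis.

```python
# Split-step Fourier evolution of a noisy sech soliton under focusing NLS.
import numpy as np

N, L, dt, steps = 512, 40.0, 0.005, 2000
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

rng = np.random.default_rng(10)
u = 1.0 / np.cosh(x) * (1.0 + 0.05 * rng.normal(size=N))   # noisy soliton

half_linear = np.exp(-0.5j * (k ** 2) * dt / 2.0)          # half linear step
for _ in range(steps):
    u = np.fft.ifft(half_linear * np.fft.fft(u))
    u = u * np.exp(1j * np.abs(u) ** 2 * dt)                # full nonlinear step
    u = np.fft.ifft(half_linear * np.fft.fft(u))

print("L2 mass after evolution:", float(np.sum(np.abs(u) ** 2) * (L / N)))
```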

Keywords: drift term, finite time blow up, inverse problem, soliton solution

Procedia PDF Downloads 185
2596 Investigation of the Effect of External Flow on Exhaust Gas Flow in a Heavy Commercial Vehicle with CFD

Authors: F. Kantaş, D. Boyacı, C. Dinç

Abstract:

Exhaust systems play an important role in thermal heat management. The exhaust manifold collects burned gas from the engine, and the exhaust pipes transmit the exhaust gas to the muffler, where the gas reacts chemically to reduce noxious species and sound before being discharged through the tail pipe. The hot exhaust gas leaving the tail pipe flows over many parts located around the tail pipe and muffler, like the spare tire, transmission, pipes, etc. These parts are heated by the hot exhaust gas. Also, when the vehicle is moving, the external flow affects the exhaust gas flow and changes its behavior. It is not possible in tests to determine which parts are heated by the hot exhaust gas. To understand this phenomenon, the exhaust gas flow is solved in CFD, and the external flow due to vehicle movement must also be solved together with the exhaust gas flow, because the external flow affects the exhaust gas flow behavior through many parameters. This paper investigates how the external flow and other critical parameters, like different tail pipe designs and the exhaust gas mass flow in critical vehicle driving situations, affect the exhaust gas flow behavior.

Keywords: exhaust, gas flow, vehicle, external flow

Procedia PDF Downloads 412
2597 Polymer Spiral Film Gas-Liquid Heat Exchanger for Waste Heat Recovery in Exhaust Gases

Authors: S. R. Parthiban, C. Elajchet Senni

Abstract:

Spiral heat exchangers are known as excellent heat exchangers because of their compactness and high heat transfer efficiency. An innovative spiral heat exchanger based on polymer materials is designed for a waste heat recovery process. Such a design based on polymer film technology provides better corrosion and chemical resistance compared to conventional metal heat exchangers. Due to the smooth surface of the polymer film, fouling is reduced. A new arrangement for the flow of hot flue gas and cold fluid is employed in the design: the flue gas flows in an axial path while the cold fluid flows in a spiral path. The heat load recovery achieved with the presented heat exchanger is in the range of 1.5 kW thermal, but a potential heat recovery of about 3.5 kW might be achievable. To measure the performance of the spiral tube heat exchanger, its model is suitably designed and fabricated so as to perform experimental tests. The paper gives an analysis of the spiral tube heat exchanger.

Keywords: spiral heat exchanger, polymer based materials, fouling factor, heat load

Procedia PDF Downloads 349
2596 Efficient Signcryption Scheme with Provable Security for Smart Card

Authors: Jayaprakash Kar, Daniyal M. Alghazzawi

Abstract:

The article proposes a novel construction of a signcryption scheme with provable security which is well suited to implementation on a smart card. It is secure in the random oracle model, and the security relies on the Decisional Bilinear Diffie-Hellman Problem. The proposed scheme is secure against adaptive chosen ciphertext attack (indistinguishability) and adaptive chosen message attack (unforgeability). Also, it is inspired by zero-knowledge proofs. The two most important security goals for smart cards are confidentiality and authenticity. These functions are performed in one logical step at low computational cost.

Keywords: random oracle, provable security, unforgeability, smart card

Procedia PDF Downloads 569