Search results for: computational accuracy

2011 Fault-Tolerant Predictive Control for Polytopic LPV Systems Subject to Sensor Faults

Authors: Sofiane Bououden, Ilyes Boulkaibet

Abstract:

In this paper, a robust fault-tolerant predictive control (FTPC) strategy is proposed for systems with linear parameter varying (LPV) models and input constraints subject to sensor faults. Generally, virtual observers are used to improve observation precision and to reduce the impact of sensor faults and uncertainties in the system. However, this type of observer lacks certain system measurements, which substantially reduces its accuracy. To deal with this issue, a real observer is designed based on the virtual observer, and consequently a real observer-based robust predictive control is designed for polytopic LPV systems. Moreover, the proposed observer ensures that all system states and sensor faults are estimated. As a result, and based on both observers, a robust fault-tolerant predictive control is established via the Lyapunov method, where sufficient conditions for stability analysis and control purposes are proposed in the form of linear matrix inequalities (LMIs). Finally, simulation results are given to show the effectiveness of the proposed approach.
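
As an illustration of the LMI-based stability machinery the approach rests on, the sketch below poses a discrete-time Lyapunov condition as an LMI feasibility problem in CVXPY; the system matrix is a made-up example, not the paper's LPV model.

```python
# A minimal sketch (not the paper's exact LMI): discrete-time Lyapunov
# stability of x(k+1) = A x(k) posed as an LMI, solved with CVXPY.
import numpy as np
import cvxpy as cp

A = np.array([[0.9, 0.2],
              [0.0, 0.7]])   # hypothetical stable system matrix

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)

# Stability holds iff there exists P > 0 with A' P A - P < 0.
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible (system stable):", prob.status == cp.OPTIMAL)
```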

Keywords: linear parameter varying systems, fault-tolerant predictive control, observer-based control, sensor faults, input constraints, linear matrix inequalities

Procedia PDF Downloads 200
2010 Evaluation of the Accuracy of a ‘Two Question Screening Tool’ in the Detection of Intimate Partner Violence in a Primary Healthcare Setting in South Africa

Authors: A. Saimen, E. Armstrong, C. Manitshana

Abstract:

Intimate partner violence (IPV) has been recognised as a global human rights violation. It is universally underdiagnosed, and the institution of timeous multi-faceted interventions has been noted to benefit IPV victims. Currently, the concept of using a screening tool to detect IPV has not been widely explored in a primary healthcare setting in South Africa, and it was for this reason that this study was undertaken. A systematic random sampling of 1 in 8 women over a period of 3 months was conducted prospectively at the outpatient department of a Level 1 hospital. Participants were asked about their experience of IPV during the past 12 months. The WAST-short, a two-question tool, was used to screen patients for IPV. To verify the result of the screening, women were also asked the remaining questions from the WAST. Data were collected from 400 participants, with a response rate of 99.3%. The prevalence of IPV in the sample was 32%. The WAST-short was shown to have the following operating characteristics: sensitivity 45.2%, specificity 98%, positive predictive value 98%, negative predictive value 79%. The WAST-short lacks sufficient sensitivity and is therefore not an ideal screening tool for this setting. Improvement in the sensitivity of the WAST-short in this setting may be achieved by lowering the threshold for a positive result for IPV screening and by modifying the screening questions to better reflect IPV as understood by the local population.
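
For readers unfamiliar with the operating characteristics quoted above, the snippet below shows how they follow from a 2x2 screening table; the counts are illustrative, not the study's data.

```python
# A minimal sketch of how a screening tool's operating characteristics
# are computed from a 2x2 table; the counts below are illustrative.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positives among all IPV cases
        "specificity": tn / (tn + fp),   # true negatives among non-cases
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "prevalence": (tp + fn) / (tp + fp + fn + tn),
    }

# hypothetical counts for a sample of 397 women
print(screening_metrics(tp=57, fp=5, fn=70, tn=265))
```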

Keywords: domestic violence, intimate partner violence, screening, screening tools

Procedia PDF Downloads 305
2009 The Reproducibility and Repeatability of a Modified Likelihood Ratio for Forensic Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a Likelihood Ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defence hypotheses, for which a set of assumptions and methods for a given data set will be made. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR were presented so as not to obtain an incorrect estimate that could be used to deliver a wrong judgment in a court of law. The estimate of the LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE), in this paper. The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
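
A minimal sketch of the two LR estimators on synthetic similarity scores, with a bootstrap confidence interval of the kind the paper advocates (the data and score ranges are assumed, not the paper's dataset):

```python
# Estimating a likelihood ratio for a similarity score under same-writer
# vs different-writer hypotheses, via logistic regression and KDE.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
same = rng.normal(0.8, 0.10, 200)    # scores under the prosecution hypothesis
diff = rng.normal(0.5, 0.15, 200)    # scores under the defence hypothesis

def lr_kde(s, same, diff):
    return gaussian_kde(same)(s)[0] / gaussian_kde(diff)(s)[0]

# Logistic regression: P(same|s) / P(diff|s) equals the LR for equal priors.
X = np.concatenate([same, diff]).reshape(-1, 1)
y = np.concatenate([np.ones_like(same), np.zeros_like(diff)])
lor = LogisticRegression().fit(X, y)
p = lor.predict_proba([[0.7]])[0, 1]

print("LR (LoR):", p / (1 - p))
print("LR (KDE):", lr_kde(0.7, same, diff))

# Bootstrap 95% confidence interval for the KDE-based LR at s = 0.7.
boot = [lr_kde(0.7, rng.choice(same, same.size), rng.choice(diff, diff.size))
        for _ in range(200)]
print("95% CI:", np.percentile(boot, [2.5, 97.5]))
```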

Keywords: confidence interval, handwriting, kernel density estimator (KDE), logistic regression (LoR), repeatability, reproducibility

Procedia PDF Downloads 124
2008 Difference Between Planning Target Volume (PTV) Based Slow-CT and Internal Target Volume (ITV) Based 4DCT Imaging Techniques in Stereotactic Body Radiotherapy for Lung Cancer: A Comparative Study

Authors: Madhumita Sahu, S. S. Tiwary

Abstract:

Radiotherapy of lung carcinoma has always been difficult and a matter of great concern. The significant tumor movement caused by non-rhythmic respiratory motion poses a great challenge for the treatment of lung cancer using ionizing radiation. The present study compares the accuracy of Target Volume measurement using Slow-CT and 4DCT imaging in SBRT for lung tumors. The samples were drawn from patients with lung cancer who underwent SBRT. Slow-CT and 4DCT images were acquired under free breathing for each patient. PTVs were delineated on the Slow-CT images. Similarly, ITVs were delineated on each of the 4DCT volumes. Volumetric and statistical analyses were performed for each patient by measuring the corresponding PTV and ITV volumes. The study showed that (1) the maximum deviation observed between the Slow-CT-based PTV and the 4DCT-based ITV is 248.58 cc, (2) the minimum deviation is 5.22 cc, and (3) the mean deviation is 63.21 cc. The present study concludes that the irradiated volume (ITV) with 4DCT is smaller than the PTV with Slow-CT. A better and more precise treatment could be delivered with 4DCT imaging, sparing a mean volume of 63.21 cc.
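
The volumetric comparison reduces to simple per-patient statistics, as in the sketch below (the volumes shown are placeholders, not the study's measurements):

```python
# Per-patient deviation between Slow-CT PTV and 4DCT ITV, then max/min/mean.
import numpy as np

ptv = np.array([310.4, 120.9, 88.2, 402.1])   # Slow-CT PTV volumes (cc)
itv = np.array([240.1, 101.3, 83.0, 250.0])   # 4DCT ITV volumes (cc)

dev = ptv - itv
print(f"max deviation:  {dev.max():.2f} cc")
print(f"min deviation:  {dev.min():.2f} cc")
print(f"mean deviation: {dev.mean():.2f} cc")
```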

Keywords: CT imaging, 4DCT imaging, lung cancer, statistical analysis

Procedia PDF Downloads 24
2007 A Cross-Sectional Examination of Children’s Developing Understanding of the Rainbow

Authors: Michael Hast

Abstract:

Surprisingly little is known from a research perspective about children’s understanding of rainbows and rainbow formation, and how this understanding changes with increasing age. Yet this kind of research is useful when conceptualizing pedagogy, lesson plans, or more general curricula. The present study aims to rectify this shortcoming. In a cross-sectional approach, children of three different age groups (4-5, 7-8 and 10-11 years) were asked to draw pictures that included rainbows. The pictures will be evaluated according to their scientific representation of rainbows, such as the order of colors, as well as according to any non-scientific conceptions, such as solidity. In addition to the drawings, the children took part in small focus groups where they had to discuss various questions about rainbows and rainbow formation. Similar to the drawings, these conversations will be evaluated around the degree of scientific accuracy of the children’s explanations. Gaining a complete developmental picture of children’s understanding of the rainbow may have important implications for pedagogy in early science education. Many other concepts in science, while not explicitly linked to rainbows and rainbow formation, can benefit from the use of rainbows as illustrations – such as understanding light and color, or the use of prisms. Even in non-science domains, such as art and even storytelling, recognizing the differentiation between fact and myth in relation to rainbows could be of value. In addition, research has pointed out that teachers tend to overestimate the proportion of students’ correct answers, so clarifying the actual level of conceptual understanding is crucial in this respect.

Keywords: conceptual development, cross-sectional research, primary science education, rainbows

Procedia PDF Downloads 215
2006 A Neural Network Classifier for Estimation of the Degree of Infestation by Late Blight on Tomato Leaves

Authors: Gizelle K. Vianna, Gabriel V. Cunha, Gustavo S. Oliveira

Abstract:

Foliage diseases in plants can cause a reduction in both the quality and quantity of agricultural production. Intelligent detection of plant diseases is an essential research topic, as it may help monitor large fields of crops by automatically detecting the symptoms of foliage diseases. This work investigates ways to recognize the late blight disease from the analysis of digital images of tomatoes, collected directly from the field. A pair of multilayer perceptron neural networks analyzes the digital images, using data from both the RGB and HSL color models, and classifies each image pixel. One neural network is responsible for the identification of healthy regions of the tomato leaf, while the other identifies the injured regions. The outputs of both networks are combined to generate the final classification of each pixel, and the pixel classes are used to repaint the original tomato images using a color representation that highlights the injuries on the plant. The new images contain only green, red, or black pixels, according to whether they came from healthy portions of the leaf, injured portions, or the background of the image, respectively. The system presented an accuracy of 97% in detecting and estimating the level of damage on tomato leaves caused by late blight.
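
A minimal sketch of the two-network scheme, assuming scikit-learn MLPs and toy labelled pixels (the paper's exact topology and training data are not given):

```python
# Each pixel is described by RGB plus HSL values; one MLP flags healthy
# tissue, the other flags injured tissue, and the outputs are combined.
import colorsys
import numpy as np
from sklearn.neural_network import MLPClassifier

def pixel_features(rgb):
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return [r, g, b, h, s, l]

# hypothetical labelled pixels: 0 = background, 1 = healthy, 2 = injured
X = np.array([pixel_features(p) for p in
              [(20, 160, 30), (35, 180, 40), (150, 120, 40),
               (170, 130, 60), (90, 90, 90), (60, 60, 70)]])
y = np.array([1, 1, 2, 2, 0, 0])

healthy_net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=0).fit(X, y == 1)
injured_net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=0).fit(X, y == 2)

def classify(rgb):
    """Combine both networks into the repainting colors described above."""
    f = [pixel_features(rgb)]
    if injured_net.predict(f)[0]:
        return "red (injured)"
    if healthy_net.predict(f)[0]:
        return "green (healthy)"
    return "black (background)"

print(classify((25, 170, 35)))
```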

Keywords: artificial neural networks, digital image processing, pattern recognition, phytosanitary

Procedia PDF Downloads 327
2005 Numerical Heat Transfer Performance of Water-Based Graphene Nanoplatelets

Authors: Ahmad Amiri, Hamed K. Arzani, S. N. Kazi, B. T. Chew

Abstract:

Since graphene nanoplatelets (GNPs) are a promising material due to their desirable thermal properties, this paper investigates the thermophysical and heat transfer performance of a covalently functionalized GNP-based water/ethylene glycol nanofluid through an annular channel. After experimentally measuring the thermophysical properties of the prepared samples, a computational fluid dynamics study was carried out to examine the heat transfer and pressure drop of the well-dispersed and stabilized nanofluids. The effects of GNP concentration and Reynolds number on the convective heat transfer coefficient were investigated at a constant wall temperature boundary condition under a turbulent flow regime. Based on the results, for different Reynolds numbers, the convective heat transfer coefficient of the prepared nanofluid is higher than that of the base fluid. Also, both the convective heat transfer coefficient and the thermal conductivity increase with increasing GNP concentration in the base fluid. Based on the results of this investigation, there is a significant enhancement in the heat transfer rate associated with loading well-dispersed GNPs in the base fluid.
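
As a rough illustration of how a convective heat transfer coefficient responds to improved thermal conductivity, the sketch below uses the standard Dittus-Boelter correlation with assumed properties; this is a stand-in for intuition, not the paper's CFD model.

```python
# Nu = 0.023 Re^0.8 Pr^0.4 (turbulent, heating case), then h = Nu k / D_h.
def htc(re, pr, k, d_h):
    nu = 0.023 * re**0.8 * pr**0.4
    return nu * k / d_h

k_base, k_nano = 0.60, 0.66          # W/m.K (assumed; GNPs raise k)
print("base fluid h:", round(htc(1e4, 5.0, k_base, 0.02)), "W/m2.K")
print("nanofluid  h:", round(htc(1e4, 5.5, k_nano, 0.02)), "W/m2.K")
```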

Keywords: nanofluid, turbulent flow, forced convection flow, graphene, annular, annulus

Procedia PDF Downloads 356
2004 A Review on Higher-Order Spline Techniques for Solving Burgers Equation Using B-Spline Methods and Variation of B-Spline Techniques

Authors: Maryam Khazaei Pool, Lori Lewis

Abstract:

This is a summary of articles based on higher-order B-spline methods and variations of B-spline methods, such as the Quadratic B-spline Finite Elements Method, the Exponential Cubic B-Spline Method, the Septic B-spline Technique, the Quintic B-spline Galerkin Method, and B-spline Galerkin methods based on the Quadratic B-spline Galerkin method (QBGM) and the Cubic B-spline Galerkin method (CBGM). In this paper, we study B-spline methods and variations of B-spline techniques for finding a numerical solution to Burgers' equation. A set of fundamental definitions, including Burgers' equation, spline functions, and B-spline functions, is provided. For each method, the main technique is discussed, as well as the discretization and stability analysis. A summary of the numerical results is provided, and the efficiency of each method presented is discussed. A general conclusion is provided, where we compare the computational results of all the presented schemes and describe the effectiveness and advantages of these methods.
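
For reference, the two core definitions the reviewed methods build on can be written compactly:

```latex
% Burgers' equation with kinematic viscosity \nu:
\frac{\partial u}{\partial t} + u \frac{\partial u}{\partial x}
    = \nu \frac{\partial^{2} u}{\partial x^{2}}

% B-spline basis of degree k on knot sequence {t_i} (Cox-de Boor recursion):
N_{i,0}(x) = \begin{cases} 1, & t_i \le x < t_{i+1} \\ 0, & \text{otherwise} \end{cases}

N_{i,k}(x) = \frac{x - t_i}{t_{i+k} - t_i} N_{i,k-1}(x)
           + \frac{t_{i+k+1} - x}{t_{i+k+1} - t_{i+1}} N_{i+1,k-1}(x)
```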

Keywords: Burgers’ equation, Septic B-spline, modified cubic B-spline differential quadrature method, exponential cubic B-spline technique, B-spline Galerkin method, quintic B-spline Galerkin method

Procedia PDF Downloads 126
2003 Blind Channel Estimation for Frequency Hopping System Using Subspace Based Method

Authors: M. M. Qasaymeh, M. A. Khodeir

Abstract:

Subspace channel estimation methods have been studied widely. They depend on a subspace decomposition of the covariance matrix to separate the signal subspace from the noise subspace. The decomposition is normally done by either an Eigenvalue Decomposition (EVD) or a Singular Value Decomposition (SVD) of the Auto-Correlation Matrix (ACM). However, the subspace decomposition process is computationally expensive. In this paper, the multipath channel estimation problem for a Slow Frequency Hopping (SFH) system using a noise-subspace-based method is considered. An efficient method to estimate the multipath time delays is proposed, applying the MUltiple SIgnal Classification (MUSIC) algorithm to the null space extracted by the Rank-Revealing LU (RRLU) factorization. The RRLU provides accurate information about the rank and the numerical null space, which makes it a valuable tool in numerical linear algebra. The proposed novel method decreases the computational complexity to approximately half that of RRQR-based methods while keeping the same performance. Computer simulations are included to demonstrate the effectiveness of the proposed scheme.
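
A minimal sketch of noise-subspace delay estimation in the spirit of the method; for brevity the null space here comes from an SVD rather than the RRLU factorization the paper proposes, and the two-path channel is hypothetical.

```python
import numpy as np
from scipy.signal import find_peaks

N = 64                                    # frequency bins
freqs = np.arange(N)
true_delays = [3.0, 7.5]                  # hypothetical path delays (samples)

def steering(tau):
    return np.exp(-2j * np.pi * freqs * tau / N)

# Frequency-domain snapshots of a two-path channel plus noise.
rng = np.random.default_rng(1)
snapshots = np.stack([
    sum(rng.standard_normal() * steering(t) for t in true_delays)
    + 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    for _ in range(100)])

R = snapshots.conj().T @ snapshots / len(snapshots)   # auto-correlation matrix
_, _, vh = np.linalg.svd(R)
En = vh[len(true_delays):].conj().T                   # noise-subspace basis

# MUSIC pseudospectrum: peaks where the steering vector is orthogonal to En.
grid = np.linspace(0, 15, 1500)
spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t))**2
                     for t in grid])
peaks, _ = find_peaks(spectrum)
best = peaks[np.argsort(spectrum[peaks])[-2:]]
print("estimated delays (samples):", np.sort(grid[best]))
```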

Keywords: frequency hopping, channel model, time delay estimation, RRLU, RRQR, MUSIC, LS-ESPRIT

Procedia PDF Downloads 410
2002 Estimation of the Implicit Colebrook-White Equation by Preferable Explicit Approximations in Practical Turbulent Pipe Flow

Authors: Itissam Abuiziah

Abstract:

In several hydraulic systems, it is necessary to calculate the head losses, which depend on the friction factor in the Darcy equation. Computing the friction factor is based on the implicit Colebrook-White equation, which is considered the standard for friction calculation but carries a high computational cost; therefore, several explicit approximation methods are used to solve the implicit equation and overcome this issue. The relative error is then used to determine the most accurate method among the approximations considered. Steel, cast iron, and polyethylene pipe materials were investigated, with practical diameters ranging from 0.1 m to 2.5 m and velocities between 0.6 m/s and 3 m/s. In short, the results obtained show that a method suitable for some cases may not be accurate for others. For example, for steel pipes, the Zigrang and Sylvester method proved the most precise at low velocities (0.6 m/s to 1.3 m/s). Comparatively, the Haaland method showed a lower relative error as velocity gradually increased. Accordingly, the simulation results of this study may be employed by hydraulic engineers to decide which method is most applicable to their practical pipe systems.
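
A minimal sketch of the comparison methodology: the implicit Colebrook-White equation solved numerically against the explicit Haaland approximation, with their relative error (the pipe parameters are illustrative):

```python
import math
from scipy.optimize import brentq

def colebrook(re, rel_rough):
    """Solve 1/sqrt(f) = -2 log10(rr/3.7 + 2.51/(Re sqrt(f))) for f."""
    g = lambda f: (1 / math.sqrt(f)
                   + 2 * math.log10(rel_rough / 3.7
                                    + 2.51 / (re * math.sqrt(f))))
    return brentq(g, 1e-4, 0.1)

def haaland(re, rel_rough):
    return (-1.8 * math.log10((rel_rough / 3.7)**1.11 + 6.9 / re))**-2

# e.g. a steel pipe: D = 0.5 m, roughness e = 4.5e-5 m, v = 1 m/s, water.
re, rel_rough = 1.0 * 0.5 / 1.0e-6, 4.5e-5 / 0.5
f_cw, f_h = colebrook(re, rel_rough), haaland(re, rel_rough)
print(f"Colebrook f = {f_cw:.6f}, Haaland f = {f_h:.6f}, "
      f"relative error = {abs(f_h - f_cw) / f_cw:.2%}")
```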

Keywords: Colebrook–White, explicit equation, friction factor, hydraulic resistance, implicit equation, Reynolds numbers

Procedia PDF Downloads 187
2001 Privacy Preserving in Association Rule Mining on Horizontally Partitioned Database

Authors: Manvar Sagar, Nikul Virpariya

Abstract:

The advancement of data mining techniques plays an important role in many applications. In the context of privacy and security issues, the problems caused by association rule mining techniques have been investigated by many researchers. It has been proved that the misuse of this technique may reveal the database owner's sensitive and private information to others. Many researchers have put effort into preserving privacy in association rule mining. Of the two basic approaches to privacy-preserving data mining, viz. randomization-based and cryptography-based, the latter provides a high level of privacy but incurs higher computational as well as communication overhead. Hence, it is necessary to explore alternative techniques that reduce these overheads. In this work, we propose an efficient, collusion-resistant cryptography-based approach for distributed association rule mining using Shamir's secret sharing scheme. As we show through theoretical and practical analysis, our approach is provably secure and requires a trusted third party only once. We use secret sharing to share information privately and a code-based identification scheme to add protection against malicious adversaries.
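
A minimal, self-contained sketch of Shamir's (t, n) secret sharing over a prime field, the primitive the scheme builds on; the modulus and shared value are illustrative, and a real deployment would use a large prime and a cryptographically secure RNG.

```python
import random

P = 2_147_483_647                     # Mersenne prime 2^31 - 1 (field modulus)

def split(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    eval_poly = lambda x: sum(c * pow(x, i, P)
                              for i, c in enumerate(coeffs)) % P
    return [(x, eval_poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

# e.g. privately sharing a local support count of 1234 among 5 sites:
shares = split(1234, t=3, n=5)
print(reconstruct(shares[:3]))        # any 3 shares recover 1234
```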

Keywords: privacy, privacy preservation in data mining (PPDM), horizontally partitioned database, EMHS, MFI, Shamir secret sharing

Procedia PDF Downloads 408
2000 Finite Element Method for Modal Analysis of FGM

Authors: S. J. Shahidzadeh Tabatabaei, A. M. Fattahi

Abstract:

Modal analysis of an FGM plate containing a ceramic phase of Al2O3 and a metal phase of stainless steel 304 was performed using ABAQUS, with the assumptions that the material has elastic mechanical behavior and that its Young's modulus and density vary in the thickness direction. For this purpose, a subroutine was written in FORTRAN and linked with ABAQUS. First, a simulation was performed in accordance with another researcher's model, and after comparing the obtained results, the accuracy of the present study was verified. The obtained results for natural frequency and mode shapes indicate good performance of the user-written subroutine as well as of the FEM model used in the present study. After verification of the results, the effects of the clamping condition and the material type (i.e., the parameter n) were investigated. In this respect, finite element analysis was carried out in the fully clamped condition for different values of n. The results indicate that the natural frequency decreases as n increases, since with increasing n the amount of ceramic phase in the FGM plate decreases while the amount of metal phase increases, leading to a decrease in the plate stiffness and hence in the natural frequency, as the Young's modulus of Al2O3 is 380 GPa and that of stainless steel 304 is 207 GPa.
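
A minimal sketch of the through-thickness power-law grading commonly used for such plates; the abstract does not state the exact law, so the rule of mixtures below (and the densities) are assumptions.

```python
import numpy as np

E_ceramic, E_metal = 380e9, 207e9        # Al2O3 and stainless steel 304 (Pa)

def graded(prop_c, prop_m, z, h, n):
    """Property at height z in [-h/2, h/2] for volume-fraction exponent n."""
    vc = (z / h + 0.5) ** n              # ceramic volume fraction
    return (prop_c - prop_m) * vc + prop_m

h = 0.01                                 # plate thickness (m)
z = np.linspace(-h / 2, h / 2, 5)
for n in (0.5, 1.0, 5.0):
    E = graded(E_ceramic, E_metal, z, h, n)
    # larger n -> less ceramic -> lower stiffness, hence lower frequency
    print(f"n = {n}: E(z) in GPa =", np.round(E / 1e9, 1))
```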

Keywords: FGM plates, modal analysis, natural frequency, finite element method

Procedia PDF Downloads 317
1999 Scale Effects on the Wake Airflow of a Heavy Truck

Authors: Aude Pérard Lecomte, Georges Fokoua, Amine Mehel, Anne Tanière

Abstract:

Air quality in urban areas is deteriorated by pollution, mainly due to the constant increase in traffic of different types of ground vehicles. In particular, particulate matter pollution, with high concentrations in urban areas, can cause serious health issues. Characterizing and understanding particle dynamics is therefore essential to establish recommendations for improving air quality in urban areas. To analyze the effects of turbulence on particulate pollutant dispersion, the first step is to focus on the single-phase flow structure and turbulence characteristics in the wake of a heavy truck model. To achieve this, Computational Fluid Dynamics (CFD) simulations were conducted with the aim of modeling the wake airflow of full- and reduced-scale heavy trucks. The Reynolds-Averaged Navier-Stokes (RANS) approach with the Reynolds Stress Model (RSM) as the turbulence closure was used. The simulations highlight the appearance of a large vortex originating from under the trailer. This vortex belongs to the recirculation region located in the near-wake of the heavy truck. These vortical structures are expected to have a strong influence on the dynamics of particles emitted by the truck.

Keywords: CFD, heavy truck, recirculation region, reduced scale

Procedia PDF Downloads 218
1998 Bridging Urban Planning and Environmental Conservation: A Regional Analysis of Northern and Central Kolkata

Authors: Tanmay Bisen, Aastha Shayla

Abstract:

This study introduces an advanced approach to tree canopy detection in urban environments and a regional analysis of Northern and Central Kolkata that delves into the intricate relationship between urban development and environmental conservation. Leveraging high-resolution drone imagery from diverse urban green spaces in Kolkata, we fine-tuned the DeepForest model to enhance its precision and accuracy. Our results, characterized by an impressive Intersection over Union (IoU) score of 0.90 and a mean average precision (mAP) of 0.87, underscore the model's robustness in detecting and classifying tree crowns amidst the complexities of aerial imagery. This research not only emphasizes the importance of model customization for specific datasets but also highlights the potential of drone-based remote sensing in urban forestry studies. The study investigates the spatial distribution, density, and environmental impact of trees in Northern and Central Kolkata. The findings underscore the significance of urban green spaces in metropolitan cities, emphasizing the need for sustainable urban planning that integrates green infrastructure for ecological balance and human well-being.
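
A minimal sketch of the Intersection over Union (IoU) metric the detections are scored with, for axis-aligned bounding boxes (x_min, y_min, x_max, y_max); the boxes below are illustrative.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

predicted = (10, 10, 60, 60)
ground_truth = (12, 8, 62, 58)
print(f"IoU = {iou(predicted, ground_truth):.2f}")
```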

Keywords: urban greenery, advanced spatial distribution analysis, drone imagery, deep learning, tree detection

Procedia PDF Downloads 56
1997 Numerical Study on Parallel Rear-Spoiler on Super Cars

Authors: Anshul Ashu

Abstract:

Computers are applied to vehicle aerodynamics in two ways: Computational Fluid Dynamics (CFD) and Computer-Aided Flow Visualization (CAFV). Of the two, CFD is chosen here because it presents results with computer graphics. The simulation of the flow field around a vehicle is one of the important CFD applications. The flow field can be solved numerically using panel methods, the k-ε method, or direct simulation methods. The spoiler is a tool in vehicle aerodynamics used to minimize unfavorable aerodynamic effects around the vehicle, and a parallel spoiler is a set of two spoilers designed in such a manner that they effectively reduce drag. In this study, the standard k-ε model is used to simulate the external flow field around simplified versions of the Bugatti Veyron, Audi R8, and Porsche 911. The flow simulation is carried out for varying Reynolds numbers and consists of three different levels: first over the model without a rear spoiler, second over the model with a single rear spoiler, and third over the model with a parallel rear spoiler. The second and third levels vary the following parameters: the shape of the spoiler, the angle of attack, and the attachment position. A thorough analysis of the simulation results has been carried out, and a new parallel spoiler has been designed. It shows a modest improvement in vehicle aerodynamics, with a decrease in aerodynamic drag and lift, and hence leads to better fuel economy and traction force for the model.

Keywords: drag, lift, flow simulation, spoiler

Procedia PDF Downloads 500
1996 A Method for Identifying Unusual Transactions in E-commerce Through Extended Data Flow Conformance Checking

Authors: Handie Pramana Putra, Ani Dijah Rahajoe

Abstract:

The proliferation of smart devices and advancements in mobile communication technologies have extended the influence of e-commerce into many facets of life. Detecting abnormal transactions holds paramount significance in this realm due to the potential for substantial financial losses. Moreover, the fusion of data flow and control flow assumes a critical role in the exploration of process modeling and data analysis, contributing significantly to the accuracy and security of business processes. This paper introduces an alternative approach to identifying abnormal transactions through a model that integrates both data and control flows. Referred to as the Extended Data Petri net (DPNE), our model encapsulates the entire process, from user login to the e-commerce platform through to the payment stage, including the mobile transaction process. We scrutinize the model's structure, formulate an algorithm for detecting anomalies in pertinent data, and elucidate the rationale and efficacy of the comprehensive system model. A case study validates the responsive performance of each system component, demonstrating the system's adeptness in evaluating every activity within mobile transactions. Ultimately, the results of anomaly detection are derived through a thorough and comprehensive analysis.

Keywords: database, data analysis, DPNE, extended data flow, e-commerce

Procedia PDF Downloads 56
1995 Passport Bros: Exploring Neocolonial Masculinity and Sex Tourism as a Response to Shifting Gender Dynamics

Authors: Kellen Sharp

Abstract:

This study explores the phenomenon of ‘Passport Bros’, a subset within the manosphere responding to perceived crises in masculinity amidst changing gender dynamics. Focusing on a computational analysis of the passport bro community, the research addresses normative beliefs, deviations from MGTOW ideology, and discussions on nationality, race, and gender. Originating from the MGTOW movement, passport bros engage in a neocolonial approach by seeking traditional, non-Western women, attributing this pursuit to dissatisfaction with modern Western women. The paper examines how hetero pessimism within MGTOW shapes the emergence of passport bros, leading to the adoption of red pill ideologies and ultimately manifesting in the form of sex tourism. Analyzing data collected from passport bro forums through computer-assisted content analysis, the study identifies key discourses such as questions and answers, money, attitudes towards Western and traditional women, and discussions about the movement itself. The findings highlight the nuanced intersection of gender, race, and global power dynamics within the passport bro community, shedding light on their motivations and impact on neocolonial legacies.

Keywords: toxic online community, manosphere, gender and media, neocolonialism

Procedia PDF Downloads 74
1994 Enhancing Word Meaning Retrieval Using FastText and Natural Language Processing Techniques

Authors: Sankalp Devanand, Prateek Agasimani, Shamith V. S., Rohith Neeraje

Abstract:

Machine translation has witnessed significant advancements in recent years, but the translation of languages with distinct linguistic characteristics, such as English and Sanskrit, remains a challenging task. This research presents the development of a dedicated English-to-Sanskrit machine translation model, aiming to bridge the linguistic and cultural gap between these two languages. Using a variety of natural language processing (NLP) approaches, including FastText embeddings, this research proposes a thorough method to improve word meaning retrieval. Data preparation, part-of-speech tagging, dictionary searches, and transliteration are all included in the methodology. The study also addresses the implementation of an interpreter pattern and uses a word similarity task to assess the quality of word embeddings. The experimental outcomes show how the suggested approach may be used to enhance word meaning retrieval tasks with greater efficacy, accuracy, and adaptability. Evaluation of the model's performance is conducted through rigorous testing, comparing its output against existing machine translation systems. The assessment includes quantitative metrics such as BLEU scores, METEOR scores, Jaccard Similarity, etc.
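
A minimal sketch of the word-similarity check used to assess embedding quality, via cosine similarity; the 4-dimensional vectors below are toy placeholders standing in for real FastText embeddings.

```python
import numpy as np

vectors = {
    "king":  np.array([0.8, 0.3, 0.1, 0.5]),
    "raja":  np.array([0.7, 0.4, 0.2, 0.5]),   # Sanskrit equivalent (toy)
    "river": np.array([0.1, 0.9, 0.7, 0.0]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print("king ~ raja :", round(cosine(vectors["king"], vectors["raja"]), 3))
print("king ~ river:", round(cosine(vectors["king"], vectors["river"]), 3))
```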

Keywords: machine translation, English to Sanskrit, natural language processing, word meaning retrieval, fastText embeddings

Procedia PDF Downloads 44
1993 Flow Analysis of Viscous Nanofluid Due to Rotating Rigid Disk with Navier’s Slip: A Numerical Study

Authors: Khalil Ur Rehman, M. Y. Malik, Usman Ali

Abstract:

In this paper, the problem proposed by Von Karman is treated in the presence of additional flow field effects when the liquid occupies the space above a rotating rigid disk. To be more specific, a purely viscous fluid flow yielded by a rotating rigid disk with Navier's slip condition is considered in both magnetohydrodynamic and hydrodynamic frames. The rotating flow regime is manifested with a heat source/sink and chemically reactive species. Moreover, the features of thermophoresis and Brownian motion are reported by considering a nanofluid model. The flow field formulation is obtained mathematically in terms of high-order differential equations. The reduced system of equations is solved numerically through a self-coded computational algorithm. The pertinent outcomes are discussed systematically and presented through graphical and tabular practices. This dual framework makes the attempt attractive, and validation of the results against existing work confirms the execution of the self-coded algorithm for the fluid flow regime over a rotating rigid disk.

Keywords: Navier’s condition, Newtonian fluid model, chemical reaction, heat source/sink

Procedia PDF Downloads 171
1992 Investigation of Single Particle Breakage inside an Impact Mill

Authors: E. Ghasemi Ardi, K. J. Dong, A. B. Yu, R. Y. Yang

Abstract:

In the current work, a numerical model based on the discrete element method (DEM) was developed which provided information about particle dynamics and impact conditions inside a laboratory-scale impact mill (Fritsch). It showed that each particle mostly experiences three impacts inside the mill. While the first impact frequently happens at the front surface of the rotor's rib, the second impact most frequently occurs at the side surfaces of the rotor's rib. It was also shown that while the first impact happens at a small impact angle, mostly varying around 35º, the second impact happens at around 70º, which is close to a normal impact condition. Analysis of the impact energy revealed that, as the mill speed varied from 6000 to 14000 rpm, the ratio of the first impact's average impact energy to the minimum energy required to break a particle (Wₘᵢₙ) increased from 0.30 to 0.85. Moreover, it was seen that the second impact imparts intense impact energy to the particle, which can be considered the main cause of particle splitting. Finally, the information obtained from the DEM simulations, along with data from the conducted experiments, was implemented in semi-empirical equations in order to find the selection and breakage functions. Then, using a back-calculation approach, those parameters were used to predict the particle size distributions (PSDs) of ground particles under different impact energies. The results were compared with experimental results and showed reasonable accuracy and predictive ability.

Keywords: single particle breakage, particle dynamic, population balance model, particle size distribution, discrete element method

Procedia PDF Downloads 291
1991 Route Planning for a Sharing System (Scooter Sharing-Public Transportation) with the Hybrid Optimization Approach PSO-GA

Authors: Mohammad Ali Farrokhpour

Abstract:

In the current decade of sustainable transportation systems, scooter sharing has attracted widespread attention as an environmentally friendly means of public transportation that can help develop public transport. The combination of scooters and the subway in sustainable transportation systems can provide a great many opportunities for developing access to public transportation. Among the challenges that have arisen and initiated discussion about the implementation of a scooter-subway system to replace personal vehicles is the issue of routing, which has been chosen as the main subject of the present paper. Because routing involves multiple factors such as time, costs, traffic, green spaces, etc., the problem is considered to be a multi-objective NP-hard optimization problem. For this purpose, the hybrid optimization approach PSO-GA is put forward in the present paper so that the answers provided are of higher accuracy and validity than those of standard optimization methods. The results obtained from modeling and solving the problem for the case study in MATLAB are indicative of the efficiency and desirability of the model and the proposed approach for solving it.

Keywords: route planning, scooter sharing, public transportation, sharing system

Procedia PDF Downloads 84
1990 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Building

Authors: Abdul Hakim Chikho

Abstract:

Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation of the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested to calculate approximations of the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation of the required fundamental period, and repeating the calculation with a more accurate formula is usually required. In this paper, a new formula to calculate a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account the mass and the stiffness of the building; therefore, it is more logical than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using both the proposed formula and the conventional empirical equations. Comparing the obtained results with those obtained from a dynamic computer analysis has shown that the proposed formula provides a more accurate estimation of the fundamental periods of practical buildings. Since the proposed method is simple to use and requires only a minimum of computing effort, it is believed to be ideally suited for design purposes.
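
A minimal sketch of the idea behind a mass-and-stiffness-based period formula: for a single-degree-of-freedom idealization, T = 2π sqrt(M/K). The paper's actual multi-storey formula is not given in the abstract, so the SDOF reduction below is an assumption.

```python
import math

def fundamental_period(mass_kg, stiffness_n_per_m):
    """T = 2*pi*sqrt(M/K) for an equivalent single-degree-of-freedom system."""
    return 2 * math.pi * math.sqrt(mass_kg / stiffness_n_per_m)

# e.g. an equivalent mass of 2.0e6 kg and lateral stiffness of 8.0e8 N/m:
print(f"T = {fundamental_period(2.0e6, 8.0e8):.3f} s")
```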

Keywords: earthquake, fundamental mode period, design, building

Procedia PDF Downloads 284
1989 Comparison between Pushover Analysis Techniques and Validation of the Simplified Modal Pushover Analysis

Authors: N. F. Hanna, A. M. Haridy

Abstract:

One of the main drawbacks of the Modal Pushover Analysis (MPA) is the need to perform nonlinear time-history analysis, which complicates the analysis and increases its time. A simplified version of the MPA has been proposed based on the concept of the inelastic deformation ratio. Furthermore, the effect of the higher modes of vibration is considered by assuming linearly elastic responses, which enables the use of standard elastic response spectrum analysis. In this study, the simplified MPA (SMPA) method is applied to determine the target global drift and the inter-story drifts of a steel frame building. The effect of the higher vibration modes is considered within the framework of the SMPA. A comprehensive survey of the inelastic deformation ratio is presented, after which a suitable expression is selected from the literature and implemented in the SMPA. The seismic demands estimated using the SMPA, such as the target drift, base shear, and inter-story drifts, are compared with the seismic responses determined by applying the standard MPA. The accuracy of the estimated seismic demands is validated by comparison with the results obtained by nonlinear time-history analysis using real earthquake records.

Keywords: modal analysis, pushover analysis, seismic performance, target displacement

Procedia PDF Downloads 361
1988 The Algorithm of Semi-Automatic Thai Spoonerism Words for Bi-Syllable

Authors: Nutthapat Kaewrattanapat, Wannarat Bunchongkien

Abstract:

The purposes of this research are to study and develop an algorithm for Thai spoonerism by means of semi-automatic computer programs; that is to say, at the data-input stage the syllables are already separated, and the spoonerism itself is performed by the developed algorithm. The algorithm establishes rules and mechanisms for bi-syllable Thai spoonerism by analyzing the elements of the syllables, namely the consonant cluster, vowel, tone mark, and final consonant. The study finds that bi-syllable Thai spoonerism has one spoonerism mechanism: transposing the vowel, tone mark, and final consonant of the two syllables while keeping each syllable's initial consonant and cluster (if any). The rules and mechanisms of Thai spoonerism were then implemented as Thai spoonerism software in PHP. The software was subjected to a performance test, which found that the program performs bi-syllable Thai spoonerism correctly for 99% of the words used in the test, with faults in 1% of cases, as the words obtained from spoonerism may not be spelled in conformity with Thai grammar, and a Thai spoonerism may have more than one valid answer.
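
A minimal sketch of the transposition rule described above, with romanized syllables represented as (initial cluster, vowel, tone mark, final consonant) tuples; real Thai-script handling is considerably more involved.

```python
def spoonerize(syl1, syl2):
    """Swap vowel, tone mark, and final consonant; keep each initial."""
    c1, v1, t1, f1 = syl1
    c2, v2, t2, f2 = syl2
    return (c1, v2, t2, f2), (c2, v1, t1, f1)

# e.g. a bi-syllable word with syllables "tha" and "non" (illustrative):
a, b = spoonerize(("th", "a", "", ""), ("n", "o", "", "n"))
print("".join(a), "".join(b))   # -> "thon" "na"
```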

Keywords: algorithm, spoonerism, computational linguistics, Thai spoonerism

Procedia PDF Downloads 236
1987 Auto Rickshaw Impacts with Pedestrians: A Computational Analysis of Post-Collision Kinematics and Injury Mechanics

Authors: A. J. Al-Graitti, G. A. Khalid, P. Berthelson, A. Mason-Jones, R. Prabhu, M. D. Jones

Abstract:

Motor-vehicle-related pedestrian road traffic collisions are a major road safety challenge, since they are a leading cause of death and serious injury worldwide, contributing to a third of the global disease burden. The auto rickshaw, which is a common form of urban transport in many developing countries, plays a major transport role, both as a vehicle for hire and for private use. The most common auto rickshaws are quite unlike a 'typical' four-wheel motor vehicle, being typically characterised by three wheels, a non-tilting sheet-metal body or open frame construction, a canvas roof and side curtains, a small drivers' cabin, handlebar controls, and a passenger space at the rear. Given the propensity, in developing countries, for auto rickshaws to be used in mixed cityscapes, where pedestrians and vehicles share the roadway, the potential for auto rickshaw impacts with pedestrians is relatively high. Whilst auto rickshaws are used in some Western countries, their limited number and spatial separation from pedestrian walkways, as a result of city planning, have not resulted in significant accident statistics. Thus, auto rickshaws have not been subject to the vehicle-impact-related pedestrian crash kinematic analyses and/or injury mechanics assessments typically associated with motor vehicle development in Western Europe, North America, and Japan. This study presents a parametric analysis of auto rickshaw related pedestrian impacts by computational simulation, using a Finite Element model of an auto rickshaw and an LS-DYNA 50th percentile male Hybrid III Anthropometric Test Device (dummy). Parametric variables include auto rickshaw impact velocity, auto rickshaw impact region (front, centre or offset) and relative pedestrian impact position (front, side and rear). The output data of each impact simulation were correlated against reported injury metrics, namely the Head Injury Criterion (front, side and rear), the Neck Injury Criterion (front, side and rear), the Abbreviated Injury Scale and the reported risk level, and add greater understanding to the issue of auto rickshaw related pedestrian injury risk. The parametric analyses suggest that pedestrians are subject to a relatively high risk of injury during impacts with an auto rickshaw at velocities of 20 km/h or greater, which during some of the impact simulations may even risk fatalities. The present study provides valuable evidence for informing a series of recommendations and guidelines for making the auto rickshaw safer during collisions with pedestrians. Whilst it is acknowledged that the present research findings are based in the field of safety engineering and may over-represent injury risk compared to 'real world' accidents, many of the simulated interactions produced injury response values significantly greater than current threshold curves and thus justify their inclusion in the study. To reduce the injury risk level and increase the safety of the auto rickshaw, there should be a reduction in the velocity of the auto rickshaw and/or consideration of engineering solutions, such as retrofitting injury mitigation technologies to those auto rickshaw contact regions which are subject to the greatest risk of producing pedestrian injury.
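
For reference, the sketch below computes the Head Injury Criterion the simulations are scored against, on a synthetic acceleration pulse rather than an LS-DYNA output:

```python
# HIC = max over windows (t2 - t1) * [mean acceleration in g]^2.5,
# here with a 15 ms window and uniform sampling assumed.
import numpy as np

def hic(t, a_g, window=0.015):
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > window:
                break
            a_mean = a_g[i:j + 1].mean()   # mean acceleration over the window
            best = max(best, dt * a_mean**2.5)
    return best

t = np.linspace(0, 0.02, 400)              # 20 ms trace
a = 80 * np.sin(np.pi * t / 0.02)          # synthetic pulse, peak 80 g
print(f"HIC15 = {hic(t, a):.0f}")          # ~1000 is a common injury threshold
```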

Keywords: auto rickshaw, finite element analysis, injury risk level, LS-DYNA, pedestrian impact

Procedia PDF Downloads 194
1986 Evaluating Models Through Feature Selection Methods Using Data Driven Approach

Authors: Shital Patil, Surendra Bhosale

Abstract:

Cardiac diseases are the leading cause of mortality and morbidity in the world and, over the past few decades, have emerged as the most life-threatening disorder globally, accounting for a large number of deaths. Machine learning and artificial intelligence have been playing a key role in predicting heart disease. A relevant set of features can be very helpful in predicting the disease accurately. In this study, we propose a comparative analysis of four different feature selection methods and evaluate their performance with both the raw (unbalanced) and sampled (balanced) dataset. The publicly available Z-Alizadeh Sani dataset has been used for this study. Four feature selection methods are used: data analysis, minimum Redundancy Maximum Relevance (mRMR), Recursive Feature Elimination (RFE), and Chi-squared. These methods are tested with 8 different classification models to get the best accuracy possible. Using both the balanced and unbalanced dataset, the study shows promising results in terms of various performance metrics in accurately predicting heart disease. The experimental results obtained by the proposed method with the raw data achieve a maximum AUC of 100%, a maximum F1 score of 94%, a maximum recall of 98%, and a maximum precision of 93%, while with the balanced dataset the results are a maximum AUC of 100%, an F1 score of 95%, a maximum recall of 95%, and a maximum precision of 97%.
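
A minimal sketch of two of the named feature selection methods (Chi-squared and RFE), applied to a synthetic stand-in for the Z-Alizadeh Sani dataset:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)
X = MinMaxScaler().fit_transform(X)          # chi2 requires non-negative input

chi2_idx = SelectKBest(chi2, k=5).fit(X, y).get_support(indices=True)
rfe_idx = RFE(LogisticRegression(max_iter=1000),
              n_features_to_select=5).fit(X, y).get_support(indices=True)

print("Chi-squared picks:", chi2_idx)
print("RFE picks:        ", rfe_idx)
```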

Keywords: cardio vascular diseases, machine learning, feature selection, SMOTE

Procedia PDF Downloads 118
1985 Oryzanol Recovery from Rice Bran Oil: Adsorption Equilibrium Models Through a Kinetic Data Approach

Authors: A.D. Susanti, W. B. Sediawan, S.K. Wirawan, Budhijanto, Ritmaleni

Abstract:

Oryzanol, a natural component of rice bran oil (RBO), has high antioxidant activity. It is reported to have several health benefits and is of high interest in pharmacy, cosmetics, and nutrition. Because of the low concentration of oryzanol in crude RBO (0.9-2.9%), it needs to be further processed for practical usage, for example via an adsorption process. In this study, adsorption equilibrium models were investigated and adjusted through a kinetic data approach. A mathematical model of the kinetics of batch adsorption for oryzanol separation from RBO was set up and then applied to the equilibrium results. The adsorbent particles used in this case are relatively small, so the concentration within the adsorbent is assumed to be uniform. Hence, the adsorption rate is controlled by the rate of oryzanol mass transfer from the bulk RBO to the surface of the silica gel. In this approach, the rate of mass transfer is assumed to be proportional to the concentration's deviation from the equilibrium state. The equilibrium models applied were Langmuir, distribution coefficient, and Freundlich, with the parameter values obtained from the equilibrium results. It turned out that the models set up can quantitatively describe the experimental kinetics data, and adjusting the values of the equilibrium isotherm parameters significantly improves the accuracy of the model. The value of the mass transfer coefficient per unit adsorbent mass (kca) is then obtained by curve fitting.
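
A minimal sketch of the kinetic model described above: a mass-transfer rate proportional to the deviation from a Langmuir equilibrium loading, integrated for a batch adsorber (parameter values are illustrative, not the fitted ones):

```python
import numpy as np
from scipy.integrate import solve_ivp

qm, K = 50.0, 0.8        # Langmuir capacity (mg/g) and constant (L/mg)
kca = 0.05               # mass transfer coeff. per unit adsorbent mass (1/min)
C0, m_over_V = 25.0, 0.2 # initial conc. (mg/L) and adsorbent dose (g/L)

def rate(t, q):
    c = C0 - m_over_V * q[0]             # mass balance in the batch
    q_star = qm * K * c / (1 + K * c)    # Langmuir equilibrium loading
    return [kca * (q_star - q[0])]       # driving force: deviation from q*

sol = solve_ivp(rate, (0, 120), [0.0], t_eval=np.linspace(0, 120, 5))
for t, q in zip(sol.t, sol.y[0]):
    print(f"t = {t:5.1f} min, q = {q:6.2f} mg/g")
```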

Keywords: adsorption equilibrium, adsorption kinetics, oryzanol, rice bran oil

Procedia PDF Downloads 323
1984 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement

Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao

Abstract:

Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis methods to select appropriate features, but such feature analysis treats features independently and cannot fully utilize the information in the data. Besides, blindly reducing features loses a lot of useful information, resulting in unreliable results. Therefore, the focus of this paper is on providing a redundant-feature removal approach for visual roughness measurement. In this paper, statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to effectively extract the texture features of machined images. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension while maintaining the integrity of the original information. Finally, the relationship between the new features and roughness is established by a support vector machine (SVM). The experimental results show that the approach can effectively resolve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness.
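
A minimal sketch of the pipeline described above, with GLCM texture features fused by PCA and regressed against roughness by an SVM; the images and Ra values are synthetic placeholders, not machined-surface data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def glcm_features(img):
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return [graycoprops(glcm, p)[0, 0] for p in props]

# synthetic "machined surface" images with noise amplitude ~ roughness
images = [np.clip(128 + s * rng.standard_normal((64, 64)),
                  0, 255).astype(np.uint8)
          for s in (5, 15, 30, 45, 60, 80)]
ra = np.array([0.4, 0.8, 1.6, 2.4, 3.2, 4.0])     # roughness Ra (um)

X = np.array([glcm_features(im) for im in images])
X_fused = PCA(n_components=2).fit_transform(X)     # fuse/reduce features
model = SVR(kernel="rbf", C=10.0).fit(X_fused, ra)
print("predicted Ra:", np.round(model.predict(X_fused), 2))
```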

Keywords: feature analysis, machine vision, PCA, surface roughness, SVM

Procedia PDF Downloads 212
1983 A Decision Support System to Detect the Lumbar Disc Disease on the Basis of Clinical MRI

Authors: Yavuz Unal, Kemal Polat, H. Erdinc Kocer

Abstract:

In this study, a decision support system comprising three stages is proposed to detect disc abnormalities of the lumbar region. In the first stage, feature extraction, T2-weighted sagittal and axial Magnetic Resonance Images (MRI) were taken from 55 people, and 27 appearance and shape features were acquired from both the sagittal and transverse images. In the second stage, the feature weighting process, the k-means clustering based feature weighting (KMCBFW) proposed by Gunes et al. was used. Finally, in the third stage, the classification process, classifier algorithms including the multi-layer perceptron (MLP) neural network, the support vector machine (SVM), Naïve Bayes, and the decision tree were used to classify whether or not the subject has a lumbar disc abnormality. In order to test the performance of the proposed method, the classification accuracy (%), sensitivity, specificity, precision, recall, f-measure, kappa value, and computation times were used. The best hybrid model is the combination of k-means clustering based feature weighting and the decision tree in detecting lumbar disc disease based on both sagittal and axial MR images.

Keywords: lumbar disc abnormality, lumbar MRI, lumbar spine, hybrid models, hybrid features, k-means clustering based feature weighting

Procedia PDF Downloads 520
1982 Dynamic Distribution Calibration for Improved Few-Shot Image Classification

Authors: Majid Habib Khan, Jinwei Zhao, Xinhong Hei, Liu Jiedong, Rana Shahzad Noor, Muhammad Imran

Abstract:

Deep learning is increasingly employed in image classification, yet the scarcity and high cost of labeled training data remain a challenge. Limited samples often lead to overfitting due to biased sample distributions. This paper introduces a dynamic distribution calibration method for few-shot learning. Initially, base and new class samples undergo normalization to mitigate disparate feature magnitudes. A pre-trained model then extracts feature vectors from both classes. The method dynamically selects distribution characteristics from base classes (both adjacent and remote) in the embedding space, using a threshold-value approach for new class samples. Given the propensity of similar classes to share feature distributions such as mean and variance, this research assumes a Gaussian distribution for the feature vectors. Subsequently, the distributional features of new class samples are calibrated using a corrected hyperparameter derived from the distribution features of both adjacent and distant base classes. This calibration augments the new class sample set. The technique demonstrates significant improvements, with up to 4% accuracy gains in few-shot classification challenges, as evidenced by tests on the miniImageNet and CUB datasets.
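
A minimal sketch of the calibration idea (not the authors' exact algorithm): borrow Gaussian statistics from the nearest base classes to calibrate a new class's distribution, then sample synthetic features to augment the support set.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, alpha = 16, 0.2                      # feature dim; spread hyperparameter

# base-class statistics (mean, covariance) in the embedding space (toy)
base_means = [rng.normal(i, 1.0, dim) for i in range(5)]
base_covs = [np.eye(dim) * (0.5 + 0.1 * i) for i in range(5)]

def calibrate(x, k=2):
    """Calibrate a distribution around support sample x of a new class."""
    d = [np.linalg.norm(x - m) for m in base_means]
    nearest = np.argsort(d)[:k]           # k closest base classes
    mu = (x + sum(base_means[i] for i in nearest)) / (k + 1)
    cov = sum(base_covs[i] for i in nearest) / k + alpha * np.eye(dim)
    return mu, cov

x_new = rng.normal(1.2, 1.0, dim)         # one-shot support example
mu, cov = calibrate(x_new)
augmented = rng.multivariate_normal(mu, cov, size=10)
print("augmented set shape:", augmented.shape)
```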

Keywords: deep learning, computer vision, image classification, few-shot learning, threshold

Procedia PDF Downloads 66