Search results for: clinical cases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1494


504 The Role of the Injured Party's Fault in the Apportionment of Damages in Tort Law: A Comparative-Historical Study between Common Law and Islamic Law

Authors: Alireza Tavakolinia

Abstract:

In order to understand the role of the injured party's fault in the apportionment of liability, we studied its historical background. In common law, the traditional contributory negligence rule was a complete defense; legislatures and the courts later modified it into a rule of apportionment. In Islamic law, too, the 'Action' rule was originally applied only where the injured party was the sole cause of the harm, but jurists expanded its scope to cases in which the fault of both the injured party and the other party is involved. Several approaches to apportioning damages are in use. Some common law countries, such as Britain, have adopted the 'causal potency' and 'fixed apportionment' approaches, while Islamic countries such as Iran have chosen the 'relative blameworthiness' and 'equal apportionment' approaches. The article concludes that both common law and Islamic law accept the division of responsibility between a negligent claimant and the defendant. In apportioning that responsibility, however, Islamic law mostly favors equal apportionment, which is simpler and saves time and money, whereas common law systems have chosen the causal potency approach, which is more complicated than the rival approach but fairer.

Keywords: Contributory negligence, common law, Islamic Law, Tort Law.

503 Comparison of Three Turbulence Models in Wear Prediction of Multi-Size Particulate Flow through Rotating Channel

Authors: Pankaj K. Gupta, Krishnan V. Pagalthivarthi

Abstract:

The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic energy equation. The two-phase flow field, obtained numerically using a Galerkin finite element methodology, relates the local flow velocity and concentration to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for particle size dependence. Predicted wear rates from the three turbulence models are compared for a large number of cases spanning operating parameters such as rotation rate, solids concentration, flow rate, and particle size distribution. The root-mean-square error between the FE-generated data and the correlation relating maximum wear rate to the operating parameters is found to be less than 2.5% for all three models.

Keywords: Rotating channel, maximum wear rate, multi-size particulate flow, k-ε turbulence models.

502 A Hybrid Metaheuristic Framework for Evolving the PROAFTN Classifier

Authors: Feras Al-Obeidat, Nabil Belacel, Juan A. Carretero, Prabhat Mahanti

Abstract:

In this paper, a new learning algorithm based on a hybrid metaheuristic integrating Differential Evolution (DE) and Reduced Variable Neighborhood Search (RVNS) is introduced to train the classification method PROAFTN. To apply PROAFTN, the values of several parameters need to be determined prior to classification, including the interval boundaries and the relative weights of each attribute. Based on these requirements, the hybrid approach, named DEPRO-RVNS, is presented in this study. A major problem when applying DE to some classification problems is the premature convergence of individuals to local optima. To eliminate this shortcoming and to improve the exploration and exploitation capabilities of DE, such individuals are iteratively re-explored using RVNS, as sketched below. Based on the results generated on both training and testing data, it is shown that the performance of PROAFTN is significantly improved. Furthermore, the experimental study shows that DEPRO-RVNS outperforms well-known machine learning classifiers on a variety of problems.
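The following is a minimal sketch of the DE-plus-RVNS idea described above: a standard DE/rand/1/bin loop in which individuals that stall are re-explored by random shakes of increasing radius (an RVNS-style step). The objective function, bounds and all parameters below are placeholders, not the actual PROAFTN training objective or the authors' settings.

```python
import numpy as np

def sphere(x):                      # placeholder fitness (minimized), not the PROAFTN objective
    return float(np.sum(x ** 2))

def rvns(x, fx, f, radii=(0.1, 0.3, 0.9), tries=20, rng=None):
    """Re-explore a stalled individual with random shakes of growing radius."""
    rng = rng if rng is not None else np.random.default_rng()
    k = 0
    while k < len(radii):
        for _ in range(tries):
            cand = x + rng.normal(0.0, radii[k], size=x.shape)
            fc = f(cand)
            if fc < fx:             # improvement: restart from the smallest neighborhood
                x, fx, k = cand, fc, -1
                break
        k += 1
    return x, fx

def depro_rvns_like(f, dim=10, pop=30, gens=200, F=0.6, CR=0.9, stall=15, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (pop, dim))
    fit = np.array([f(x) for x in X])
    since_improved = np.zeros(pop, dtype=int)
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            mutant = X[a] + F * (X[b] - X[c])              # DE/rand/1 mutation
            cross = rng.random(dim) < CR                   # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft < fit[i]:
                X[i], fit[i], since_improved[i] = trial, ft, 0
            else:
                since_improved[i] += 1
            if since_improved[i] > stall:                  # premature convergence: re-explore
                X[i], fit[i] = rvns(X[i], fit[i], f, rng=rng)
                since_improved[i] = 0
    best = int(np.argmin(fit))
    return X[best], fit[best]

x_best, f_best = depro_rvns_like(sphere)
```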

Keywords: Knowledge Discovery, Differential Evolution, Reduced Variable Neighborhood Search, Multiple criteria classification, PROAFTN, Supervised Learning.

501 MIMO-OFDM Channel Tracking Using a Dynamic ANN Topology

Authors: Manasjyoti Bhuyan, Kandarpa Kumar Sarma

Abstract:

The available algorithms for blind estimation, namely the constant modulus algorithm (CMA) and the decision-directed algorithm (DDA/DFE), suffer from convergence to local minima. Moreover, if the channel drifts considerably, any DDA loses track of the channel, so their usage is limited in varying channel conditions. In such cases the primary limitation is the requirement of overhead (pilot) bits in the transmit frame, which leads to wasteful use of the bandwidth. Such arrangements also fail to use channel state information (CSI), which is an important aid in improving the quality of reception. In this work, the main objective is to reduce the overhead imposed by pilot symbols, which otherwise reduces the system throughput. We also formulate an arrangement based on dynamic Artificial Neural Network (ANN) topologies that not only lowers the overhead but also facilitates the use of CSI. A 2×2 Multiple Input Multiple Output (MIMO) system is simulated and the performance variation with different channel estimation schemes is evaluated. A new semi-blind approach based on a dynamic ANN is proposed for channel tracking in varying channel conditions, and its performance is compared with the cases of perfectly known CSI and least squares (LS) based estimation.
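As a point of reference for the LS baseline mentioned above, a minimal sketch of least-squares channel estimation for a 2×2 MIMO link is given below, assuming a block-fading channel and known pilot symbols; the pilot length, modulation and noise level are illustrative choices, not the paper's simulation parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
Nt, Nr, Np = 2, 2, 8                                   # tx antennas, rx antennas, pilot symbols

# Flat-fading 2x2 channel and QPSK-like pilot block (illustrative values)
H = (rng.normal(size=(Nr, Nt)) + 1j * rng.normal(size=(Nr, Nt))) / np.sqrt(2)
P = (rng.choice([-1, 1], size=(Nt, Np)) + 1j * rng.choice([-1, 1], size=(Nt, Np))) / np.sqrt(2)
N = 0.05 * (rng.normal(size=(Nr, Np)) + 1j * rng.normal(size=(Nr, Np)))

Y = H @ P + N                                          # received pilot block: Y = H P + N
H_ls = Y @ P.conj().T @ np.linalg.inv(P @ P.conj().T)  # LS estimate: H_LS = Y P^H (P P^H)^-1

print(np.linalg.norm(H - H_ls) / np.linalg.norm(H))    # normalized estimation error
```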

Keywords: MIMO, Artificial Neural Network (ANN), CMA, LS, CSI.

500 Development of Fake News Model Using Machine Learning through Natural Language Processing

Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini

Abstract:

Fake news detection research is still at an early stage, as this is a relatively new phenomenon of societal interest. Machine learning helps to solve complex problems and to build AI systems, especially in cases where knowledge is tacit or not explicitly known. For the identification of fake news, we applied three machine learning classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification alone is not sufficient for fake news detection because generic classification methods are not specialized for fake news. By integrating machine learning with text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of the text and then incorporating those features into the classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers to two publicly available datasets; see the sketch below for the general pipeline. Experimental analysis based on the existing datasets indicates very encouraging and improved performance.
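A minimal sketch of this kind of pipeline (TF-IDF features fed to the three classifiers named above) is shown below; the tiny in-line dataset and the vectorizer settings are placeholders, not the publicly available datasets used in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy labelled headlines (illustrative only): 1 = fake, 0 = real
texts = ["celebrity cures disease with secret fruit",
         "parliament passes budget after long debate",
         "aliens endorse local mayor, sources say",
         "central bank keeps interest rates unchanged"]
labels = [1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.5, stratify=labels, random_state=0)

vec = TfidfVectorizer(stop_words="english")            # text features
Xtr, Xte = vec.fit_transform(X_train), vec.transform(X_test)

for clf in (PassiveAggressiveClassifier(max_iter=1000),
            MultinomialNB(),
            LinearSVC()):
    clf.fit(Xtr, y_train)
    print(type(clf).__name__, accuracy_score(y_test, clf.predict(Xte)))
```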

Keywords: Fake news detection, types of fake news, machine learning, natural language processing, classification techniques.

499 Marketing Mix for Tourism in the Chonburi Province

Authors: Pisit Potjanajaruwit

Abstract:

The objectives of the study were to determine the marketing mix factors that influence tourists' destination decision-making for cultural tourism in the Chonburi province. Both quantitative and qualitative data were used. The quantitative sample of 400 cases consisted of tourists (both Thai and foreign) who were interested in cultural tourism in the Chonburi province and had traveled to cultural sites in Chonburi; the qualitative data came from 14 representatives of the provincial tourism committee of Chonburi and local tourism experts. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. The study found that Thai and foreign tourists are influenced by different marketing mix factors. For Thai respondents, physical evidence, price, people, and place were of high importance. For foreign respondents, physical evidence, price, people, and process were of high importance, whereas product, place, and promotion were of moderate importance.

Keywords: Chonburi Province, decision making for cultural tourism, marketing mix.

498 Distributed Estimation Using an Improved Incremental Distributed LMS Algorithm

Authors: Amir Rastegarnia, Mohammad Ali Tinati, Azam Khalili

Abstract:

In this paper, we consider the problem of distributed adaptive estimation in wireless sensor networks under two different observation noise conditions. In the first case, we assume that there are some sensors with high observation noise variance (noisy sensors) in the network. In the second case, a different observation noise variance is assumed for each sensor, which is closer to a real scenario. In both cases, an initial estimate of each sensor's observation noise is obtained. For the first case, we show that when such sensors are present in the network, the performance of conventional distributed adaptive estimation algorithms such as the incremental distributed least mean square (IDLMS) algorithm decreases drastically, and that detecting and ignoring these sensors leads to better estimation performance. We then propose a simple algorithm to detect these noisy sensors and modify the IDLMS algorithm to deal with them. For the second case, we propose a new algorithm in which the step-size parameter is adjusted for each sensor according to its observation noise variance, as sketched below. As the simulation results show, the proposed methods outperform the IDLMS algorithm under the same conditions.
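A minimal sketch of an incremental distributed LMS cycle with noise-dependent step sizes, in the spirit of the second case above, is given below; the ring topology, filter length, step-size rule and noise levels are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, taps, iters = 10, 4, 2000
w_true = rng.normal(size=taps)                         # unknown parameter of interest
noise_var = rng.uniform(0.01, 0.5, n_sensors)          # heterogeneous observation noise
mu = 0.05 / (1.0 + noise_var / noise_var.min())        # noisier sensor -> smaller step size

w = np.zeros(taps)                                     # estimate passed around the ring
for _ in range(iters):
    for k in range(n_sensors):                         # incremental cycle through the sensors
        u = rng.normal(size=taps)                      # regressor at sensor k
        d = u @ w_true + np.sqrt(noise_var[k]) * rng.normal()   # noisy measurement
        e = d - u @ w
        w = w + mu[k] * e * u                          # LMS update with sensor-dependent step

print(np.linalg.norm(w - w_true))                      # residual estimation error
```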

Keywords: Distributed estimation, sensor networks, adaptive filter, IDLMS.

497 The Influence of Zeolitic Spent Refinery Admixture on the Rheological and Technological Properties of Steel Fiber Reinforced Self-Compacting Concrete

Authors: Ž. Rudžionis, P. Grigaliūnas, D. Vaičiukynienė

Abstract:

In planning this experimental work on the effect of zeolitic waste on the rheological and technological properties of self-compacting fiber reinforced concrete, we also intended to draw attention to the environmental factor. Large amounts of zeolitic waste, a secondary raw material, are not being used properly and are collected without a clear view of their future use. The principal aim of this work is to verify, using experimental research methods, that the zeolitic waste admixture has a positive effect on the stability, flowability and other properties of self-compacting fiber reinforced concrete mixes. In addition, research on cement and zeolitic waste mortars was carried out to clarify the effect of zeolitic waste on the properties of cement paste and stone. Preliminary studies indicate that the zeolitic waste exhibits clear pozzolanic behavior, does not deteriorate the mixes and in some cases improves the rheological and mechanical characteristics of self-compacting concrete mixes.

Keywords: Self compacting concrete, steel fiber reinforced concrete, zeolitic waste, rheological properties of concrete, slump flow.

496 Hydraulic Studies on Core Components of PFBR

Authors: G. K. Pandey, D. Ramadasu, I. Banerjee, V. Vinod, G. Padmakumar, V. Prakash, K. K. Rajan

Abstract:

Detailed thermal hydraulic investigations are essential for the safe and reliable functioning of liquid metal cooled fast breeder reactors. These investigations are all the more important for components with complex profiles, since no direct correlation is available in the literature to evaluate the hydraulic characteristics of such components; correlations available for similar profiles or geometries may lead to significant uncertainty in the outcome. Hence, an experimental approach can be adopted to evaluate these hydraulic characteristics more precisely for better prediction of reactor core components. The Prototype Fast Breeder Reactor (PFBR), a sodium cooled pool type reactor, is at an advanced stage of construction at Kalpakkam, India. Several components of its core require hydraulic investigation before their use in the reactor. These hydraulic investigations on full scale models, carried out experimentally using water as the simulant fluid, are discussed in this paper.

Keywords: Fast Breeder Reactor, Cavitation, pressure drop, Reactor components.

495 A Simple Epidemiological Model for Typhoid with Saturated Incidence Rate and Treatment Effect

Authors: Steady Mushayabasa

Abstract:

Typhoid fever is a communicable disease, found only in humans, caused by systemic infection mainly with the organism Salmonella typhi. The disease is endemic in many developing countries and remains a substantial public health problem despite recent progress in water and sanitation coverage. Globally, it is estimated that typhoid causes over 16 million cases of illness each year, resulting in over 600,000 deaths. A mathematical model for assessing the impact of educational campaigns on controlling the transmission dynamics of typhoid in the community has been formulated and analyzed. The reproductive number has been computed and the stability of the model steady states examined. The impact of educational campaigns on controlling the transmission dynamics of typhoid is discussed through the basic reproductive number and numerical simulations. The study suggests that targeted education campaigns which are effective at stopping transmission of typhoid more than 40% of the time will be highly effective at controlling the disease in the community.
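For reference, a commonly used form of the saturated incidence rate in models of this kind is shown below; this is the standard textbook formulation and an assumption about the paper's notation, whose exact model may differ:

\[
\lambda(I) = \frac{\beta S I}{1 + \alpha I},
\qquad
\frac{dS}{dt} = \Lambda - \frac{\beta S I}{1 + \alpha I} - \mu S,
\]

where \(\beta\) is the effective contact rate, \(\alpha\) measures the saturation (behavioral inhibition) as infections rise, \(\Lambda\) is the recruitment rate and \(\mu\) the natural removal rate.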

Keywords: Mathematical model, Typhoid, saturated incidence rate, treatment, reproductive number, sensitivity analysis.

494 Pathological Truth: The Use of Forensic Science in Kenya’s Criminal Justice System

Authors: Peter Ndichu Muriuki

Abstract:

Assassinations of politicians, school mass murders, purported suicides, aircraft crashes, mass shootings by police, sinking of sea ferries, mysterious car accidents, mass fire deaths and horrific terror attacks are some of the cases that bring forth scientific and legal conflicts. Questions about truth, justice and human rights are raised by both victims and perpetrators/offenders as they seek to understand why and how it happened to them. This kind of questioning manifests itself in medical, criminological, legal, psychological and scientific realms. An agreement on truth investigations for possible legal, political and psychological transitional issues such as prosecution, victim-offender mediation, healing, reconciliation, amnesty, reparation, restitution, and policy formulation is seen as one way of transforming these conflicts. Forensic scientists, and pathologists in particular, have formed professional groups where the complexities between legal truth and scientific truth are dramatized and elucidated within the anatomy of courtrooms. This paper focuses on how pathological truth and legal truth interact with each other in Kenya's criminal justice system.

Keywords: Forensic pathology, forensic science, pathological truth, truth investigations.

493 Design for Safety: Safety Consideration in Planning and Design of Airport Airsides

Authors: Maithem Al-Saadi, Min An

Abstract:

During the airport planning and design stages, the major issues of capacity and safety in the construction and operation of an airport need to be taken into consideration. The airside of an airport is a major and critical infrastructure that usually consists of runway(s), a taxiway system, apron(s), etc., which have to be designed according to international standards and recommendations, and local limitations, to accommodate the forecast demand. However, in many cases airport airsides suffer from unexpected risks that occur during airport operations. Therefore, safety risk assessment should be applied in the planning and design of airsides to cope with the probability of risks and their consequences, and to make decisions that reduce the risks to a level that is as low as reasonably practicable (ALARP). This paper presents a combined approach of Failure Modes, Effects, and Criticality Analysis (FMECA), a Fuzzy Reasoning Approach (FRA), and Fuzzy Analytic Hierarchy Process (FAHP) to develop a risk analysis model for safety risk assessment. An illustrative example is used to demonstrate the risk assessment process and how the design of an airport airside can be analysed using the proposed safety design risk assessment model.

Keywords: Airport airside planning and design, design for safety, fuzzy reasoning approach, fuzzy AHP, risk assessment.

492 Investigations of Free-to-Roll Motions and its Active Control under Pitch-up Maneuvers

Authors: Tanveer A. Khan, Xue Y. Deng, Yan K. Wang, Xu Si-Wen

Abstract:

Experiments have been carried out at sub-critical Reynolds number to investigate free-to-roll motions induced by forebody and/or wing complex flow on a 30° swept-back non-slender wings-slender body model for static and dynamic (pitch-up) cases. For the dynamic (pitch-up) case, it has been observed that the roll amplitude decreases and the lag increases with increasing pitching speed. The decrease in roll amplitude with increasing pitch rate is attributed to a weaker disturbing rolling moment caused by reduced interaction between the forebody and wing flow components. Asymmetric forebody vortices dominate and control the roll motion of the model in the dynamic case when the non-dimensional pitch rate is ≥ 1×10⁻². The effectiveness of the active control scheme utilizing a rotating nose with artificial tip perturbation is observed to be low in the angle-of-attack region where the complex flow over the wings has contributions from both the forebody and the wings.

Keywords: Artificial Tip Perturbation, Experimental Investigations, Forebody Asymmetric Vortices, Non-slender Wings-Body Model, Wing Rock.

491 Numerical Simulation on Heat Transfer Enhancement in Channel by Triangular Ribs

Authors: Tuqa Abdulrazzaq, Hussein Togun, M. K. A. Ariffin, S. N. Kazi, N. M. Adam, S. Masuri

Abstract:

Turbulent heat transfer to fluid flow through a channel with triangular ribs of different angles is presented in this paper. ANSYS 14 ICEM and ANSYS 14 Fluent are used for the meshing process and for solving the Navier-Stokes equations, respectively. In this investigation, three triangular rib angles are considered, with the Reynolds number varied from 20000 to 60000 at constant surface temperature. The results show that the Nusselt number increases with increasing Reynolds number for all cases at constant surface temperature. According to the profile of the local Nusselt number on the ribbed wall of the channel, the peak occurs at the midpoint between two ribs. The maximum average Nusselt number is obtained for triangular ribs of angle 60° at a Reynolds number of 60000, compared with the ribs of angle 90° and 45° at the same Reynolds number. The recirculation regions generated by the ribs, shown by the velocity streamlines, are largest for the triangular ribs of angle 60°, which also provide the highest enhancement of heat transfer.

Keywords: Ribs channel, Turbulent flow, Heat transfer enhancement, Recirculation flow.

490 Simulation of Reflection Loss for Carbon and Nickel-Carbon Thin Films

Authors: M. Emami, R. Tarighi, R. Goodarzi

Abstract:

Maximal radar wave absorption cannot be achieved by shaping alone; we also have to focus on the parameters of the absorbing materials, such as permittivity, permeability, and thickness, so that the best absorption for a given requirement can be achieved. The real and imaginary parts of the relative complex permittivity (εr' and εr") and permeability (µr' and µr") were obtained by simulation. The microwave absorbing properties of carbon and Ni(C) are simulated in this study in MATLAB; the simulation covers the frequency range from 2 to 12 GHz for carbon black (C) and carbon-coated nickel (Ni(C)) with different thicknesses. In effect, we plot the reflection loss (RL) of C and Ni(C) versus frequency. We compared their absorption at 3 mm thickness and predicted the behavior at other thicknesses using electromagnetic wave transmission theory. The results show that the position of the reflection loss minimum shifts to lower frequencies with increasing thickness. We found that, in all cases, using nanocomposites as the absorber does not give better results than pure nanoparticles; the frequency at which absorption is maximal determines the best choice between nanocomposites and pure nanoparticles. An optimal thickness could also be found for long-wavelength absorption, for use in protective shields and coverings.
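The reflection loss curves described above are presumably computed from the standard single-layer, metal-backed transmission-line model; this is an assumption about the MATLAB implementation, since the abstract does not spell out the formula:

\[
Z_{in} = Z_0 \sqrt{\frac{\mu_r}{\varepsilon_r}} \tanh\!\left( j\,\frac{2\pi f d}{c} \sqrt{\mu_r \varepsilon_r} \right),
\qquad
RL(\mathrm{dB}) = 20 \log_{10} \left| \frac{Z_{in} - Z_0}{Z_{in} + Z_0} \right|,
\]

where \(d\) is the absorber thickness, \(f\) the frequency, \(c\) the speed of light in vacuum and \(Z_0\) the impedance of free space; the RL minimum marks the frequency of maximum absorption.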

Keywords: Absorbing, carbon, carbon nickel, frequency, thicknesses.

489 The Contribution of the PCR-Enzymatic Digestion in the Positive Diagnosis of Proximal Spinal Muscular Atrophy in the Moroccan Population

Authors: H. Merhni, A. Sbiti, I. Ratbi, A. Sefiani

Abstract:

Proximal spinal muscular atrophy (SMA) is a group of neuromuscular disorders characterized by progressive muscle weakness due to the degeneration and loss of anterior motor neurons of the spinal cord. Depending on the age of onset of symptoms and their evolution, four types of SMA, varying in severity, result from mutations of the SMN (survival motor neuron) gene. We analyzed the DNA of 295 patients referred to our genetic counseling service for suspected SMA between January 1996 and October 2014. The homozygous deletion of exon 7 of the SMN gene was found in 133 patients, of whom 40.6% were born to consanguineous parents. In countries like Morocco, where the frequency of SMA heterozygotes is high, genetic testing should be offered as a first-line investigation after careful clinical assessment, especially in newborns and infants with unexplained congenital hypotonia and a compromised prognosis. The molecular diagnosis of SMA allows a quick and certain diagnosis, provides adequate genetic counseling for families at risk, and makes prenatal diagnosis possible for couples who want it. The analysis of the SMN gene is a perfect example of genetic testing with an excellent cost/benefit ratio that can be of great interest in public health, especially in low-income countries. In this work, we argue for the wider adoption of the molecular diagnosis of SMA by the PCR-enzymatic digestion technique in other centers in Morocco.

Keywords: Exon 7, PCR-digestion, SMA, SMN gene.

488 The Use of Fractional Brownian Motion in the Generation of Bed Topography for Bodies of Water Coupled with the Lattice Boltzmann Method

Authors: Elysia Barker, Jian Guo Zhou, Ling Qian, Steve Decent

Abstract:

A method of modelling the topography used in the simulation of riverbeds is proposed in this paper which removes the need for data points and measurements of a physical terrain. While detailed scans of the contours of a surface can be achieved with other methods, these require specialised tools; the proposed method instead uses fractional Brownian motion (FBM) as a basis to estimate the real surface within a 15% margin of error while attempting to optimise algorithmic efficiency. This removes the need for complex, expensive equipment and reduces the resources spent modelling bed topography. The method also accounts for changes in topography over time due to erosion, sediment transport, and other external factors by updating its parameters and generating a new bed. The lattice Boltzmann method (LBM) is used to simulate both stationary and steady flow cases in a side-by-side comparison over the bed topography generated by the proposed method and over a test case taken from an external source. If successful, the method will be incorporated into the current LBM program used in the testing phase, allowing automatic generation of topography for a given situation in future research and removing the need for bed data to be specified.
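A minimal sketch of one common way to generate an FBM-like bed profile (spectral synthesis with a \(1/f^{\,2H+1}\) power law) is shown below; the Hurst exponent, grid size and amplitude scaling are hypothetical, and the paper's actual generation algorithm may differ.

```python
import numpy as np

def fbm_profile(n=512, hurst=0.7, amplitude=1.0, seed=None):
    """Spectral-synthesis sketch of a 1-D fractional-Brownian-motion bed profile."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=1.0)
    freqs[0] = np.inf                                   # suppress the zero-frequency (mean) term
    # Amplitude spectrum of 1-D fBm: power falls off as 1/f^(2H+1)
    spectrum = freqs ** (-(2.0 * hurst + 1.0) / 2.0)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))  # random phases
    coeffs = spectrum * np.exp(1j * phases)
    profile = np.fft.irfft(coeffs, n=n)
    return amplitude * profile / np.abs(profile).max()  # scale to the desired bed amplitude

# Hypothetical bed: 256 nodes, Hurst exponent 0.75, 5 cm amplitude
bed = fbm_profile(n=256, hurst=0.75, amplitude=0.05, seed=42)
```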

Keywords: Bed topography, FBM, LBM, shallow water, simulations.

487 The Role of Velocity Map Quality in Estimation of Intravascular Pressure Distribution

Authors: Ali Pashaee, Parisa Shooshtari, Gholamreza Atae, Nasser Fatouraee

Abstract:

Phase-contrast MR imaging methods are widely used for the measurement of blood flow velocity components, and other tools such as CT and ultrasound are also available for velocity map detection in intravascular studies. These data are used in deriving flow characteristics, and some clinical applications use the pressure distribution in the diagnosis of intravascular disorders such as vascular stenosis. In this paper, an approach to measuring the intravascular pressure field from the velocity field obtained from flow images is proposed. The method uses an algorithm to solve the nonlinear Navier-Stokes equations, treating blood as an incompressible Newtonian fluid. Flow images usually suffer from a lack of spatial resolution, so our aim is to consider the effect of spatial resolution on the pressure distribution estimated by this method. To this end, the velocity map of a numerical phantom is derived at six different spatial resolutions. To determine the effects of vascular stenoses on the pressure distribution, a stenotic phantom geometry is considered. A comparison between the pressure distribution obtained from the phantom and that produced by the algorithm is presented. We also compare the effects of collocated and staggered computational grids on the pressure distribution produced by the algorithm.
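One common route from a measured velocity field to the pressure field, consistent with the incompressible Newtonian assumptions above, is the pressure Poisson equation obtained by taking the divergence of the momentum equation; this is a sketch of the general approach, not necessarily the exact algorithm used in the paper:

\[
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \mu \nabla^2 \mathbf{u},
\qquad \nabla\cdot\mathbf{u}=0
\;\;\Rightarrow\;\;
\nabla^2 p = -\rho\, \nabla\cdot\left(\mathbf{u}\cdot\nabla\mathbf{u}\right),
\]

which is solved on the imaging grid with Neumann boundary conditions derived from the momentum equation at the vessel walls; the choice of collocated versus staggered grid affects how the divergence and gradient operators are discretized.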

Keywords: Flow imaging, pressure distribution estimation, phantom, resolution.

486 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area

Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim

Abstract:

In intelligent transportation systems (ITS), information on link characteristics is an essential factor for planning and operation. In practice, however, not every link has sensors installed on it; a link with no data is called a "missing link". The purpose of this study is to impute the data of these missing links using machine learning, in particular deep learning, so that missing link data can be estimated from the data of links that are instrumented. For the deep learning process, this study uses a recurrent neural network (RNN) to handle the time-series nature of road data. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The network takes the 17 links with available data as input and, through 2 hidden layers, estimates the data of 1 missing link; a sketch of this set-up is given below. As a result, the forecast data of the target link show about 94% accuracy compared with the actual data.
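A minimal sketch of this many-to-one recurrent set-up is shown below, written in PyTorch for illustration; the authors' framework, layer sizes, time-window length and training details are not specified in the abstract and are assumptions here.

```python
import torch
import torch.nn as nn

class MissingLinkRNN(nn.Module):
    """Speeds of 17 observed links over a time window -> speed of 1 missing link."""
    def __init__(self, n_links=17, hidden=64):
        super().__init__()
        self.rnn = nn.RNN(input_size=n_links, hidden_size=hidden,
                          num_layers=2, batch_first=True)   # two recurrent (hidden) layers
        self.out = nn.Linear(hidden, 1)                      # estimated speed of the missing link

    def forward(self, x):                 # x: (batch, time_steps, 17)
        h, _ = self.rnn(x)
        return self.out(h[:, -1, :])      # prediction from the last time step

model = MissingLinkRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy batch: 32 samples, 12 time steps, 17 observed-link speeds (hypothetical shapes)
x = torch.randn(32, 12, 17)
y = torch.randn(32, 1)
for _ in range(5):                        # a few illustrative training steps
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```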

Keywords: Data Estimation, link data, machine learning, road network.

485 On the Variability of Tool Wear and Life at Disparate Operating Parameters

Authors: S. E. Oraby, A. M. Alaskari

Abstract:

The stochastic nature of tool life derived from conventional discrete wear data in experimental tests arises from many individual and interacting parameters. It is common practice in batch production to continually use the same tool to machine different parts using disparate machining parameters. In such an environment, the optimal points at which tools have to be changed, while achieving minimum production cost and maximum production rate within the surface roughness specifications, have not been adequately studied. In the current study, two relevant aspects are investigated using coated and uncoated inserts in turning operations: (i) the accuracy of using machinability information from fixed-parameter testing procedures when variable-parameter situations arise, and (ii) the credibility of tool life machinability data from prior discrete testing procedures in non-stop machining. A novel technique is proposed and verified to normalize conventional fixed-parameter machinability data to cases in which the parameters have to be changed for the same tool. An experimental investigation is also carried out to evaluate the error in tool life assessment when machinability data from discrete testing procedures are employed in uninterrupted practical machining.

Keywords: Machinability, tool life, tool wear, wear variability.

484 A Comparison of Some Thresholding Selection Methods for Wavelet Regression

Authors: Alsaidi M. Altaher, Mohd T. Ismail

Abstract:

In wavelet regression, choosing the threshold value is a crucial issue. Too large a value cuts too many coefficients, resulting in oversmoothing; conversely, too small a value allows many coefficients to be included in the reconstruction, giving a wiggly estimate and hence undersmoothing. The proper choice of threshold is therefore a careful balance between these two effects. This paper gives a very brief introduction to some threshold selection methods: universal thresholding, SURE, EBayes, two-fold cross-validation and level-dependent cross-validation. A simulation study over a variety of sample sizes, test functions and signal-to-noise ratios is conducted to compare their numerical performance under three different noise structures. For Gaussian noise, EBayes performs best in all cases for all test functions, while two-fold cross-validation provides the best results in the case of long-tailed noise. For large signal-to-noise ratios, level-dependent cross-validation works well in the correlated-noise case. As expected, increasing either the sample size or the signal-to-noise ratio increases estimation efficiency.
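As an illustration of one of the rules compared above, the sketch below applies the universal (VisuShrink) threshold with soft thresholding using PyWavelets; the wavelet, decomposition level and test signal are arbitrary choices, not the paper's simulation design.

```python
import numpy as np
import pywt

def universal_shrink(y, wavelet="sym8", level=4):
    """Soft-threshold wavelet denoising with the universal threshold sigma*sqrt(2*log n)."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest scale (MAD)
    lam = sigma * np.sqrt(2.0 * np.log(len(y)))           # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)

x = np.linspace(0, 1, 1024)
y = np.sin(4 * np.pi * x) + 0.3 * np.random.randn(1024)   # noisy test signal (illustrative)
estimate = universal_shrink(y)
```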

Keywords: Wavelet regression, simulation, threshold.

483 Evaluation of a Hybrid Knowledge-Based System Using Fuzzy Approach

Authors: Kamalendu Pal

Abstract:

This paper describes the main features of a knowledge-based system evaluation method. System evaluation is placed in the context of a hybrid legal decision-support system, Advisory Support for Home Settlement in Divorce (ASHSD). Legal knowledge for ASHSD is represented in two forms, as rules and as previously decided cases. Besides distinguishing these two forms of knowledge representation, the paper outlines their actual use in a computational framework that is designed to generate a plausible solution for a given case by using rule-based reasoning (RBR) and case-based reasoning (CBR) in an integrated environment. The suitability assessment of a solution has been treated as a multiple-criteria decision-making process in the ASHSD evaluation. The evaluation was performed through a combination of discussions and questionnaires with different user groups, and the questionnaire answers were measured as fuzzy linguistic terms. The findings suggest that fuzzy linguistic evaluation is practical and meaningful for knowledge-based system development.

Keywords: Case-based reasoning, decision-support system, fuzzy linguistic term, rule-based reasoning, system evaluation.

482 Analysis of GI/M(n)/1/N Queue with Single Working Vacation and Vacation Interruption

Authors: P. Vijaya Laxmi, V. Goswami, V. Suchitra

Abstract:

This paper presents a finite-buffer, renewal-input, single working vacation and vacation interruption queue with state-dependent services and state-dependent vacations, which has a wide range of applications in several areas including manufacturing and wireless communication systems. Service times during the busy period and the vacation period, as well as the vacation times, are exponentially distributed and state dependent. As a result of the finite waiting space, state-dependent services and state-dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method, using the supplementary variable technique, to compute the stationary queue length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm for the model is presented which is fast, accurate and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.

Keywords: State Dependent Service, Vacation Interruption, Supplementary Variable, Single Working Vacation, Blocking Probability.

481 Evaluation of the Hepatitis C Virus and Classical and Modern Immunoassays Used Nowadays to Diagnose It in Tirana

Authors: Stela Papa, Klementina Puto, Migena Pllaha

Abstract:

HCV is a hepatotropic RNA virus, transmitted primarily via the blood route, which causes progressive diseases such as chronic hepatitis, liver cirrhosis, and hepatocellular carcinoma, and is now a global healthcare problem. A variety of immunoassays, spanning old and new technologies, are applied to detect HCV in our country. These methods include immunochromatographic assays (ICA), fluorescence immunoassay (FIA), enzyme-linked fluorescent assay (ELFA), and enzyme-linked immunosorbent assay (ELISA) for the detection of HCV antibodies in blood serum; the latter is gradually being replaced by more sensitive methods such as rapid automated chemiluminescence immunoassay (CLIA). The aim of this study is to estimate HCV infection in carriers and in acute and chronic patients and to evaluate the use of the newer diagnostic methods. The study was carried out from September 2016 to May 2018, during which blood serum samples from 2913 patients were analyzed for the presence of HCV using the ICA, FIA, ELFA, ELISA, and CLIA assays. In conclusion, 82% of the patients included in this study were found to be infected with HCV. Diagnostic methods in clinical laboratories are crucial in the early stages of infection, in the management of chronic hepatitis, and in the treatment of patients during their disease.

Keywords: CLIA, ELISA, hepatitis C virus, immunoassay.

480 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System

Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar

Abstract:

The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The system is based on a) colon cancer clinical data and b) biological information that will be derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the associations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address its high complexity, potential inconsistency and the problem of missing values. They must integrate all the useful information necessary to answer the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify, the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses, and adjust the algorithms used accordingly.

Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.

479 Understanding Evolutionary Algorithms through Interactive Graphical Applications

Authors: Javier Barrachina, Piedad Garrido, Manuel Fogue, Julio A. Sanguesa, Francisco J. Martinez

Abstract:

It is very common to observe, especially in Computer Science studies, that students have difficulty correctly understanding how some mechanisms based on Artificial Intelligence work. In addition, the scope and limitations of most of these mechanisms are usually presented by professors only in a theoretical way, which does not help students to understand them adequately. In this work, we focus on the problems found when teaching Evolutionary Algorithms (EAs), which imitate the principles of natural evolution, as a method to solve parameter optimization problems. Although these algorithms can be very powerful for solving relatively complex problems, students often have difficulty understanding how they work and how to apply them to real cases. In this paper, we present two interactive graphical applications specially designed to make Evolutionary Algorithms easy for students to understand. Specifically, we present: (i) TSPS, an application able to solve the "Traveling Salesman Problem", and (ii) FotEvol, an application able to reconstruct a given image using Evolution Strategies. The main objective is that students learn how these techniques can be implemented, and the great possibilities they offer; a minimal example of such an evolutionary loop is sketched below.
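The sketch below shows the kind of evolutionary loop the TSPS application visualizes: tours evolved by swap mutation and elitist selection. The city set, population size and generation count are arbitrary teaching-style parameters, not taken from the applications themselves.

```python
import random

cities = [(random.random(), random.random()) for _ in range(20)]   # random city coordinates

def tour_length(tour):
    """Total Euclidean length of a closed tour over the city indices."""
    return sum(((cities[a][0] - cities[b][0]) ** 2 +
                (cities[a][1] - cities[b][1]) ** 2) ** 0.5
               for a, b in zip(tour, tour[1:] + tour[:1]))

def mutate(tour):
    """Swap two randomly chosen cities (a simple mutation operator)."""
    i, j = random.sample(range(len(tour)), 2)
    child = tour[:]
    child[i], child[j] = child[j], child[i]
    return child

population = [random.sample(range(len(cities)), len(cities)) for _ in range(30)]
for generation in range(500):
    offspring = [mutate(random.choice(population)) for _ in range(30)]
    population = sorted(population + offspring, key=tour_length)[:30]   # elitist selection

print(tour_length(population[0]))   # length of the best tour found
```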

Keywords: Education, evolutionary algorithms, evolution strategies, interactive learning applications.

478 The Ability of Forecasting the Term Structure of Interest Rates Based On Nelson-Siegel and Svensson Model

Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović

Abstract:

Given the importance of the yield curve and its estimation, it is essential to have valid methods for yield curve forecasting in cases where security issues are scarce and/or secondary-market trading is weak. Therefore, in this paper, after estimating weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, the yield curves are forecast using a vector autoregressive model and neural networks. In general, it can be concluded that both forecasting methods have good predictive ability, with forecasts of yield curves based on the Nelson-Siegel estimation model giving better results, in the sense of a lower mean squared error, than forecasts based on the Svensson model. In this case, neural networks also provide slightly better results. Finally, it can be concluded that the most appropriate way to predict the yield curve is with neural networks applied to Nelson-Siegel estimates of the yield curves.
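For reference, the Nelson-Siegel yield curve and its Svensson extension take the following standard forms; parameterisation conventions vary slightly across the literature, so treat this as a generic statement rather than the authors' exact specification:

\[
y_{NS}(\tau) = \beta_0 + \beta_1 \frac{1 - e^{-\tau/\lambda}}{\tau/\lambda}
+ \beta_2 \left( \frac{1 - e^{-\tau/\lambda}}{\tau/\lambda} - e^{-\tau/\lambda} \right),
\]
\[
y_{SV}(\tau) = y_{NS}(\tau)
+ \beta_3 \left( \frac{1 - e^{-\tau/\lambda_2}}{\tau/\lambda_2} - e^{-\tau/\lambda_2} \right),
\]

where \(\tau\) is time to maturity, \(\beta_0,\ldots,\beta_3\) capture level, slope and curvature, and \(\lambda, \lambda_2\) are decay parameters; forecasting then amounts to modelling the time series of the fitted \(\beta\) coefficients.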

Keywords: Nelson-Siegel model, Neural networks, Svensson model, Vector autoregressive model, Yield curve.

477 Impact of Increasing Distributed Solar PV Systems on Distribution Networks in South Africa

Authors: Aradhna Pandarum

Abstract:

South Africa is experiencing exponential growth in distributed solar PV installations. This is due to various factors, the predominant ones being increasing electricity tariffs and decreasing installation costs, which result in attractive business cases for some end-users. Despite the variety of economic and environmental advantages associated with the installation of PV, its potential impact on distribution grids has yet to be thoroughly investigated. This is especially true since the locations of these units cannot be controlled by Network Service Providers (NSPs) and their output power is stochastic and non-dispatchable. This report details two case studies completed to determine the possible impact of increasing PV penetration on voltage and technical losses in the Northern Cape of South Africa. The major factors considered in the simulations were the ramping of PV generation due to intermittency caused by moving clouds, the size and overall hosting capacity, and the location of the systems. The main finding is that the technical impact differs between a constrained feeder and a non-constrained feeder: the acceptable PV penetration level is much lower for a constrained feeder than for a non-constrained feeder, depending on where the systems are located.

Keywords: Medium voltage networks, power system losses, power system voltage, solar photovoltaic, PV.

476 Response of Fully Backed Sandwich Beams to Low Velocity Transverse Impact

Authors: M. Sadighi, H. Pouriayevali, M. Saadati

Abstract:

This paper describes the analysis of low-velocity transverse impact on fully backed sandwich beams with E-glass/epoxy composite faces and cores of polyurethane or PVC. Indentation of the sandwich beams has been analyzed with existing theories and modeled with the FE code ABAQUS, and loading tests have been performed experimentally to verify the theoretical results. Impact on fully backed beams has been modeled for two cases of impactor energy using a single-degree-of-freedom (SDOF) model and the indentation stiffness: lower energy for elastic indentation of the sandwich beams and higher energy for indentation with a plastic region. The impacts have also been modeled in ABAQUS. The impact results describe the response of the beam in terms of core and face thicknesses, core material, impactor energy and energy absorbed. The foam core is modeled using the crushable foam material model, and the response of the foam core is experimentally characterized in uniaxial compression at higher loading velocity to define its quasi-impact behaviour.

Keywords: Low velocity impact, fully backed, indentation, sandwich beams, foams, finite element.

475 Adjusted Ratio and Regression Type Estimators for Estimation of Population Mean When Some Observations Are Missing

Authors: Nuanpan Nangsue

Abstract:

Ratio and regression type estimators have been used by previous authors to estimate a population mean for the principal variable from samples in which data on both the auxiliary variable x and the principal variable y are available. However, missing data are a common problem in statistical analyses with real data, and ratio and regression type estimators have also been used for imputing missing y values. In this paper, six new ratio and regression type estimators are proposed for imputing values for any missing y data and for estimating a population mean for y from samples with missing x and/or y data. A simulation study has been conducted to compare the six proposed estimators with a previous estimator of Rueda. Two population sizes, N = 1,000 and 5,000, were considered, with sampling fractions of 10% and 30% and with correlation coefficients between the population variables X and Y of 0.5 and 0.8. In the simulations, 10 and 40 percent of sample y values and 10 and 40 percent of sample x values were randomly designated as missing. The new ratio and regression type estimators give similar mean absolute percentage errors that are smaller than those of the Rueda estimator in all cases, with a large reduction in errors for the case of 40% missing y values and a sampling fraction of 30%.
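For context, the classical ratio and regression estimators of the population mean, which estimators of this type adjust, have the standard forms below (the six proposed estimators themselves are not reproduced here):

\[
\bar{y}_R = \bar{y}\,\frac{\bar{X}}{\bar{x}},
\qquad
\bar{y}_{lr} = \bar{y} + b\left(\bar{X} - \bar{x}\right),
\quad b = \frac{s_{xy}}{s_x^2},
\]

where \(\bar{X}\) is the known population mean of the auxiliary variable, \(\bar{x}\) and \(\bar{y}\) are the sample means, and \(b\) is the sample regression coefficient of y on x.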

Keywords: Auxiliary variable, missing data, ratio and regression type estimators.
