Search results for: preventive methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15397

14347 Numerical Studies for Standard Bi-Conjugate Gradient Stabilized Method and the Parallel Variants for Solving Linear Equations

Authors: Kuniyoshi Abe

Abstract:

Bi-conjugate gradient (Bi-CG) is a well-known method for solving linear equations Ax = b for x, where A is a given n-by-n matrix and b is a given n-vector. Typically, the dimension of the linear equation is high and the matrix is sparse. A number of hybrid Bi-CG methods, such as conjugate gradient squared (CGS), Bi-CG stabilized (Bi-CGSTAB), BiCGStab2, and BiCGstab(l), have been developed to improve the convergence of Bi-CG. Bi-CGSTAB has been most often used for efficiently solving the linear equation, but its convergence behavior sometimes exhibits a long stagnation phase. In such cases, it is important to have Bi-CG coefficients that are as accurate as possible, and a stabilization strategy, which stabilizes the computation of the Bi-CG coefficients, has been proposed. It may avoid stagnation and lead to faster computation. Motivated by the large number of processors in present petascale high-performance computing hardware, the scalability of Krylov subspace methods on parallel computers has recently become increasingly prominent. The main bottleneck for efficient parallelization is the inner products, which require a global reduction. The resulting global synchronization phases cause communication overhead on parallel computers. Parallel variants of Krylov subspace methods that reduce the number of global communication phases and hide the communication latency have been proposed. However, the numerical stability, and specifically the convergence speed, of the parallel variants of Bi-CGSTAB may become worse than that of the standard Bi-CGSTAB. In this paper, therefore, we compare the convergence speed of the standard Bi-CGSTAB and the parallel variants by numerical experiments and show that the convergence speed of the standard Bi-CGSTAB is faster than that of the parallel variants. Moreover, we propose a stabilization strategy for the parallel variants.
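
For reference, the sketch below is a minimal, unpreconditioned Bi-CGSTAB iteration in Python/NumPy (an illustration of the standard algorithm, not the authors' implementation). The inner products computed in each iteration, marked in the comments, are exactly the global reductions that the parallel variants discussed above try to reduce or hide.

```python
import numpy as np

def bicgstab(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Minimal unpreconditioned Bi-CGSTAB for Ax = b."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x
    r_hat = r.copy()                       # fixed shadow residual
    rho_old = alpha = omega = 1.0
    v = np.zeros(n)
    p = np.zeros(n)
    b_norm = np.linalg.norm(b)
    for k in range(max_iter):
        rho = r_hat @ r                    # inner product -> global reduction in parallel
        beta = (rho / rho_old) * (alpha / omega)
        p = r + beta * (p - omega * v)
        v = A @ p
        alpha = rho / (r_hat @ v)          # inner product -> global reduction in parallel
        s = r - alpha * v
        t = A @ s
        omega = (t @ s) / (t @ t)          # stabilization parameter (two more reductions)
        x = x + alpha * p + omega * s
        r = s - omega * t
        if np.linalg.norm(r) < tol * b_norm:
            return x, k + 1
        rho_old = rho
    return x, max_iter

# Example on a random, diagonally dominant system
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200)) + 200 * np.eye(200)
b = rng.standard_normal(200)
x, iters = bicgstab(A, b)
print(iters, np.linalg.norm(A @ x - b))
```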

Keywords: bi-conjugate gradient stabilized method, convergence speed, Krylov subspace methods, linear equations, parallel variant

Procedia PDF Downloads 157
14346 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions and statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic-light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission's proposed Regulation on Artificial Intelligence.
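
The abstract does not define PoCE formally; the sketch below is one plausible operationalization (the share of instances whose explanation ranks the features in the same order as a ground-truth attribution), shown on synthetic grouped data with a deliberately group-blind surrogate explanation. The data, the model, the "explanation", and the scoring rule are all illustrative assumptions, not the authors' definitions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic grouped (multilevel) binary-classification data with known coefficients
n, p, groups = 2000, 4, 5
X = rng.normal(size=(n, p))
g = rng.integers(0, groups, n)
beta = np.array([2.0, -1.0, 0.5, 0.0])            # fixed effects (ground truth)
u = rng.normal(0, 1.0, groups)                     # random intercepts per group
logits = X @ beta + u[g]
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

# Ground-truth local attribution for a logistic model: beta_j * x_ij
truth = X * beta

# Stand-in "explanation": attributions from a group-blind logistic fit obtained
# with a few hundred gradient-descent steps (no external explainer library).
w = np.zeros(p)
for _ in range(500):
    pred = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (pred - y) / n
explained = X * w

def poce(expl, truth):
    """Percentage of instances whose feature-importance ordering matches the truth."""
    same = (np.argsort(-np.abs(expl), axis=1) == np.argsort(-np.abs(truth), axis=1)).all(axis=1)
    return 100.0 * same.mean()

print(f"PoCE = {poce(explained, truth):.1f}%")
```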

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 87
14345 Regional Flood Frequency Analysis in Narmada Basin: A Case Study

Authors: Ankit Shah, R. K. Shrivastava

Abstract:

Floods and droughts are two main hydrological extremes that affect human life. Floods are natural disasters which cause millions of rupees' worth of damage each year in India and the whole world. Floods cause destruction in the form of loss of life and property. An accurate estimate of the flood damage potential is a key element of an effective, nationwide flood damage abatement program. Also, the increasing demand for water due to growth in population, industry, and agriculture shows that, although water is a renewable resource, it cannot be taken for granted. We have to optimize the use of water according to circumstances and conditions and need to harness it, which can be done by the construction of hydraulic structures. For the safe and proper functioning of hydraulic structures, we need to predict the flood magnitude and its impact. Hydraulic structures play a key role in harnessing and optimizing flood water, which in turn results in the safe and maximum use of the water available. Hydraulic structures are mainly constructed at ungauged sites. There are two methods by which we can estimate floods, viz. generation of unit hydrographs and flood frequency analysis. In this study, regional flood frequency analysis has been employed. There are many methods for regional flood frequency analysis, viz. the Index Flood Method, the Natural Environment Research Council (NERC) methods, the Multiple Regression Method, etc. However, none of the methods can be considered universal for every situation and location. The Narmada basin is located in Central India. It is drained by many tributaries, most of which are ungauged. Therefore, it is very difficult to estimate floods on these tributaries and in the main river. In this study, Artificial Neural Networks (ANNs) and the Multiple Regression Method are used for the determination of regional flood frequency. The annual peak flood data of 20 gauging sites in the Narmada Basin are used in the present study to determine the regional flood relationships. Homogeneity of the considered sites is determined by using the Index Flood Method. The flood relationships obtained by the two methods are compared with each other, and it is found that the ANN is more reliable than the Multiple Regression Method for the present study area.
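
As an illustration of the kind of comparison described (regional regression versus an ANN on catchment descriptors), the sketch below fits both to a synthetic power-law dataset standing in for 20 gauging sites; the descriptors, the power-law form, and all coefficients are invented placeholders, not the Narmada Basin data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Synthetic regional dataset: catchment area, mean rainfall, slope vs. peak flood
n = 20                                              # gauging sites
area = rng.uniform(100, 5000, n)                    # km^2
rain = rng.uniform(800, 1600, n)                    # mm
slope = rng.uniform(0.001, 0.02, n)
Q = 0.8 * area**0.7 * (rain / 1000)**1.2 * (slope * 100)**0.3 * rng.lognormal(0, 0.15, n)

X = np.log(np.column_stack([area, rain, slope]))
y = np.log(Q)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Multiple (log-linear) regression: log Q = a + b1 log A + b2 log P + b3 log S
mlr = LinearRegression().fit(X_tr, y_tr)

# Small multilayer perceptron on the same descriptors
ann = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                   max_iter=5000, random_state=0).fit(X_tr, y_tr)

for name, model in [("multiple regression", mlr), ("ANN (MLP)", ann)]:
    err = np.abs(np.exp(model.predict(X_te)) - np.exp(y_te)) / np.exp(y_te)
    print(f"{name}: mean absolute relative error on held-out sites = {100 * err.mean():.1f}%")
```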

Keywords: artificial neural network, index flood method, multi layer perceptrons, multiple regression, Narmada basin, regional flood frequency

Procedia PDF Downloads 415
14344 Stress Corrosion Cracking, Parameters Affecting It, Problems Caused by It and Suggested Methods for Treatment: State of the Art

Authors: Adnan Zaid

Abstract:

Stress corrosion cracking (SCC) may be defined as the degradation of the mechanical properties of a material under the combined action of a tensile stress and a corrosive environment acting on a susceptible material. It is a harmful phenomenon which might cause catastrophic fracture without prior warning. In this paper, the SCC process, the parameters affecting it, and the different types of damage caused by it are presented and discussed. The utilization of shot peening as a means of enhancing the resistance of materials to SCC is presented and discussed. Finally, a method for improving the resistance of materials to SCC by refining their grain structure with refining elements prior to use is suggested.

Keywords: stress corrosion cracking, parameters, damages, treatment methods

Procedia PDF Downloads 324
14343 Studies on the Proximate Composition and Functional Properties of Extracted Cocoyam Starch Flour

Authors: Adebola Ajayi, Francis B. Aiyeleye, Olakunke M. Makanjuola, Olalekan J. Adebowale

Abstract:

Cocoyam, a generic term for both Xanthosoma and Colocasia, is a traditional staple root crop in many developing countries in Africa, Asia, and the Pacific. It is mostly cultivated as a food crop and is very rich in vitamin B6, magnesium, and dietary fiber. Cocoyam starch is easily digested and often used for baby food. Drying is a method of food preservation that removes enough moisture from the food so that bacteria, yeasts, and molds cannot grow; it is one of the oldest methods of preserving food. The effects of drying methods on the proximate composition and functional properties of extracted cocoyam starch flour were studied. Freshly harvested cocoyam cultivars at the mature stage were washed with potable water, peeled, washed, and grated. The starch in the grated cocoyam was extracted and dried using sun drying, an oven dryer, and a cabinet dryer. The extracted starch was milled into flour using an Apex mill, packed and sealed in 75-micron low-density polyethylene (LDPE) film with a QN5-3200HI nylon sealing machine, and kept for three months at ambient temperature before analysis. The results showed that the moisture content, ash, crude fiber, fat, protein, and carbohydrate ranged from 6.28% to 12.8%, 2.32% to 3.2%, 0.89% to 2.24%, 1.89% to 2.91%, 7.30% to 10.2%, and 69% to 83%, respectively. The functional properties of the cocoyam starch flour ranged from 2.65 ml/g to 4.84 ml/g for water absorption capacity, 1.95 ml/g to 3.12 ml/g for oil absorption capacity, 0.66 ml/g to 7.82 ml/g for bulk density, and 3.82 to 5.30 ml/g for swelling capacity. No significant difference (p ≥ 0.05) was obtained across the various drying methods used. The drying methods extended the shelf life of the extracted cocoyam starch flour.

Keywords: cocoyam, extraction, oven dryer, cabinet dryer

Procedia PDF Downloads 288
14342 Condition Monitoring of a 3-Ø Induction Motor by Vibration Spectrum Analysis Using FFT Analyzer, a Case Study

Authors: Adinarayana S., Sudhakar I.

Abstract:

Energy conversion is one of the inevitable parts of any industry. It involves either the conversion of mechanical energy into electrical energy or vice versa. The latter conversion, i.e., electrical to mechanical, emphasizes the need for motors. Statistics reveal that about 8% of industries' annual turnover is spent on maintenance. Thus, substantial effort is required to minimize the expenditure incurred on breakdown maintenance. Condition monitoring is one such vibration-based technique, widely used to recognize premature failures, and it paves the way to minimizing the difficulties involved during machinery breakdown. In the present investigation, a squirrel cage induction motor (one of the most frequently used electrical machines) has been chosen as a case study for condition monitoring, to predict its soundness on the basis of the results of an FFT analyser. An accelerometer, which measures acceleration that the FFT analyser converts into vibration and time spectra, was located at various positions on the motor under different conditions. The results obtained from the FFT analyser were compared with ISO standard vibration severity charts to assess the condition of the considered machinery. Initial inspection of the motor revealed that stator faults, broken end rings in the rotor, eccentricity faults, and misalignment between bearings are the troubleshooting areas for the present investigation. From the results of the shaft frequencies, it can be perceived that there is a misalignment between the bearings at both ends. The higher-order harmonics of the fundamental train frequency (FTF) show the presence of cracks, at an incipient stage, on the races of the bearings at both ends. Replacement of the bearings at both the drive end (6306) and the non-drive end (6206), together with an alignment check between the bearings on the shaft, is suggested as a constructive measure towards preventive maintenance of the considered squirrel cage induction motor.
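
The core processing step, turning the accelerometer time waveform into an amplitude spectrum and reading off fault-related peaks, can be sketched as follows. The sampling rate, tone frequencies, and amplitudes are synthetic placeholders, not the paper's measurements; in practice the peaks would be compared against shaft speed, its harmonics, and the bearing defect frequencies (FTF, BPFO, BPFI) computed from the bearing geometry.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic accelerometer record standing in for a measured vibration signal:
# shaft rotation at 24.75 Hz (~1485 rpm), its 2x harmonic (a common misalignment
# indicator), one bearing-defect tone, and noise.
fs = 5000.0                                   # sampling rate (Hz)
t = np.arange(0, 4.0, 1 / fs)
f_shaft, f_defect = 24.75, 89.0
rng = np.random.default_rng(0)
signal = (1.0 * np.sin(2 * np.pi * f_shaft * t)
          + 0.6 * np.sin(2 * np.pi * 2 * f_shaft * t)
          + 0.3 * np.sin(2 * np.pi * f_defect * t)
          + 0.2 * rng.normal(size=t.size))

# Amplitude spectrum via FFT (the vibration spectrum produced from the time waveform)
window = np.hanning(signal.size)
spectrum = np.abs(np.fft.rfft(signal * window)) * 2 / window.sum()
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

# Pick out the dominant spectral peaks for comparison with the expected fault frequencies
peaks, _ = find_peaks(spectrum, height=0.1)
for i in peaks:
    print(f"{freqs[i]:7.2f} Hz   amplitude {spectrum[i]:.3f}")
```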

Keywords: FFT analyser, condition monitoring, vibration spectrum, time wave form

Procedia PDF Downloads 383
14341 Collective Problem Solving: Tackling Obstacles and Unlocking Opportunities for Young People Not in Education, Employment, or Training

Authors: Kalimah Ibrahiim, Israa Elmousa

Abstract:

This study employed the world café method alongside semi-structured interviews within a 'conversation café' setting to engage stakeholders from the public health and primary care sectors. The objective was to collaboratively explore strategies to improve outcomes for young people not in education, employment, or training (NEET). The discussions were aimed at identifying the underlying causes of disparities faced by NEET individuals, exchanging experiences, and formulating community-driven solutions to bolster preventive efforts and shape policy initiatives. A thematic analysis of the qualitative data gathered emphasized the importance of community problem-solving through the exchange of ideas and reflective discussions. Healthcare professionals reflected on their potential roles, pinpointing a significant gap in understanding the specific needs of the NEET population and the unclear distribution of responsibilities among stakeholders. The results underscore the necessity for a unified approach in primary care and the fostering of multi-agency collaborations that focus on addressing social determinants of health. Such strategies are critical not only for the immediate improvement of health outcomes for NEET individuals but also for informing broader policy decisions that can have long-term benefits. Further research is ongoing, delving deeper into the unique challenges faced by this demographic and striving to develop more effective interventions. The study advocates for continued efforts to integrate insights from various sectors to create a more holistic and effective response to the needs of the NEET population, ensuring that future strategies are informed by a comprehensive understanding of their circumstances and challenges.

Keywords: multi-agency working, primary care, public health, social inequalities

Procedia PDF Downloads 25
14340 Patents as Indicators of Innovative Environment

Authors: S. Karklina, I. Erins

Abstract:

The main problem addressed is the very low innovation performance in Latvia. Since Latvia is a Member State of the European Union, it also has to fulfil the set targets and improve its innovation results. Universities are among the main performers providing the innovative capacity of a country. Universities, industry, and government need to cooperate to get the best results. Intellectual property is one of the indicators used to determine the innovation level of a country or organization, and patents are one of the characteristics of intellectual property. The objective of the article is to determine the indicators characterizing the innovative environment in Latvia and the influence of the development of universities on them. The methods used in the article to achieve these objectives are quantitative and qualitative analysis of the literature, statistical data analysis, and graphical analysis methods.

Keywords: HEI, innovations, Latvia, patents

Procedia PDF Downloads 312
14339 The Effect of the Acquisition and Reconstruction Parameters on the Quality of SPECT Tomographic Images with Attenuation and Scatter Correction

Authors: N. Boutaghane, F. Z. Tounsi

Abstract:

Many physical and technological factors degrade SPECT images, both qualitatively and quantitatively. Addressing this cannot be left to technological advances alone, however much they improve the performance of the tomographic gamma camera in terms of detection, collimation, reconstruction, and image correction methods. We first have to master the choice of the various acquisition and reconstruction parameters accessible in clinical cases and use attenuation and scatter correction methods, so as to always optimize image quality while minimizing, as far as possible, the dose received by the patient. In this work, a qualitative and quantitative evaluation of tomographic images is performed based on the acquisition parameters (counts per projection) and the reconstruction parameters (filter type and associated cutoff frequency). In addition, methods for correcting physical effects such as attenuation and scatter, which degrade image quality and prevent precise quantification of the reconstructed slices, are also presented. Two attenuation and scatter correction approaches are implemented: attenuation correction by the Chang method, with a filtered back-projection reconstruction algorithm, and scatter correction by the Jaszczak subtraction method. Our results are offered as recommendations that permit the origin of the different artifacts observed, both in quality control tests and in clinical images, to be determined.
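
As an illustration of the attenuation-correction step, the sketch below computes first-order Chang correction factors for a uniform circular attenuator and applies them to an already reconstructed slice; the phantom geometry, attenuation coefficient, and number of angles are illustrative assumptions, not the acquisition settings studied in the paper.

```python
import numpy as np

def chang_correction_map(shape, radius_px, mu_per_px, n_angles=64):
    """First-order Chang attenuation-correction factors for a uniform circular
    attenuator of radius `radius_px` (pixels) centred in the image."""
    n = shape[0]
    y, x = np.mgrid[0:n, 0:n].astype(float)
    cx = cy = (n - 1) / 2.0
    px, py = x - cx, y - cy
    inside = px**2 + py**2 <= radius_px**2

    ang = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)[:, None, None]
    ux, uy = np.cos(ang), np.sin(ang)
    # Distance from each pixel to the circular boundary along direction (ux, uy)
    b = px[None] * ux + py[None] * uy
    disc = b**2 + radius_px**2 - (px**2 + py**2)[None]
    path = -b + np.sqrt(np.clip(disc, 0.0, None))

    transmission = np.exp(-mu_per_px * path).mean(axis=0)   # averaged over angles
    corr = np.ones(shape)
    corr[inside] = 1.0 / transmission[inside]
    return corr

# Apply to an already reconstructed (e.g. filtered back-projection) slice
recon = np.random.default_rng(0).random((128, 128))         # stand-in for an FBP slice
corr = chang_correction_map(recon.shape, radius_px=45,
                            mu_per_px=0.15 * 0.4)            # mu (1/cm) times pixel size (cm)
corrected = recon * corr
print(corr.min(), corr.max())
```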

Keywords: attenuation, scatter, reconstruction filter, image quality, acquisition and reconstruction parameters, SPECT

Procedia PDF Downloads 442
14338 Aerodynamic Design of a UAV and Stability Analysis with the Method of Genetic Algorithm Optimization

Authors: Saul A. Torres Z., Eduardo Liceaga C., Alfredo Arias M.

Abstract:

We seek to develop a UAV for agricultural spraying at a maximum altitude of 5000 meters above sea level, with a payload of 100 liters of fumigant. For developing the aerodynamic design of the aircraft, computational tools such as the Athena Vortex Lattice software, MATLAB, ANSYS FLUENT, and the XFoil package, among others, are used. Structured programming and an exhaustive analysis of optimization and search methods are also employed. The results have a very low margin of error, and the multi-objective formulation can be helpful for future developments. We also developed a method for stability analysis (lateral-directional and longitudinal).
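
A genetic algorithm of the kind named in the title can be sketched compactly; the version below maximizes the lift-to-drag ratio of a wing described by just two genes (aspect ratio and taper) under a deliberately crude drag-polar model. The aerodynamic model, bounds, and GA settings are illustrative assumptions and stand in for the vortex-lattice and CFD evaluations used in the actual design loop.

```python
import numpy as np

rng = np.random.default_rng(1)

def lift_to_drag(genes):
    """Toy objective: L/D of a wing parameterized by (aspect ratio, taper ratio)."""
    AR, taper = genes
    CL = 0.6
    e = 0.9 - 0.2 * (taper - 0.4) ** 2          # toy Oswald-efficiency model
    CD0 = 0.025 + 0.0015 * AR                    # toy profile/structural penalty
    CD = CD0 + CL**2 / (np.pi * e * AR)          # parabolic drag polar
    return CL / CD

LOW, HIGH = np.array([4.0, 0.2]), np.array([14.0, 1.0])   # gene bounds

def ga(pop_size=40, generations=60, mut_sigma=0.1, elite=2):
    pop = rng.uniform(LOW, HIGH, size=(pop_size, 2))
    for _ in range(generations):
        fit = np.array([lift_to_drag(g) for g in pop])
        pop = pop[np.argsort(fit)[::-1]]                     # sort fittest first
        new = [pop[i].copy() for i in range(elite)]          # elitism
        while len(new) < pop_size:
            i, j = rng.integers(0, pop_size, 2), rng.integers(0, pop_size, 2)
            p1, p2 = pop[min(i)], pop[min(j)]                # tournament selection
            alpha = rng.random()
            child = alpha * p1 + (1 - alpha) * p2            # blend crossover
            child += rng.normal(0, mut_sigma, 2) * (HIGH - LOW)  # Gaussian mutation
            new.append(np.clip(child, LOW, HIGH))
        pop = np.array(new)
    best = max(pop, key=lift_to_drag)
    return best, lift_to_drag(best)

best, ld = ga()
print(f"best AR={best[0]:.2f}, taper={best[1]:.2f}, L/D={ld:.1f}")
```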

Keywords: aerodynamic design, optimization, genetic algorithm, multi-objective problem, longitudinal stability, lateral-directional stability

Procedia PDF Downloads 584
14337 Evaluating the Performance of Color Constancy Algorithm

Authors: Damanjit Kaur, Avani Bhatia

Abstract:

Color constancy is significant for human vision since color is a pictorial cue that helps in solving different vision tasks such as tracking, object recognition, or categorization. Therefore, several computational methods have tried to simulate human color constancy abilities to stabilize machine color representations. Two different kinds of methods have been used, i.e., normalization and constancy. While color normalization creates a new representation of the image by canceling illuminant effects, color constancy directly estimates the color of the illuminant in order to map the image colors to a canonical version. Color constancy is the capability to determine the colors of objects independently of the color of the light source. This research work studies the most well-known color constancy algorithms, such as white patch and gray world.
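
The two baseline algorithms named above are short enough to state directly; the sketch below implements gray-world and white-patch corrections and the angular-error metric commonly used to evaluate illuminant estimates, on a synthetic image (the illuminant, the scene, and the percentile choice are illustrative assumptions).

```python
import numpy as np

def gray_world(img):
    """Gray-world correction: scale each channel so the channel means become equal."""
    img = img.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)
    gain = means.mean() / means                    # per-channel gains
    return np.clip(img * gain, 0, 255).astype(np.uint8)

def white_patch(img, percentile=99):
    """White-patch (max-RGB) correction: map the brightest patch to white."""
    img = img.astype(np.float64)
    white = np.percentile(img.reshape(-1, 3), percentile, axis=0)
    gain = 255.0 / np.maximum(white, 1e-6)
    return np.clip(img * gain, 0, 255).astype(np.uint8)

def angular_error(est, true):
    """Angle (degrees) between estimated and true illuminant vectors."""
    cos = np.dot(est, true) / (np.linalg.norm(est) * np.linalg.norm(true))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Toy example: a roughly gray scene under a reddish illuminant
rng = np.random.default_rng(0)
scene = rng.uniform(40, 200, (64, 64, 3))
illum = np.array([1.3, 1.0, 0.7])
observed = np.clip(scene * illum, 0, 255).astype(np.uint8)
corrected = gray_world(observed)
print(angular_error(observed.reshape(-1, 3).mean(0), illum))   # error of the gray-world estimate
```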

Keywords: color constancy, gray world, white patch, modified white patch

Procedia PDF Downloads 312
14336 How Participatory Climate Information Services Assist Farmers to Uptake Rice Disease Forecasts and Manage Diseases in Advance: Evidence from Coastal Bangladesh

Authors: Moriom Akter Mousumi, Spyridon Paparrizos, Fulco Ludwig

Abstract:

Rice yield reduction due to climate change-induced disease occurrence is becoming a great concern for coastal farmers of Bangladesh. The development of participatory climate information services (CIS) based on farmers' needs could implicitly facilitate farmers in getting disease forecasts and making better decisions to manage diseases. Therefore, this study aimed to investigate how participatory climate information services assist coastal rice farmers in taking up rice disease forecasts and better managing rice diseases by improving their informed decision-making. Through participatory approaches, we developed a tailor-made agrometeorological service through the DROP app to forecast rice diseases and manage them in advance. During farmer field schools (FFS), we communicated 7-day disease forecasts in face-to-face weekly meetings using printed paper and a messenger app derived from the DROP app. Results show that the majority of the farmers understand disease forecasts through visualization, symbols, and text. The majority of them use disease forecast information directly from the DROP app, followed by face-to-face meetings, the messenger app, and printed paper. Farmers' participation and engagement during capacity-building training at FFS also assist them in making more informed decisions and improving the management of diseases using both preventive and chemical measures throughout the rice cultivation period. We conclude that the development of participatory CIS and the associated capacity building and training of farmers have increased farmers' understanding and uptake of disease forecasts for better management of rice diseases. Participatory services such as the DROP app offer great potential as an adaptation option for climate-smart rice production under changing climatic conditions.

Keywords: participatory climate service, disease forecast, disease management, informed decision making, coastal Bangladesh

Procedia PDF Downloads 43
14335 Treatment of Type 2 Diabetes Mellitus: Physicians’ Adherence to the American Diabetes Association Guideline in Central Region, Saudi Arabia

Authors: Ibrahim Mohammed

Abstract:

Background: Diabetes mellitus is a chronic disease that can cause devastating secondary complications, reducing the quality and length of life as well as increasing medical costs for the patient and society. The guidelines recommend both clinical and preventive strategies for diabetes management and are regularly updated. The aim of the study is to assess the level of adherence of physicians to the American Diabetes Association (ADA) guidelines. Method: An observational, multicenter, retrospective study was conducted among different hospitals in the central region. Patient data were collected from the records of the last three years (2017-2020). Records were selected randomly following a completely randomized design. The study focuses on those aspects of the management of type 2 diabetes according to the ADA that have not changed in the last three updates of the standards: all patients should be taking Metformin 1500 to 2000 mg/day as the recommended dose; patients at high risk of ASCVD should receive a high-intensity statin and those not at risk a moderate-intensity statin; and patients with hypertension and diabetes should be taking an ACE inhibitor or ARB. Result: The study aimed to evaluate the commitment of physicians in the central region to the ADA guidelines. Out of the 153 selected patients, only 17% were able to control their diabetes, with an average A1c below 7. The ADA states that, to reach the minimum benefit of using Metformin, the daily dose should be between 1500 and 2000 mg. Results showed that 110 patients were on Metformin, of whom 68% were on the recommended dose. The ADA recommends a high-intensity statin for diabetic patients with ASCVD risk, while diabetic patients without ASCVD risk should be on a moderate-intensity statin. Results showed that 61.5% of patients with ASCVD risk were on a high-intensity statin, while only 36% of patients without ASCVD risk were on a moderate-intensity statin. Results showed that 89 patients had hypertension, and 80% of them were receiving ACE inhibitors/ARBs as recommended by the ADA. Recommendation: It is necessary to implement periodic training courses for some physicians to enhance and update their knowledge.

Keywords: American Diabetes Association, diabetes mellitus, atherosclerotic cardiovascular disease, ACE inhibitors

Procedia PDF Downloads 82
14334 Variable Selection in a Data Envelopment Analysis Model by Multiple Proportions Comparison

Authors: Jirawan Jitthavech, Vichit Lorchirachoonkul

Abstract:

A statistical procedure using a multiple comparisons test for proportions is proposed for variable selection in a data envelopment analysis (DEA) model. The test statistic in the multiple comparisons is the proportion of efficient decision making units (DMUs) in a DEA model. Three methods of multiple comparisons testing for proportions, namely multiple Z tests with Bonferroni correction, multiple tests in a 2xc crosstabulation, and the Marascuilo procedure, are used in the proposed statistical procedure of iteratively eliminating the variables in a backward manner. Two simulation populations, of moderately and weakly correlated variables, are used to compare the results of the statistical procedure using the three methods of multiple comparisons testing for proportions with the hypothesis testing of the efficiency contribution measure. From the simulation results, it can be concluded that the proposed statistical procedure using multiple Z tests for proportions with Bonferroni correction clearly outperforms the proposed statistical procedure using the remaining two methods of multiple comparisons and the hypothesis testing of the efficiency contribution measure.
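
The best-performing test in the comparison, pairwise Z tests on proportions of efficient DMUs with a Bonferroni-corrected significance level, can be sketched as follows; the counts of efficient DMUs for the candidate variable sets are invented placeholders, and the backward-elimination loop around this test is described in the abstract rather than shown here.

```python
import numpy as np
from scipy.stats import norm
from itertools import combinations

# (efficient DMUs, total DMUs) from DEA runs with different candidate variable sets
models = {"full": (34, 100), "drop_x1": (33, 100), "drop_x2": (21, 100)}

alpha = 0.05
pairs = list(combinations(models, 2))
alpha_bonf = alpha / len(pairs)                  # Bonferroni-corrected level

for a, b in pairs:
    c1, n1 = models[a]
    c2, n2 = models[b]
    p1, p2 = c1 / n1, c2 / n2
    p_pool = (c1 + c2) / (n1 + n2)
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se                           # two-sample Z statistic, pooled SE
    p_val = 2 * norm.sf(abs(z))                  # two-sided p-value
    verdict = "different" if p_val < alpha_bonf else "not different"
    print(f"{a} vs {b}: z={z:+.2f}, p={p_val:.3f} -> {verdict} at corrected alpha={alpha_bonf:.4f}")
```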

Keywords: Bonferroni correction, efficient DMUs, Marascuilo procedure, Pastor et al. method, 2xc crosstabulation

Procedia PDF Downloads 307
14333 Field Scale Simulation Study of Miscible Water Alternating CO2 Injection Process in Fractured Reservoirs

Authors: Hooman Fallah

Abstract:

A vast amount of the world's oil is contained in naturally fractured reservoirs. There are different methods for increasing recovery from fractured reservoirs, and miscible injection of water alternating with CO2 is a good choice among these methods. In this method, water and CO2 slugs are injected alternately into the reservoir, with CO2 acting as the miscible agent. This paper studies a water injection scenario and miscible injection of water and CO2 in a two-dimensional, inhomogeneous fractured reservoir. The results show that miscible water-alternating-CO2 gas injection leads to a 3.95% increase in final oil recovery and a 3.89% decrease in total water production compared to the water injection scenario.

Keywords: simulation study, CO2, water alternating gas injection, fractured reservoirs

Procedia PDF Downloads 287
14331 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, based on which a web application for the structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; and 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, the structural analysis of systems and the research and design of systems with optimal structure are carried out.
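
The quantities produced at stages 5 and 6, the system reliability and the element weights, can be illustrated on a small example. The sketch below skips the orthogonalization algorithm itself and instead evaluates the same quantities by exhaustive state enumeration over the minimal path sets of the classic five-element bridge network (feasible only for small systems); the network and the element reliabilities are illustrative assumptions.

```python
from itertools import product

# Minimal path sets of the five-component bridge network: the system works if
# every component of at least one path works.
PATH_SETS = [{1, 4}, {2, 5}, {1, 3, 5}, {2, 3, 4}]
P = {1: 0.95, 2: 0.95, 3: 0.90, 4: 0.92, 5: 0.92}   # component reliabilities (illustrative)

def system_works(state):
    """state: dict component -> True/False."""
    return any(all(state[c] for c in path) for path in PATH_SETS)

def system_reliability(p, condition=None):
    """Exact reliability by summing probabilities of all working states.
    `condition` may pin selected components to True/False (used for importance)."""
    comps = sorted(p)
    total = 0.0
    for bits in product([True, False], repeat=len(comps)):
        state = dict(zip(comps, bits))
        if condition and any(state[c] != v for c, v in condition.items()):
            continue
        prob = 1.0
        for c in comps:
            if condition and c in condition:
                continue
            prob *= p[c] if state[c] else 1 - p[c]
        if system_works(state):
            total += prob
    return total

R = system_reliability(P)
print(f"system reliability R = {R:.5f}")

# Birnbaum importance ("weight") of each element: dR/dp_c
for c in P:
    w = system_reliability(P, {c: True}) - system_reliability(P, {c: False})
    print(f"element {c}: weight = {w:.4f}")
```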

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element

Procedia PDF Downloads 66
14330 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Even when there is no theoretical weakness in a cryptographic algorithm, side channel analysis can recover secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing information, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential Power Analysis (DPA) is one of the most popular such analyses; it computes statistical correlations between hypothesized secret keys and power consumption. It usually requires processing huge amounts of data and takes a long time; it may take several weeks for some devices with countermeasures. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
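
The structure that makes this analysis easy to distribute is that the Pearson correlation can be assembled from per-chunk sums. The sketch below runs a correlation-based power analysis on a simulated single-sample leakage, computing mergeable sums per chunk exactly as separate worker nodes would; the toy cipher (XOR plus Hamming-weight leakage), trace count, and noise level are illustrative assumptions rather than a real target.

```python
import numpy as np

rng = np.random.default_rng(7)
HW = np.array([bin(v).count("1") for v in range(256)])   # Hamming-weight table

# Simulated measurement: leakage = HW(plaintext XOR key) + noise, one sample per trace
TRUE_KEY = 0x3C
N = 20000
plaintexts = rng.integers(0, 256, N)
traces = HW[plaintexts ^ TRUE_KEY] + rng.normal(0, 2.0, N)

def chunk_sums(pt, tr):
    """Per-chunk sums needed for Pearson correlation, for all 256 key guesses
    (this is what each worker node would compute independently)."""
    models = HW[pt[:, None] ^ np.arange(256)[None, :]].astype(float)   # (n, 256)
    return dict(n=len(tr), st=tr.sum(), stt=(tr * tr).sum(),
                sm=models.sum(0), smm=(models * models).sum(0),
                smt=(models * tr[:, None]).sum(0))

def merge(a, b):
    return {k: a[k] + b[k] for k in a}

def correlation(s):
    cov = s["smt"] - s["sm"] * s["st"] / s["n"]
    var_m = s["smm"] - s["sm"] ** 2 / s["n"]
    var_t = s["stt"] - s["st"] ** 2 / s["n"]
    return cov / np.sqrt(var_m * var_t)

# Split the traces into chunks as if processed on separate nodes, then merge.
parts = [chunk_sums(plaintexts[i::4], traces[i::4]) for i in range(4)]
total = parts[0]
for p in parts[1:]:
    total = merge(total, p)

# With this linear toy leakage the complement key gives an equally strong negative
# correlation, so take the most positive one; real attacks target a nonlinear S-box.
corr = correlation(total)
print(f"recovered key guess: {int(np.argmax(corr)):#04x} (true {TRUE_KEY:#04x})")
```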

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 422
14329 Mediation in Turkey

Authors: Ibrahim Ercan, Mustafa Arikan

Abstract:

In recent years, alternative dispute resolution methods have attracted the attention of many countries' legislators. Instead of resolving disputes by litigation, putting an end to a dispute by the parties themselves is more important for the preservation of social peace. Therefore, alternative dispute resolution (ADR) methods have been discussed more intensively in Turkey, as in the whole world. After these discussions, the Mediation Act was adopted on 07.06.2012 and entered into force on 21.06.2013. According to the Mediation Act, it is only possible to mediate issues arising from private law. Also, mediation is not compulsory under Turkish law; it is optional. Therefore, the parties are completely free to choose the mediation method for dispute resolution. Mediators need to be lawyers with five years of experience; therefore, it is not possible for non-lawyers to become mediators. Beyond five years of experience, receiving training and succeeding in examinations, especially on body language and psychology, is also very important for becoming a mediator. If the parties reach a settlement as a result of mediation, a document is issued. This document is also enforceable under certain circumstances. Thus, the parties will not need to apply to the court again; on the contrary, they will have the opportunity to execute this document, so they can recover their claims. However, although the Mediation Act has been in force for a period of nearly two years, it is possible to say that the interest in mediation is not at the expected level. Therefore, making mediation mandatory for some disputes has been discussed recently. At this point, once mediation becomes mandatory and good results follow, this institution will be able to attract serious interest in Turkey. Otherwise, if the results are not satisfying, the mediation method may be abandoned.

Keywords: alternative dispute resolution methods, mediation act, mediation, mediator, mediation in Turkey

Procedia PDF Downloads 358
14328 Application of Adaptive Particle Filter for Localizing a Mobile Robot Using 3D Camera Data

Authors: Maysam Shahsavari, Seyed Jamalaldin Haddadi

Abstract:

There are several methods to localize a mobile robot, such as relative, absolute, and probabilistic methods. In this paper, the particle filter is used, due to its simple implementation and the fact that it does not need to know the starting position. This method estimates the position of the mobile robot using a probability distribution, relying on a known map of the environment instead of predicting it. Afterwards, it updates this estimation by reading input sensors and control commands. To receive information from the surrounding world, for example, the distance to obstacles, a Kinect is used, which is much cheaper than a laser range finder. Finally, after explaining the adaptive particle filter method and its implementation in detail, we compare this method with the dead reckoning method and show that this method is much more suitable for situations in which we have a map of the environment.
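
A minimal particle filter for global localization against a known map can be sketched in a few dozen lines; the version below uses range measurements to known landmarks as a stand-in for Kinect depth data, with systematic resampling. The map, noise levels, motion commands, and particle count are illustrative assumptions, and it omits the adaptive resizing of the particle set implied by the title.

```python
import numpy as np

rng = np.random.default_rng(3)

LANDMARKS = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 9.0]])   # known map
N_PARTICLES = 500
MOTION_NOISE, RANGE_NOISE = 0.1, 0.5

def move(states, u):
    """Apply control u = (dx, dy) with Gaussian motion noise."""
    return states + u + rng.normal(0, MOTION_NOISE, states.shape)

def weights_from_ranges(states, z):
    """Likelihood of the measured ranges z (one per landmark) for each particle."""
    d = np.linalg.norm(states[:, None, :] - LANDMARKS[None, :, :], axis=2)   # (N, L)
    w = np.exp(-0.5 * ((d - z) / RANGE_NOISE) ** 2).prod(axis=1)
    s = w.sum()
    return w / s if s > 0 else np.full(len(states), 1.0 / len(states))       # underflow guard

def resample(states, w):
    """Systematic (low-variance) resampling."""
    positions = (rng.random() + np.arange(N_PARTICLES)) / N_PARTICLES
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), N_PARTICLES - 1)
    return states[idx]

# Simulate a robot and track it; particles start uniformly (unknown start position)
true_pos = np.array([1.0, 1.0])
particles = rng.uniform(0, 10, (N_PARTICLES, 2))
for step in range(30):
    u = np.array([0.25, 0.2])
    true_pos = true_pos + u
    z = np.linalg.norm(true_pos - LANDMARKS, axis=1) + rng.normal(0, RANGE_NOISE, len(LANDMARKS))
    particles = move(particles, u)
    particles = resample(particles, weights_from_ranges(particles, z))

print("true:", true_pos, "estimate:", particles.mean(axis=0))
```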

Keywords: particle filter, localization, methods, odometry, kinect

Procedia PDF Downloads 260
14327 A Review Paper for Detecting Zero-Day Vulnerabilities

Authors: Tshegofatso Rambau, Tonderai Muchenje

Abstract:

Zero-day attacks (ZDA) are increasing day by day; there are many vulnerabilities in systems and software that date back decades. Companies keep discovering vulnerabilities in their systems and software and work to release patches and updates. A zero-day vulnerability is a software fault that is not widely known and is unknown to the vendor; attackers work very quickly to exploit these vulnerabilities. These are major security threats with a high success rate because businesses lack the essential safeguards to detect and prevent them. This study focuses on the factors and techniques that can help us detect zero-day attacks. There are various methods and techniques for detecting vulnerabilities. Various companies like edges can offer penetration testing and smart vulnerability management solutions. We will undertake literature studies on zero-day attacks and detection methods, as well as modeling approaches and simulations, as part of the study process.

Keywords: zero-day attacks, exploitation, vulnerabilities

Procedia PDF Downloads 94
14326 Evaluation of Microbiological Quality and Safety of Two Types of Salads Prepared at Libyan Airline Catering Center in Tripoli

Authors: Elham A. Kwildi, Yahia S. Abugnah, Nuri S. Madi

Abstract:

This study was designed to evaluate the microbiological quality and safety of two types of salads prepared at a catering center affiliated with Libyan Airlines in Tripoli, Libya. Two hundred and twenty-one (221) samples (132 economy-class and 89 first-class) were used in this project, which lasted for ten months. Biweekly microbiological tests were performed, which included total plate count (TPC) and total coliforms (TCF), in addition to the enumeration and/or detection of some pathogenic bacteria, mainly Escherichia coli, Staphylococcus aureus, Bacillus cereus, Salmonella sp., Listeria sp., and Vibrio parahaemolyticus, using conventional as well as compact dry methods. Results indicated that the TPC of type 1 salads ranged between <10 and 62 × 10³ cfu/g and between <10 and 36 × 10³ cfu/g, while the TCF counts were between <10 and 41 × 10³ cfu/g and between <10 and 66 × 10² cfu/g, using the two methods of detection respectively. On the other hand, the TPC of type 2 salads ranged between 1 × 10 and 52 × 10³ cfu/g and between <10 and 55 × 10³ cfu/g (and in the range of 1 × 10 to 45 × 10³ cfu/g), and the TCF counts were between <10 and 55 × 10³ cfu/g and between <10 and 34 × 10³ cfu/g, using the first and the second methods of detection respectively. Also, the pathogens mentioned above were detected in both types of salads, but their levels varied according to the type of salad and the method of detection. The level of Staphylococcus aureus, for instance, was 17.4% using the conventional method versus 14.4% using the compact dry method. Similarly, E. coli was found at 7.6% and 9.8%, while Salmonella sp. recorded the lowest percentage, i.e., 3% and 3.8%, with the two mentioned methods respectively. First-class salads were also found to contain the same pathogens, but the level of E. coli was relatively higher in this case (14.6% and 16.9%) using the conventional and compact dry methods respectively. Staphylococcus aureus came second (13.5% and 11.2%), followed by Salmonella (6.74% and 6.70%). The lowest percentage was for Vibrio parahaemolyticus (4.9%), which was detected in the first-class salads only. The other two pathogens, Bacillus cereus and Listeria sp., were not detected in either type of salad. Finally, it is worth mentioning that there was a significant decline in TPC and TCF counts, in addition to the disappearance of pathogenic bacteria, after the sixth to seventh month of the study, which coincided with the first trial of the HACCP system at the center. The ups and downs in the counts during the early stages of the study reveal that there is a need for some important corrective measures, including more emphasis on training the personnel in applying the HACCP system effectively.

Keywords: air travel, vegetable salads, foodborne outbreaks, Libya

Procedia PDF Downloads 317
14325 Attenuation of Endotoxin Induced Hepatotoxicity by Dexamethasone, Melatonin and Pentoxifylline in White Albino Mice: A Comparative Study

Authors: Ammara Khan

Abstract:

Sepsis is characterized by an overwhelming surge of cytokines and oxidative stress in response to one of many factors, with gram-negative bacteria commonly implicated. Despite the major expansion and elaboration of sepsis pathophysiology and therapeutic approaches, the death rate remains very high in septic patients due to multiple organ damage, including hepatotoxicity. The present study was aimed at ascertaining the adequacy of three different drugs, delivered separately and collectively: a low-dose steroid, dexamethasone (3 mg/kg i.p.); an antioxidant, melatonin (10 mg/kg i.p.); and a phosphodiesterase inhibitor, pentoxifylline (75 mg/kg i.p.), in endotoxin-induced hepatotoxicity in mice. Endotoxin/lipopolysaccharide (LPS)-induced hepatotoxicity was reproduced in mice by giving lipopolysaccharide of an E. coli serotype intraperitoneally. The preventive role was examined by giving the experimental agent half an hour prior to LPS injection, whereas the therapeutic potential of the experimental agent was assessed by delivering it after LPS. The extent of liver damage was judged via estimation of serum alanine aminotransferase (ALT) and aspartate aminotransferase (AST), along with histopathological examination of liver tissue. Dexamethasone given before (Group 3) and after LPS (Group 4) significantly attenuated LPS-generated liver injury. Pentoxifylline generated similar results, and serum ALT and AST as well as histological alterations abated considerably (p ≤ 0.05) both in animals subjected to pentoxifylline pre-treatment (Group 5) and post-treatment (Group 6). Melatonin was also successful in the prevention (Group 7) and treatment (Group 8) of LPS-invoked hepatotoxicity, as evident from the lessening of augmented ALT (p ≤ 0.01) and AST (p ≤ 0.01) along with the restoration of pathological changes in liver sections (p ≤ 0.05). Combination therapies with dexamethasone in conjunction with melatonin (Group 9), dexamethasone together with pentoxifylline (Group 10), and pentoxifylline along with melatonin (Group 11) after LPS administration attenuated LPS-evoked hepatic dysfunction to a statistically significant extent. In conclusion, both melatonin and pentoxifylline produced promising results in endotoxin-induced hepatotoxicity and can be used as therapeutic adjuncts to conventional treatment strategies in sepsis-induced liver failure.

Keywords: endotoxin/lipopolysaccharide, dexamethasone, hepatotoxicity, melatonin, pentoxifylline

Procedia PDF Downloads 274
14324 Computational Fluid Dynamics Simulation Study of Flow near Moving Wall of Various Surface Types Using Moving Mesh Method

Authors: Khizir Mohd Ismail, Yu Jun Lim, Tshun Howe Yong

Abstract:

The study of flow behavior in an enclosed volume using Computational Fluid Dynamics (CFD) has been around for decades. However, due to the limited knowledge of adaptive grid methods, the flow in an enclosed volume near a moving wall is less explored using CFD. A CFD simulation of flow in an enclosed volume near a moving wall was demonstrated and studied by introducing a moving mesh method and was modeled with the Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach. A static enclosed volume with a controlled opening size in the bottom was positioned against a moving, translational wall with sliding mesh features. Controlled variables such as smooth, creviced, and corrugated wall characteristics, the distance between the enclosed volume and the wall, and the speed of the moving wall relative to the enclosed chamber were varied to understand how the flow behaves and reacts between these two geometries. These model simulations were validated against experimental results, and confidence in the results was established where the simulation showed good agreement with the experimental data. This study provided better insight into how the flow behaves in an enclosed volume when various wall types in motion are introduced at various distances from it, and it creates potential opportunities for applications that involve adaptive grid methods in CFD.

Keywords: moving wall, adaptive grid methods, CFD, moving mesh method

Procedia PDF Downloads 142
14323 Development of Cost-effective Sensitive Methods for Pathogen Detection in Community Wastewater for Disease Surveillance

Authors: Jesmin Akter, Chang Hyuk Ahn, Ilho Kim, Jaiyeop Lee

Abstract:

The global pandemic of coronavirus disease (COVID-19) is caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). To control the spread of the COVID-19 pandemic, wastewater surveillance has been used to monitor SARS-CoV-2 prevalence in the community. The challenging part of establishing wastewater surveillance is the need for a well-equipped laboratory for wastewater sample analysis. According to many previous studies, reverse transcription-polymerase chain reaction (RT-PCR) based molecular tests are the most widely used and popular detection method worldwide. However, RT-qPCR-based approaches for the detection or quantification of SARS-CoV-2 genetic fragments of ribonucleic acid (RNA) from wastewater require a specialized laboratory, skilled personnel, expensive instruments, and a workflow that typically requires 6 to 8 hours to provide results for only a minimal number of samples. Rapid and reliable alternative detection methods are needed to enable less-well-qualified practitioners to set up and provide sensitive detection of SARS-CoV-2 in wastewater at less-specialized regional laboratories. Therefore, scientists and researchers are conducting experiments on rapid detection methods for COVID-19; in some cases, the structural and molecular characteristics of SARS-CoV-2 are unknown, and various strategies for the correct diagnosis of COVID-19 have been proposed by research laboratories, which are presented in the present study. The ongoing research and development of these highly sensitive and rapid technologies, namely RT-LAMP, ELISA, biosensors, and GeneXpert, allows a wide range of potential options not only for SARS-CoV-2 detection but for other viruses as well. The effort of this study is to discuss the above effective and rapid regional detection and quantification methods for community wastewater as an essential step in advancing scientific goals.

Keywords: rapid detection, SARS-CoV-2, sensitive detection, wastewater surveillance

Procedia PDF Downloads 82
14322 Molecular Biomonitoring of Bacterial Pathogens in Wastewater

Authors: Desouky Abd El Haleem, Sahar Zaki

Abstract:

This work was conducted to develop a one-step multiplex PCR system for the rapid, sensitive, and specific detection of three different bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, and Salmonella spp., directly in wastewater without prior isolation on selective media. As a molecular confirmatory test after isolation of the pathogens by classical microbiological methods, PCR-RFLP of their amplified 16S rDNA genes was performed. It was observed that the developed protocols have a significant impact on the ability to detect the three pathogens directly in water sensitively, rapidly, and specifically within a short time, representing a considerable advancement over more time-consuming and less sensitive methods for the identification and characterization of these kinds of pathogens.

Keywords: multiplex PCR, bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, Salmonella spp.

Procedia PDF Downloads 444
14321 Investigation of Long-Term Thermal Insulation Performance of Vacuum Insulation Panels with Various Enveloping Methods

Authors: Inseok Yeo, Tae-Ho Song

Abstract:

To practically apply vacuum insulation panels (VIPs) to buildings or home appliances, VIPs must offer a long lifespan with outstanding insulation performance. The service lives of VIPs enveloped with Al foil and with a three-layer Al-metallized envelope are calculated. For the Al-foil envelope, the service life is longer, but edge conduction is too large compared with the Al-metallized envelope. To increase the service life even more, the proposed double enveloping method and the metal-barrier-added enveloping method are further analyzed. The service lives of VIPs employing the two enveloping methods are calculated. Also, the pressure increase and thermal insulation performance characteristics are investigated. For the metal-barrier-added enveloping method, the increase of effective thermal conductivity with time is close to that of the Al-foil envelope, especially for getter-inserted VIPs. For the double enveloping method, if water vapor is perfectly adsorbed, the effect of service life enhancement becomes much greater. With these methods, the VIP can be guaranteed a service life of more than 20 years.

Keywords: vacuum insulation panels, service life, double enveloping, metal-barrier-added enveloping, edge conduction

Procedia PDF Downloads 429
14320 Comparison of Finite-Element and IEC Methods for Cable Thermal Analysis under Various Operating Environments

Authors: M. S. Baazzim, M. S. Al-Saud, M. A. El-Kady

Abstract:

In this paper, the steady-state ampacity (current carrying capacity) of an underground power cable system is evaluated by using analytical and numerical methods for different conditions (depth of cable, spacing between phases, soil thermal resistivity, ambient temperature, wind speed) and for two system voltage levels, 132 and 380 kV. The analytical, or traditional, method used is based on the thermal analysis method developed by Neher and McGrath, further enhanced by the International Electrotechnical Commission (IEC) and published in standard IEC 60287. The numerical method used is the finite element method, applied through commercial software based on the finite element method.
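
The analytical side of the comparison reduces to evaluating the buried-cable rating equation of IEC 60287-1-1, sketched below; the thermal resistances, loss factors, and AC resistance in the example are placeholder values, not the 132 kV or 380 kV cable data used in the paper.

```python
import math

def iec60287_ampacity(delta_theta, R_ac, W_d, T1, T2, T3, T4,
                      n=1, lambda1=0.1, lambda2=0.0):
    """Permissible current (A) of an AC cable per the IEC 60287-1-1 buried-cable equation.

    delta_theta : conductor temperature rise above ambient (K)
    R_ac        : AC resistance of the conductor at operating temperature (ohm/m)
    W_d         : dielectric losses per unit length (W/m)
    T1..T4      : thermal resistances of insulation, bedding, serving, and
                  surrounding soil (K.m/W)
    n           : number of load-carrying conductors in the cable
    lambda1/2   : sheath and armour loss factors
    """
    num = delta_theta - W_d * (0.5 * T1 + n * (T2 + T3 + T4))
    den = (R_ac * T1
           + n * R_ac * (1 + lambda1) * T2
           + n * R_ac * (1 + lambda1 + lambda2) * (T3 + T4))
    return math.sqrt(num / den)

# Example: single-core HV cable, 90 C conductor, 25 C ambient (placeholder values)
I = iec60287_ampacity(delta_theta=65, R_ac=3.0e-5, W_d=1.2,
                      T1=0.4, T2=0.1, T3=0.05, T4=1.5,
                      n=1, lambda1=0.08, lambda2=0.0)
print(f"permissible current ~ {I:.0f} A")
```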

Keywords: cable ampacity, finite element method, underground cable, thermal rating

Procedia PDF Downloads 373
14319 Survey of Methods for Solutions of Spatial Covariance Structures and Their Limitations

Authors: Joseph Thomas Eghwerido, Julian I. Mbegbu

Abstract:

In modelling environmental processes, we apply multidisciplinary knowledge to explain, explore, and predict the Earth's response to natural and human-induced environmental changes. Thus, in the analysis of spatio-temporal ecological and environmental studies, the spatial parameters of interest are always heterogeneous. This often negates the assumption of stationarity. Hence, describing the dispersion and transport of atmospheric pollutants, landscape or topographic effects, and weather patterns depends on a good estimate of the spatial covariance. The generalized linear mixed model, although linear in the expected-value parameters, has a likelihood that varies nonlinearly as a function of the covariance parameters. As a consequence, computing estimates for a linear mixed model requires the iterative solution of a system of simultaneous nonlinear equations. In order to predict the variables at unsampled locations, we need estimates based on the presently sampled variables. The geostatistical methods for solving this spatial problem assume covariance stationarity (locally defined covariance) that is uniform in space, which is apparently not valid because spatial processes often exhibit nonstationary covariance and hence have globally defined covariance. We shall consider different existing methods for the solution of the spatial covariance of space-time processes at unsampled locations. This stationary covariance changes with location for multiple time sets, with some asymptotic properties.
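
The baseline geostatistical approach referred to above, prediction at an unsampled location under a stationary covariance model, is ordinary kriging; a minimal sketch on synthetic data follows (the exponential covariance model, its parameters, and the sampled field are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(5)

def exp_cov(h, sill=1.0, length=0.3):
    """Stationary exponential covariance model C(h) = sill * exp(-h / length)."""
    return sill * np.exp(-h / length)

def ordinary_kriging(xs, zs, x0, cov=exp_cov):
    """Ordinary kriging prediction and variance at x0 from samples (xs, zs)."""
    n = len(xs)
    d = np.linalg.norm(xs[:, None, :] - xs[None, :, :], axis=2)
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = cov(d)
    K[n, :n] = K[:n, n] = 1.0                 # Lagrange row/column for unbiasedness
    rhs = np.append(cov(np.linalg.norm(xs - x0, axis=1)), 1.0)
    w = np.linalg.solve(K, rhs)
    pred = w[:n] @ zs
    var = cov(0.0) - w @ rhs                  # kriging variance
    return pred, var

# Synthetic samples of a smooth field on the unit square
xs = rng.random((40, 2))
zs = np.sin(3 * xs[:, 0]) + np.cos(2 * xs[:, 1]) + rng.normal(0, 0.05, 40)

x0 = np.array([0.5, 0.5])
pred, var = ordinary_kriging(xs, zs, x0)
print(f"prediction at (0.5, 0.5): {pred:.3f} (kriging variance {var:.3f}), "
      f"true value {np.sin(1.5) + np.cos(1.0):.3f}")
```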

Keywords: parametric, nonstationary, Kernel, Kriging

Procedia PDF Downloads 249
14318 Maintenance Performance Measurement Derived Optimization: A Case Study

Authors: James M. Wakiru, Liliane Pintelon, Peter Muchiri, Stanley Mburu

Abstract:

Maintenance performance measurement (MPM) represents an integrated approach that considers both operational and maintenance-related aspects while evaluating the effectiveness and efficiency of maintenance to ensure assets are working as they should. Three salient issues require to be addressed for an asset-intensive organization to employ an MPM-based framework to optimize maintenance. Firstly, the organization should establish the important performance metric(s), in this case the maintenance objective(s), on which it will focus. The second issue entails aligning the maintenance objective(s) with maintenance optimization. This is achieved by deriving maintenance performance indicators that subsequently form an objective function for the optimization program. Lastly, the objective function is employed in an optimization program to derive maintenance decision support. In this study, we develop a framework that initially identifies the crucial maintenance performance measures and employs them to derive maintenance decision support. The proposed framework is demonstrated in a case study of a geothermal drilling rig, where the objective function is evaluated utilizing a simulation-based model whose parameters are derived from empirical maintenance data. Availability, reliability, and maintenance inventory are depicted as essential objectives requiring further attention. A simulation model is developed mimicking drilling rig operations and maintenance, where the sub-systems are modelled as undergoing imperfect corrective (CM) and preventive (PM) maintenance, with the total cost as the primary performance measurement. Moreover, three maintenance spare inventory policies are considered: classical (retaining stocks for a contractual period), vendor-managed inventory with consignment stock, and a periodic-monitoring order-to-stock (s, S) policy. Optimization results indicate that the adoption of the (s, S) inventory policy, an increased PM interval, and reduced reliance on CM actions offer improved availability and a reduction in total costs.
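
To make the idea of a simulation-based objective function concrete, the sketch below runs a single sub-system through a year of hourly steps with exponential failures, periodic PM, and a periodic-review (s, S) spare policy, returning availability and total cost for different PM intervals. Every parameter (failure rate, durations, costs, review period, s, S) is an invented placeholder, maintenance is treated as perfect rather than imperfect, and the model is far simpler than the multi-sub-system rig simulation described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(11)

HOURS = 8760                  # one year, hourly steps
MTBF = 400.0                  # mean time between failures (h)
PM_INTERVAL = 250             # preventive maintenance every PM_INTERVAL hours
CM_TIME, PM_TIME = 24, 6      # repair durations (h)
CM_COST, PM_COST = 5000, 800  # event costs
SPARE_COST, DOWNTIME_COST = 1200, 300   # per spare, per hour of downtime
REVIEW, LEAD = 168, 72        # weekly stock review, 72 h replenishment lead time
s, S = 1, 4                   # reorder point and order-up-to level

def simulate(pm_interval=PM_INTERVAL, s=s, S=S):
    stock, on_order, arrival = S, 0, None
    down_until, next_fail = 0, rng.exponential(MTBF)
    downtime = cost = 0.0
    for t in range(HOURS):
        if arrival is not None and t >= arrival:              # replenishment arrives
            stock += on_order; on_order, arrival = 0, None
        if t % REVIEW == 0 and stock <= s and on_order == 0:   # (s, S) periodic review
            on_order, arrival = S - stock, t + LEAD
            cost += (S - stock) * SPARE_COST
        if t < down_until:                                     # under repair
            downtime += 1; cost += DOWNTIME_COST
            continue
        if t >= next_fail:                                     # corrective maintenance
            if stock > 0:
                stock -= 1
                down_until, cost = t + CM_TIME, cost + CM_COST
                next_fail = t + CM_TIME + rng.exponential(MTBF)
            else:                                              # stock-out: wait, retry next hour
                downtime += 1; cost += DOWNTIME_COST
        elif t % pm_interval == 0 and t > 0 and stock > 0:     # preventive maintenance
            stock -= 1
            down_until, cost = t + PM_TIME, cost + PM_COST
            next_fail = t + PM_TIME + rng.exponential(MTBF)    # PM assumed to renew the unit
    availability = 1 - downtime / HOURS
    return availability, cost

for interval in (125, 250, 500):
    a, c = np.mean([simulate(pm_interval=interval) for _ in range(30)], axis=0)
    print(f"PM every {interval:>3} h: availability={a:.3f}, annual cost={c:,.0f}")
```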

Keywords: maintenance, vendor-managed, decision support, performance, optimization

Procedia PDF Downloads 120