Search results for: interval type-2 fuzzy rough set
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1707


957 Some Investigations of Primary Slurry Used for Production of Ceramic Shells

Authors: Balwinder Singh

Abstract:

In the current competitive environment, the casting industry faces several challenges: producing intricate and near-net-shape castings, reducing lead time from product design to production, improving casting quality, and controlling costs. The raw materials used to make a ceramic shell play an important role in determining its overall final characteristics. In this work, primary slurries were formulated using various combinations of zircon flour, fused silica, and aluminosilicate powders as fillers, with colloidal silica as binder along with wetting and antifoaming agents (catalysts). Taguchi's parameter design strategy was applied to investigate the effect of primary slurry parameters on slurry viscosity and on the primary coating of the shell. The results reveal that a primary coating made with low-viscosity slurry produced a rough shell surface due to stucco penetration.
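In Taguchi parameter design, each trial's repeated measurements are condensed into a signal-to-noise (S/N) ratio before factor settings are compared; for a response to be minimised, such as surface roughness, the smaller-the-better form applies. A minimal sketch (the readings are hypothetical, not the study's data):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio, in decibels;
    a higher S/N marks the more favourable (lower, more consistent) response."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical repeated surface-roughness readings (Ra, um) for two slurry settings
setting_a = [2.1, 2.3, 2.0]
setting_b = [3.4, 3.1, 3.6]
sn_a = sn_smaller_is_better(setting_a)
sn_b = sn_smaller_is_better(setting_b)
```

The setting with the higher S/N ratio is preferred, since the ratio penalises both a high mean and high scatter across repeats.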

Keywords: ceramic shell, primary slurry, filler, slurry viscosity, surface roughness

Procedia PDF Downloads 462
956 Velocity Distribution in Density Currents Flowing over Rough Beds

Authors: Reza Nasrollahpour, Mohamad Hidayat Bin Jamal, Zulhilmi Bin Ismail

Abstract:

Density currents are generated when fluid of one density is released into another fluid with a different density. These currents occur in a variety of natural and man-made environments, which emphasises the importance of studying them. In most practical cases, density currents flow over surfaces that are not plane; however, there have been limited investigations in this regard. This study uses laboratory experiments to analyse the influence of bottom roughness on the velocity distribution within these dense underflows. The currents are analysed over a plane surface and three different configurations of beam-roughened beds. The velocity profiles are collected using the acoustic Doppler velocimetry technique, and the distribution of velocity within these currents is formulated for the tested beds. The results indicate that the empirical power and Gaussian relations can describe the velocity distribution in the inner and outer regions of the profiles, respectively. Moreover, it is found that the bottom roughness is the primary controlling parameter in the inner region.

Keywords: density currents, velocity profiles, Acoustic Doppler Velocimeter, bed roughness

Procedia PDF Downloads 166
955 A Tool for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation of an institutional risk profile for endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the set-up of risk factors with just the values most important for a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base aggregated from a digital preservation survey in order to detect preservation risks for a particular institution. Another contribution is support for visualisation and analysis of risk factors along a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision-making for the preservation of digital content in libraries and archives, using domain expert knowledge and file format metadata automatically aggregated from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector, whose particular dimensions can be visualised for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.

Keywords: digital information management, file format, endangerment analysis, fuzzy models

Procedia PDF Downloads 390
954 Injunctions, Disjunctions, Remnants: The Reverse of Unity

Authors: Igor Guatelli

Abstract:

The universe of aesthetic perception entails impasses about sensitive divergences that each text or visual object may be subjected to. If approached through intertextuality that is not based on the misleading notion of kinships or similarities a priori admissible, the possibility of anachronistic, heterogeneous - and non-diachronic - assemblies can enhance the emergence of interval movements, intermediate, and conflicting, conducive to a method of reading, interpreting, and assigning meaning that escapes the rigid antinomies of the mere being and non-being of things. In negative, they operate in a relationship built by the lack of an adjusted meaning set by their positive existences, with no remainders; the generated interval becomes the remnant of each of them; it is the opening that obscures the stable positions of each one. Without the negative of absence, of that which is always missing or must be missing in a text, concept, or image made positive by history, nothing is perceived beyond what has been already given. Pairings or binary oppositions cannot lead only to functional syntheses; on the contrary, methodological disturbances accumulated by the approximation of signs and entities can initiate a process of becoming as an opening to an unforeseen other, transformation until a moment when the difficulties of [re]conciliation become the mainstay of a future of that sign/entity, not envisioned a priori. A counter-history can emerge from these unprecedented, misadjusted approaches, beginnings of unassigned injunctions and disjunctions, in short, difficult alliances that open cracks in a supposedly cohesive history, chained in its apparent linearity with no remains, understood as a categorical historical imperative. Interstices are minority fields that, because of their opening, are capable of causing opacity in that which, apparently, presents itself with irreducible clarity. 
Resulting from an incomplete and maladjusted [at the least dual] marriage between the signs/entities that originate them, this interval may destabilize and cause disorder in these entities and their own meanings. The interstitials offer a hyphenated relationship: a simultaneous union and separation, a spacing between the entity's identity and its otherness, or alterity. One and the other may no longer be seen without the crack or fissure that now separates them while uniting them across a space-time lapse. Ontological and semantic shifts are caused by this fissure, an absence between one and the other, one with and against the other. Based on an improbable approximation between some conceptual and semantic shifts within the design production of the architect Rem Koolhaas and the textual production of the philosopher Jacques Derrida, this article questions the notions of unity, coherence, affinity, and complementarity in the construction of thought, starting from these ontological, epistemological, and semiological fissures that rattle the signs/entities and their stable meanings. Fissures in a thought considered coherent, cohesive, and formatted are the negativity that constitutes the interstices, which allow us to move towards what still remains as non-identity and so to begin another story.

Keywords: clearing, interstice, negative, remnant, spectrum

Procedia PDF Downloads 120
953 Evaluation of Hepatic Metabolite Changes for Differentiation Between Non-Alcoholic Steatohepatitis and Simple Hepatic Steatosis Using Long Echo-Time Proton Magnetic Resonance Spectroscopy

Authors: Tae-Hoon Kim, Kwon-Ha Yoon, Hong Young Jun, Ki-Jong Kim, Young Hwan Lee, Myeung Su Lee, Keum Ha Choi, Ki Jung Yun, Eun Young Cho, Yong-Yeon Jeong, Chung-Hwan Jun

Abstract:

Purpose: To assess changes in hepatic metabolites for differentiation between non-alcoholic steatohepatitis (NASH) and simple steatosis on proton magnetic resonance spectroscopy (1H-MRS) in both humans and an animal model. Methods: The local institutional review board approved this study, and subjects gave written informed consent. 1H-MRS measurements were performed on a localized voxel of the liver using a point-resolved spectroscopy (PRESS) sequence, and the hepatic metabolites alanine (Ala), lactate/triglyceride (Lac/TG), and TG were analyzed in the NASH, simple steatosis, and control groups. Group differences were tested with ANOVA and Tukey's post-hoc tests, and diagnostic accuracy was tested by calculating the area under the receiver operating characteristic (ROC) curve. The associations between metabolite concentrations and pathologic grades or non-alcoholic fatty liver disease (NAFLD) activity scores were assessed by Pearson's correlation. Results: Patients with NASH showed elevated Ala (p < 0.001), Lac/TG (p < 0.001), and TG (p < 0.05) concentrations compared with patients who had simple steatosis and with healthy controls. NASH patients had higher levels of Ala (mean ± SEM, 52.5 ± 8.3 vs 2.0 ± 0.9; p < 0.001) and Lac/TG (824.0 ± 168.2 vs 394.1 ± 89.8; p < 0.05) than those with simple steatosis. The area under the ROC curve for distinguishing NASH from simple steatosis was 1.00 (95% confidence interval: 1.00, 1.00) with Ala and 0.782 (95% confidence interval: 0.61, 0.96) with Lac/TG. The Ala and Lac/TG levels correlated well with steatosis grade, lobular inflammation, and NAFLD activity scores. The metabolic changes seen in humans were reproducible in a mouse model induced by streptozotocin injection and a high-fat diet. Conclusion: 1H-MRS would be useful for differentiating patients with NASH from those with simple hepatic steatosis.
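The area under the ROC curve reported above can be obtained rank-wise from the two groups' metabolite concentrations via the Mann-Whitney statistic; a minimal sketch with hypothetical values (not the study's data):

```python
def rank_auc(cases, controls):
    """Mann-Whitney estimate of the area under the ROC curve: the fraction
    of case/control pairs in which the case has the higher value (ties count half)."""
    wins = sum((x > y) + 0.5 * (x == y) for x in cases for y in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical Ala concentrations: NASH patients vs simple steatosis
nash = [40.1, 52.5, 61.0, 48.3]
steatosis = [1.2, 2.0, 2.9, 1.7]
auc = rank_auc(nash, steatosis)   # perfect separation gives 1.0
```

An AUC of 1.0 corresponds to the perfect group separation reported for Ala.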

Keywords: non-alcoholic fatty liver disease, non-alcoholic steatohepatitis, 1H MR spectroscopy, hepatic metabolites

Procedia PDF Downloads 313
952 Important Factors Affecting the Effectiveness of Quality Control Circles

Authors: Sogol Zarafshan

Abstract:

The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital and to rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five nursing experts working in a hospital, selected using a purposive sampling method. In addition, a sample of 107 nurses was selected through simple random sampling using their employee codes and a random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed through the opinions of the participating experts and academic members, as well as by confirmatory factor analysis; its reliability was also verified (α = 0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, together with the VIKOR–GRA and IPA methods. The ranking of factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks were 'Managers' and supervisors' support' and 'Group leadership', respectively. The hospital performed best on factors such as 'Clear goals and objectives' and 'Group cohesiveness and homogeneity', and worst on 'Reward system' and 'Feedback system'. The results showed that although 'Training the members', 'Using the right tools', and 'Reward system' were of great importance, the organization's performance on these factors was poor. These factors should therefore receive more attention from the hospital's managers and be improved as soon as possible.
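Grey Relational Analysis, one half of the hybrid ranking method used here, scores each factor by its closeness to an ideal reference sequence. A minimal sketch under the assumption of benefit-type criteria (the ratings and criteria below are hypothetical, not the study's questionnaire data):

```python
def grey_relational_grades(matrix, zeta=0.5):
    """Grey relational grade of each alternative (row) versus the ideal
    reference sequence; all criteria (columns) assumed benefit-type."""
    cols = list(zip(*matrix))
    # normalise each criterion to [0, 1], larger-is-better
    norm = [[(v - min(c)) / (max(c) - min(c)) for v, c in zip(row, cols)]
            for row in matrix]
    deltas = [[1.0 - v for v in row] for row in norm]   # distance to the ideal (=1)
    dmin = min(min(r) for r in deltas)
    dmax = max(max(r) for r in deltas)
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
             for row in deltas]
    return [sum(row) / len(row) for row in coeff]

# Hypothetical ratings of three factors on two criteria (importance, feasibility)
grades = grey_relational_grades([[9, 0.8], [6, 0.9], [4, 0.5]])
```

Higher grades indicate alternatives closer to the ideal; the distinguishing coefficient `zeta = 0.5` is the conventional default.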

Keywords: quality control circles, fuzzy VIKOR, grey relational analysis, importance–performance analysis

Procedia PDF Downloads 117
951 Effect of Mercerization on Coconut Fiber Surface Condition

Authors: Sphiwe Simelane, Daniel Madyira

Abstract:

The use of natural fibers requires that they be treated in preparation for use in natural fiber-reinforced polymer composites. This paper reports on the effects of sodium hydroxide (NaOH) treatment on the surface of coconut fibers. The fibers were subjected to 5%, 10%, 15%, and 20% NaOH concentrations, soaked for 4 hours, thoroughly rinsed, and allowed to dry in the open air for seven days, after which they were dried in an oven for 30 minutes. Untreated and treated coconut fibers were observed under the scanning electron microscope, and it was noted that the surface structure of the fibers was modified differently by the different NaOH concentrations: the resultant colour of the treated fibers became darker as the solution concentration increased, and the texture felt rougher to the touch as a result of the erosion of the fiber surface. Further, the increase in alkali concentration stripped the surface of more constituents, exposing 'pits' and other surface components and rendering the surface rough.

Keywords: coconut fiber, scanning electron microscope, sodium hydroxide, surface treatment

Procedia PDF Downloads 184
950 Postoperative Budesonide Nasal Irrigation vs Normal Saline Irrigation for Chronic Rhinosinusitis: A Systematic Review and Meta-Analysis

Authors: Rakan Hassan M. Alzahrani, Ziyad Alzahrani, Bader Bashrahil, Abdulrahman Elyasi, Abdullah A. Ghaddaf, Rayan Alzahrani, Mohammed Alkathlan, Nawaf Alghamdi, Dakheelallah Almutairi

Abstract:

Background: Corticosteroid irrigations, which commonly involve the off-label use of budesonide mixed with normal saline in high-volume sinonasal irrigations, have become more widely used in the management of post-operative chronic rhinosinusitis (CRS). Objective: This article attempted to measure the efficacy of post-operative budesonide nasal irrigation compared to normal-saline-alone nasal irrigation in the management of CRS through a systematic review and meta-analysis of randomized controlled trials (RCTs). Methods: The PubMed, Embase, and Cochrane Central Register of Controlled Trials databases were searched by two independent authors. Only RCTs comparing budesonide irrigation to normal-saline-alone irrigation for CRS with or without polyposis after functional endoscopic sinus surgery (FESS) were eligible. A random-effects analysis model was applied to the reported CRS-related quality of life (QOL) measures and the objective endoscopic assessment scales of the disease. Results: Only 6 RCTs met the eligibility criteria, with a total of 356 participants. Compared to normal saline irrigation, budesonide nasal irrigation showed statistically significant improvements in both CRS-related QOL and endoscopic findings (MD = -4.22, confidence interval [CI]: -5.63, -2.82 [P < 0.00001]; SMD = -0.50, CI: -0.93, -0.06 [P < 0.03], respectively). Conclusion: Both intervention arms showed improvements in CRS-related QOL and endoscopic findings in post-FESS chronic rhinosinusitis with or without polyposis. However, budesonide irrigation seems to have a slight edge over conventional normal saline irrigation, with no reported serious side effects, including hypothalamic-pituitary-adrenal (HPA) axis suppression.
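The random-effects pooling behind a reported mean difference and its confidence interval can be sketched with the DerSimonian-Laird method (the per-trial effects and variances below are hypothetical, not the review's data):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) with a 95% CI."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-trial mean differences in a QOL score and their variances
pooled, ci = dersimonian_laird([-5.1, -3.4, -4.6], [1.2, 0.8, 2.0])
```

With no between-study heterogeneity (Q below its degrees of freedom), tau-squared is clamped to zero and the estimate reduces to the fixed-effect inverse-variance mean.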

Keywords: Budesonide, chronic rhinosinusitis, corticosteroids, nasal irrigation, normal saline

Procedia PDF Downloads 60
949 Refining Scheme Using Amphibious Epistemologies

Authors: David Blaine, George Raschbaum

Abstract:

The evaluation of DHCP has synthesized SCSI disks, and current trends suggest that the exploration of e-business that would allow for further study into robots will soon emerge. Given the current status of embedded algorithms, hackers worldwide obviously desire the exploration of replication, which embodies the confusing principles of programming languages. In our research we concentrate our efforts on arguing that erasure coding can be made "fuzzy", encrypted, and game-theoretic.

Keywords: SCSI disks, robot, algorithm, hacking, programming language

Procedia PDF Downloads 403
948 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with their prominent features, and the significant problems or issue domain that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to correctly predict based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high accuracy prediction, with a focus on the most important characteristics.

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 69
947 Using Derivative Free Method to Improve the Error Estimation of Numerical Quadrature

Authors: Chin-Yun Chen

Abstract:

Numerical integration is an essential tool for deriving different physical quantities in engineering and science. The effectiveness of a numerical integrator depends on several factors, the most crucial of which is error estimation. This work presents an error estimator that incorporates a derivative-free method to improve the performance of verified numerical quadrature.
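One common derivative-free way to estimate quadrature error is to compare two rules of different order, using the higher-order result as a proxy for the true value. The abstract does not detail the authors' own estimator, so the following is only an illustrative stand-in:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def simpson(f, a, b, n):
    """Composite Simpson rule; n must be even."""
    h = (b - a) / n
    s = (f(a) + f(b)
         + 4 * sum(f(a + i * h) for i in range(1, n, 2))
         + 2 * sum(f(a + i * h) for i in range(2, n, 2)))
    return s * h / 3

t = trapezoid(math.sin, 0.0, math.pi, 64)
s = simpson(math.sin, 0.0, math.pi, 64)
err_estimate = abs(s - t)   # estimate of the trapezoid error, no derivatives needed
err_true = abs(2.0 - t)     # exact integral of sin on [0, pi] is 2
```

Because Simpson's rule converges much faster, the difference `|s - t|` tracks the trapezoid error closely without evaluating any derivative of the integrand.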

Keywords: numerical quadrature, error estimation, derivative free method, interval computation

Procedia PDF Downloads 442
946 Overview of Time, Resource and Cost Planning Techniques in Construction Management Research

Authors: R. Gupta, P. Jain, S. Das

Abstract:

One way to approach the construction scheduling optimization problem is to focus on the individual aspects of planning, which can be broadly classified as time scheduling, crew and resource management, and cost control. During the last four decades, construction planning has seen a great deal of research, but to date, no paper has attempted to summarize the available literature under these important heads. This paper addresses each of these aspects separately and presents the findings of an in-depth literature review of the various planning techniques. For techniques dealing with time scheduling, the authors have adopted a roughly chronological documentation. For crew and resource management, classification has been done on the basis of the different steps involved in the resource planning process. For cost control, techniques dealing with the estimation of costs and with the subsequent optimization of costs are dealt with separately.

Keywords: construction planning techniques, time scheduling, resource planning, cost control

Procedia PDF Downloads 467
945 Surface Modified Electrospun Expanded Polystyrene Fibre with Superhydrophobic/Superoleophilic Properties as Potential Oil Membrane

Authors: S. Oluwagbemiga Alayande, E. Olugbenga Dare, Titus A. M. Msagati, A. Kehinde Akinlabi , P. O. Aiyedun

Abstract:

This paper presents a low-cost procedure for the preparation of a potential oil membrane with superhydrophobic/superoleophilic properties for selective removal of crude oil from water. In this study, expanded polystyrene (EPS) was electrospun to produce beaded fibers, and zeolite was introduced into the polymer matrix in order to impart a rough surface to the non-beaded fiber. Films of the EPS and EPS/zeolite solutions were also made for comparative study. The electrospun EPS and EPS/zeolite fibers and the resultant films were characterized using SEM, BET, FTIR, and optical contact angle measurements. The fibers exhibited superhydrophobic and superoleophilic wetting behaviour with water and crude oil, respectively. This selective removal of crude oil presents a new opportunity for the reuse of EPS as an adsorbent in the petroleum/petrochemical industry.

Keywords: expanded polystyrene, superhydrophobic, superoleophilic, oil membrane

Procedia PDF Downloads 448
944 The Effect of Multiple Environmental Conditions on Acacia senegal Seedling’s Carbon, Nitrogen, and Hydrogen Contents: An Experimental Investigation

Authors: Abdelmoniem A. Attaelmanan, Ahmed A. H. Siddig

Abstract:

This study was conducted in light of continual global climate change, which is projected to increase aridity and to alter soil fertility and pollution. Plant growth and development largely depend on the combination of available water and nutrients in the soil, and changes in climate and atmospheric chemistry can seriously affect these growth factors. Plant carbon (C), nitrogen (N), and hydrogen (H) play a fundamental role in the maintenance of ecosystem structure and function. Hashab (Acacia senegal), which produces gum arabic, supports dryland ecosystems in tropical zones through its potential to restore degraded soils; hence it is ecologically and economically important for the dry areas of sub-Saharan Africa. The study aims at investigating the effects of water stress (simulated drought) and poor soil type on Acacia senegal C, N, and H contents. Seven-day-old seedlings were assigned to the treatments in a split-plot design for four weeks: the main plot was irrigation interval (well-watered and water-stressed) and the subplot was soil type (silt and sand soils). Seedling C%, N%, and H% were measured using a CHNS-O analyzer following the standard test method. Irrigation interval and soil type had no effect on whole-seedling or leaf C%, N%, and H%; irrigation interval affected stem C% and H%; both irrigation interval and soil type affected root N%; and an interaction effect of water and soil was found on leaf and root N%. In synthesis, combining well-watered irrigation with soil rich in N and other nutrients would yield the greatest seedling C, N, and H content, enhancing growth and biomass accumulation, and could play a crucial role in ecosystem productivity and services in dryland regions.

Keywords: Acacia senegal, Africa, climate change, drylands, nutrients biomass, Sub-Saharan, Sudan

Procedia PDF Downloads 97
943 Low-Cost Image Processing System for Evaluating Pavement Surface Distress

Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa

Abstract:

Most asphalt pavement condition evaluations use rating frameworks in which pavement distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost image-based technique for pavement distress analysis that permits the identification of potholes and cracks. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering followed by a spectral clustering algorithm. The framework comprises three phases: image acquisition, processing, and extraction of features. A digital camera (GoPro) mounted on a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral clustering algorithms are used to compute features and classify the longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis on selected urban stretches of Bengaluru city, India. Image evaluation with the semi-automated image processing framework captured the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images were validated against actual dimensions, with a dimension variability of about 0.46. The linear regression model y = 1.171x - 0.155 was obtained from the existing and image-processed areas. The R² value obtained from the best-fit line is 0.807, which in the linear regression model is considered a 'large positive linear association'.
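Fuzzy C-Means, the clustering step named above, gives each pixel a graded membership in every cluster rather than a hard label; dark crack or pothole pixels and bright pavement pixels then separate by intensity. A minimal one-dimensional sketch with hypothetical intensity values:

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy C-means: returns cluster centers and each point's
    graded membership in every cluster (each membership row sums to 1)."""
    pts = sorted(points)
    centers = [pts[i * (len(pts) - 1) // (c - 1)] for i in range(c)]  # spread init
    u = []
    for _ in range(iters):
        # membership update: inverse-distance ratios raised to 2/(m-1)
        u = [[1.0 / sum((max(abs(x - centers[i]), 1e-12) /
                         max(abs(x - centers[j]), 1e-12)) ** (2.0 / (m - 1.0))
                        for j in range(c))
              for i in range(c)]
             for x in points]
        # center update: membership-weighted means
        centers = [sum(u[k][i] ** m * points[k] for k in range(len(points))) /
                   sum(u[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return centers, u

# Hypothetical normalised pixel intensities: dark distress pixels vs bright pavement
intensities = [0.05, 0.08, 0.12, 0.10, 0.78, 0.85, 0.90, 0.82]
centers, memberships = fuzzy_c_means(intensities)
```

The fuzzifier `m = 2` is the conventional choice; a real pipeline would cluster 2-D pixel features rather than raw 1-D intensities.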

Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means

Procedia PDF Downloads 164
942 Improvement on a CNC Gantry Machine Structure Design for Higher Machining Speed Capability

Authors: Ahmed A. D. Sarhan, S. R. Besharaty, Javad Akbaria, M. Hamdi

Abstract:

The capability of CNC gantry milling machines to manufacture long components has led to their expanded use. On the other hand, the rigidity of the machine gantry can degrade under severe loads or vibration during operation. Indeed, the quality of machining depends on the machine's dynamic behavior throughout the operating process. For this reason, this type of machine has typically been operated conservatively and inefficiently: it can usually be employed for rough machining but may not produce an adequate surface finish. In this paper, a CNC gantry milling machine with the potential to produce a good surface finish has been designed and analyzed. The lowest natural frequency of this machine is 202 Hz at all motion amplitudes, with a full range of suitable frequency responses. Meanwhile, the maximum deformation under dead loads for the gantry machine is 0.565 µm, indicating that this machine tool is capable of producing higher product quality.

Keywords: frequency response, finite element, gantry machine, gantry design, static and dynamic analysis

Procedia PDF Downloads 336
941 Linguistic Summarization of Structured Patent Data

Authors: E. Y. Igde, S. Aydogan, F. E. Boran, D. Akay

Abstract:

Patent data play an increasingly important role in economic growth, innovation, technical advantage, and business strategy, and even in competition between countries. Analyzing patent data is crucial, since patents cover a large part of all the technological information in the world. In this paper, we have used the linguistic summarization technique to assess the validity of hypotheses about patent data stated in the literature.
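Linguistic summarization in the Yager style scores a statement such as "most patents are highly cited" by applying a fuzzy quantifier to the mean membership of the records in the summarizer. A minimal sketch with hypothetical membership values (the abstract does not specify the authors' quantifiers or summarizers):

```python
def most(p):
    """Piecewise-linear fuzzy quantifier 'most' (Zadeh-style)."""
    return max(0.0, min(1.0, 2.0 * p - 0.6))

def truth_of_summary(memberships, quantifier):
    """Degree of truth of 'Q of the records are S' (Yager): the quantifier
    applied to the mean membership of the summarizer S over all records."""
    return quantifier(sum(memberships) / len(memberships))

# Hypothetical memberships of five patents in the fuzzy set 'highly cited'
mu = [0.9, 0.8, 0.2, 0.7, 0.95]
truth = truth_of_summary(mu, most)   # truth of 'most patents are highly cited'
```

A hypothesis from the literature would be kept or rejected depending on whether its summary's degree of truth exceeds a chosen threshold.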

Keywords: data mining, fuzzy sets, linguistic summarization, patent data

Procedia PDF Downloads 254
940 Recurrence of Pterygium after Surgery and the Effect of Surgical Technique on the Recurrence of Pterygium in Patients with Pterygium

Authors: Luksanaporn Krungkraipetch

Abstract:

A pterygium is an ocular surface lesion that begins in the limbal conjunctiva and progresses to the cornea. The lesion is more common at the nasal limbus than the temporal and has a distinctive wing-like appearance. Indications for surgery, in decreasing order of significance, are growth over the corneal center, decreased vision due to corneal deformation, documented growth, sensations of discomfort, and aesthetic concerns. Recurrent pterygium results in lost time, the expense of therapy, and the potential for vision impairment. The objective of this study is to determine how often pterygium recurs after surgery, what effect the surgical technique has, and what factors drive recurrence. Materials and Methods: This observational, retrospective case-control study analyzed 164 patient records. Descriptive statistics summarize the pterygium surgeries and the risk of recurrence; for factor analysis, the odds ratio (OR) with 95% confidence interval (CI) and ANOVA are used. A p-value below 0.05 was deemed statistically significant. Results: The majority of patients were female (60.4%). Twenty-four of the 164 (14.6%) patients who underwent surgery exhibited recurrent pterygium. The average age was 55.33 years. Postoperative recurrence was reported in 19 cases (79.2%) of bare sclera techniques and five cases (20.8%) of conjunctival autograft techniques. The mean recurrence interval was 10.25 months, with 12 months the most common (54.17%). Follow-up was completed in 91.67% of cases. The most common recurrence grade was 1 (25%). The main surgical complication was subconjunctival hemorrhage (33.33%). Comparison of the surgical techniques among patients with recurrent pterygium showed no significant difference (F = 1.13, p = 0.339). Age significantly affected the recurrence of pterygium (OR = 20.78, 95% CI 6.79-63.56; p < 0.001). Conclusion: This study found a 14.6% rate of pterygium recurrence after surgery. Across all surgeries and patients, the rate of recurrence was four times higher with the bare sclera method than with conjunctival autograft. The researchers advise selecting a more conventional surgical technique to avoid recurrence.
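An odds ratio and its 95% confidence interval follow the standard 2x2-table computation (Woolf's log method); a minimal sketch with hypothetical counts, since the abstract does not give the full contingency table:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and 95% CI (Woolf's log method) for a 2x2 table:
    a = exposed & recurred,   b = exposed & healed,
    c = unexposed & recurred, d = unexposed & healed."""
    or_val = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_val) - 1.96 * se)
    hi = math.exp(math.log(or_val) + 1.96 * se)
    return or_val, (lo, hi)

# Hypothetical counts: bare sclera (exposed) vs conjunctival autograft (unexposed)
or_val, (lo, hi) = odds_ratio_ci(19, 50, 5, 90)
```

The interval is symmetric on the log scale, which is why the reported CI (6.79-63.56) is skewed around the point estimate 20.78 on the natural scale.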

Keywords: pterygium, recurrence pterygium, pterygium surgery, excision pterygium

Procedia PDF Downloads 77
939 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure

Authors: Esra Zengin, Sinan Akkar

Abstract:

Reliable and accurate prediction of nonlinear structural response requires the specification of appropriate earthquake ground motions for use in nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in the performance-based seismic design and assessment of structures. Utilizing amplitude-scaled ground motions that match a target spectrum is a commonly used technique for estimating nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, the Uniform Hazard Spectrum (UHS), the Conditional Mean Spectrum (CMS), or the Conditional Spectrum (CS). Different sets of criteria exist among these methodologies for selecting and scaling ground motions with the objective of obtaining a robust estimate of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand together with the level of ground motion dispersion. The proposed methodology provides a set of ground motions whose response spectra match the target median and corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on minimizing the error between the scaled median and the target spectrum, while the dispersion of the earthquake shaking is preserved along the period interval. The impact of spectral variability on the nonlinear response distribution is investigated at the level of inelastic single-degree-of-freedom systems. To see the effect of different selection and scaling methodologies on fragility curve estimates, the results are compared with those obtained by a CMS-based scaling methodology. The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
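The scaling stage described above can be sketched numerically; the following is a minimal illustration of matching an ensemble median to a target spectrum with per-record least-squares factors in log space (the full procedure, which also controls the target variance, is not reproduced here):

```python
import numpy as np

def scale_records(records_sa, target_median):
    """Scale each record so the ensemble log-median matches the target.

    records_sa: (n_records, n_periods) spectral accelerations
    target_median: (n_periods,) target median spectrum
    Returns the scale factors and the scaled ensemble.
    """
    records_sa = np.asarray(records_sa, dtype=float)
    target_median = np.asarray(target_median, dtype=float)
    # per-record least-squares factor in log space:
    # ln s = mean over periods of (ln target - ln Sa)
    ln_s = np.mean(np.log(target_median) - np.log(records_sa), axis=1)
    factors = np.exp(ln_s)
    scaled = records_sa * factors[:, None]
    return factors, scaled
```

Because amplitude scaling shifts each record's log spectrum by a constant, each record's spectral shape is preserved across the period interval.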

Keywords: ground motion selection, scaling, uncertainty, fragility curve

Procedia PDF Downloads 570
938 Comprehensive Risk Analysis of Decommissioning Activities with Multifaceted Hazard Factors

Authors: Hyeon-Kyo Lim, Hyunjung Kim, Kune-Woo Lee

Abstract:

The decommissioning of nuclear facilities can be said to consist of a sequence of problem-solving activities, partly because the working environments may be contaminated by radiological exposure, and partly because industrial hazards such as fire, explosions, toxic materials, and electrical and physical hazards may also be present. For individual hazard factors, risk assessment techniques are becoming familiar to industrial workers with advances in safety technology, but methods for integrating their results are not. Furthermore, few workers have extensive past experience with decommissioning operations. Therefore, many countries have been trying to develop appropriate techniques to guarantee the safety and efficiency of the process. In spite of that, neither domestic nor international standards yet exist, since nuclear facilities are highly diverse and unique. Consequently, the overall risk must inevitably be anticipated and assessed situation by situation. This paper aimed to find an appropriate technique to integrate individual risk assessment results from the viewpoint of experts. Thus, on the one hand, the whole risk assessment activity for decommissioning operations was modeled as a sequence of individual risk assessment steps, and on the other, a hierarchical risk structure was developed. Then, a risk assessment procedure that can elicit individual hazard factors one by one was introduced with reference to the standard operating procedure (SOP) and hierarchical task analysis (HTA). Under an assumption of quantification and normalization of individual risks, a technique to estimate relative weight factors was attempted using the conventional Analytic Hierarchy Process (AHP), and its result was reviewed against the judgment of experts. In addition, taking the ambiguity of human judgment into consideration, a discussion based on fuzzy inference was added with a mathematical case study.
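The AHP weighting step can be sketched as follows: relative weights are taken from the principal eigenvector of a reciprocal pairwise comparison matrix, with a consistency index flagging incoherent judgments (a generic illustration of conventional AHP, not the paper's specific hazard hierarchy):

```python
import numpy as np

def ahp_weights(pairwise):
    """Relative weight factors from a reciprocal pairwise comparison
    matrix, using the principal eigenvector (conventional AHP).
    Returns the normalized weights and the dominant eigenvalue."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

def consistency_index(lam_max, n):
    """CI = (lambda_max - n) / (n - 1); CI near 0 means the pairwise
    judgments are mutually consistent."""
    return (lam_max - n) / (n - 1)
```

For a perfectly consistent matrix the dominant eigenvalue equals the matrix order, so the consistency index is zero.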

Keywords: decommissioning, risk assessment, analytic hierarchy process (AHP), fuzzy inference

Procedia PDF Downloads 411
937 Numerical Simulation of the Kurtosis Effect on the EHL Problem

Authors: S. Gao, S. Srirattayawong

Abstract:

In this study, a computational fluid dynamics (CFD) model has been developed for studying the effect of the surface roughness profile on the elastohydrodynamic lubrication (EHL) problem. The cylinders' contact geometry, meshing, and solution of the mass and momentum conservation equations are carried out using the commercial software packages ICEM CFD and ANSYS Fluent. User-defined functions (UDFs) for the density, viscosity, and elastic deformation of the cylinders as functions of pressure and temperature have been defined for the CFD model. Three different surface roughness profiles are created and incorporated into the CFD model. It is found that the developed CFD model can predict the characteristics of fluid flow and heat transfer in the EHL problem, including the leading parameters such as the pressure distribution, minimum film thickness, and viscosity and density changes. The obtained results show that the pressure profile at the center of the contact area relates directly to the roughness amplitude. Rough surfaces with a kurtosis value above 3 produce a more strongly fluctuating pressure distribution than the other cases.
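The kurtosis used to characterize the roughness profiles can be computed as below; this is a minimal sketch of the standard Rku definition for a measured profile (the specific profiles used in the study are not reproduced):

```python
import numpy as np

def profile_kurtosis(z):
    """Kurtosis Rku of a roughness profile z (heights about the mean line):
    Rku = mean((z - mean)^4) / Rq^4, where Rq is the RMS roughness.
    Gaussian-like surfaces give Rku close to 3; spiky surfaces give Rku > 3."""
    z = np.asarray(z, dtype=float)
    d = z - z.mean()
    rq = np.sqrt(np.mean(d ** 2))
    return np.mean(d ** 4) / rq ** 4
```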

Keywords: CFD, EHL, kurtosis, surface roughness

Procedia PDF Downloads 300
936 Effectiveness of High-Intensity Interval Training in Overweight Individuals between 25-45 Years of Age Registered in Sports Medicine Clinic, General Hospital Kalutara

Authors: Dimuthu Manage

Abstract:

Introduction: The prevalence of obesity and obesity-related non-communicable diseases is becoming a massive health concern worldwide. Physical activity is recognized as an effective solution to this problem. Published data on the effectiveness of High-Intensity Interval Training (HIIT) in improving health parameters in overweight and obese individuals in Sri Lanka are sparse; hence, this study was conducted. Methodology: This is a quasi-experimental study conducted at the Sports Medicine Clinic, General Hospital, Kalutara. Participants engaged in a programme of HIIT three times per week for six weeks. Data collection was based on precise measurements using structured and validated methods. Ethical clearance was obtained. Results: Forty-eight participants registered for the study, and only 52% completed it. The mean age was 32 (SD = 6.397) years, and 64% were male. All the anthropometric measurements assessed (i.e., waist circumference (P < 0.001), weight (P < 0.001), and BMI (P < 0.001)), body fat percentage (P < 0.001), VO2 max (P < 0.001), and the lipid profile (i.e., HDL (P = 0.016), LDL (P < 0.001), cholesterol (P < 0.001), triglycerides (P < 0.010), and LDL:HDL (P < 0.001)) showed statistically significant improvement after the intervention with the HIIT programme. Conclusions: This study confirms HIIT as a time-saving and effective exercise method that helps in preventing obesity as well as non-communicable diseases. HIIT markedly improves body anthropometry, fat percentage, cardiopulmonary status, and lipid profile in overweight and obese individuals. As with the majority of studies, the design of the current study is subject to some limitations. First, the study was correlational; a comparative design against other training programmes would have given more validity. Second, although validated tools were used to measure the variables, with the same tools used on pre- and post-exercise occasions within the available facilities, it would have been better to measure some of them using gold-standard methods. However, this evidence should be further assessed in larger-scale trials using comparison groups to generalize the efficacy of the HIIT exercise programme.

Keywords: HIIT, lipid profile, BMI, VO2 max

Procedia PDF Downloads 53
935 Modelling of Atomic Force Microscopic Nano Robot's Friction Force on Rough Surfaces

Authors: M. Kharazmi, M. Zakeri, M. Packirisamy, J. Faraji

Abstract:

Micro/nanorobotics, or the manipulation of nanoparticles by an Atomic Force Microscope (AFM), is one of the most important approaches for controlling the movement of atoms, particles, and micro/nanometric components and assembling them into micro/nanometer-scale tools. Accurate modelling of manipulation requires identification of the forces and mechanical behaviour at the nanoscale, which differ from those in the macro world. Owing to the importance of adhesion forces and surface interactions at the nanoscale, several friction models have been presented. In this research, the friction and normal forces applied on the AFM are obtained using the dynamic bending-torsion model of the AFM, based on the Hurtado-Kim friction model (HK), the Johnson-Kendall-Roberts contact model (JKR), and the Greenwood-Williamson roughness model (GW). Finally, the effect of the standard deviation of asperity heights on the normal load, friction force, and friction coefficient is studied.
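For reference, the core JKR adhesion relations can be sketched as follows; this is a generic illustration of the contact model for a sphere on a flat, not the coupled bending-torsion formulation used in the paper:

```python
import math

def jkr_pulloff_force(R, w):
    """JKR pull-off (adhesion) force for a sphere of radius R on a flat,
    with work of adhesion w: F_c = (3/2) * pi * w * R."""
    return 1.5 * math.pi * w * R

def jkr_contact_radius(F, R, w, K):
    """JKR contact radius a for applied load F; K is the reduced elastic
    modulus. With w = 0 the expression reduces to the Hertz contact radius:
    a^3 = (R/K) * (F + 3*pi*w*R + sqrt(6*pi*w*R*F + (3*pi*w*R)^2))."""
    term = 3.0 * math.pi * w * R
    a3 = (R / K) * (F + term + math.sqrt(2.0 * F * term + term ** 2))
    return a3 ** (1.0 / 3.0)
```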

Keywords: atomic force microscopy, contact model, friction coefficient, Greenwood-Williamson model

Procedia PDF Downloads 186
934 Normal Spectral Emissivity of Roughened Aluminum Alloy AL 6061 Surfaces at High Temperature

Authors: Sumeet Kumar, C. V. Krishnamurthy, Krishnan Balasubramaniam

Abstract:

The normal spectral emissivity of Al 6061 alloy with different surface finishes was experimentally measured at 833 K. Four different samples were prepared by polishing the alloy surfaces with 80-, 220-, and 600-grit SiC abrasive papers and with diamond paste. The samples were heated in air for 6 h at 833 K, and the emissivity was measured during the process using pyrometers operating at wavelengths of 3.9, 5.14, and 7.8 μm. The results indicated that the emissivity increased with heating time, and that the rate of increase was rapid during the initial stage of heating compared with the later stage. This appears to be because the oxidation process follows a parabolic rate law. Further, it was found that the increase in emissivity with heating time was higher for rough surfaces than for polished surfaces. The results were analyzed at all three wavelengths, and qualitatively similar trends were obtained for each. In this way, the emissivity of the alloy can be increased by roughening the surfaces and heating at high temperature until the surfaces are oxidized.
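The parabolic rate law invoked above can be sketched numerically; the rate constant kp below is a placeholder, not a value fitted from these experiments:

```python
import math

def oxide_thickness(t, kp):
    """Parabolic oxidation rate law: x^2 = kp * t, i.e. x = sqrt(kp * t).
    Oxide growth (and the emissivity rise it drives) is fast at first and
    slows as the layer thickens, since dx/dt = kp / (2 x)."""
    return math.sqrt(kp * t)
```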

Keywords: aluminum alloy, high temperature, normal spectral emissivity, surface roughness

Procedia PDF Downloads 198
933 Using Support Vector Machines for Measuring Democracy

Authors: Tommy Krieger, Klaus Gruendler

Abstract:

We present a novel approach for measuring democracy that enables a very detailed and sensitive index. The method is based on Support Vector Machines, a mathematical algorithm for pattern recognition. Our implementation evaluates 188 countries for the period between 1981 and 2011. The Support Vector Machines Democracy Index (SVMDI) is continuous on the unit interval [0, 1] and robust to variations in the numerical process parameters. The algorithm introduced here can be applied to any concept of democracy without additional adjustments, and due to its flexibility, it is also a valuable tool for comparative studies.
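A minimal sketch of such an index pipeline using scikit-learn's support vector regression is given below; the paper's actual indicator set, training labels, and tuning are not specified here, so the feature matrix `X` and scores `y` are placeholders:

```python
import numpy as np
from sklearn.svm import SVR

def fit_democracy_index(X, y):
    """Illustrative setup: regress a continuous democracy score from
    country-level indicator vectors X with support vector regression.
    Kernel and hyperparameters are assumptions, not the paper's choices."""
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01)
    model.fit(X, y)
    return model

def predict_index(model, X):
    # clip the raw SVR output onto the unit interval, so the index
    # stays continuous on [0, 1]
    return np.clip(model.predict(X), 0.0, 1.0)
```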

Keywords: democracy, democracy index, machine learning, support vector machines

Procedia PDF Downloads 355
932 Sliding Velocity in Impact with Friction in Three-Dimensional Multibody Systems

Authors: Hesham A. Elkaranshawy, Amr Abdelrazek, Hosam Ezzat

Abstract:

This paper analyzes a single-point rough collision in three-dimensional rigid multibody systems. A set of nonlinear differential equations describing the progress and outcome of the impact is obtained. The tangential component of the impact velocity, referred to as sliding, is of particular importance. Numerical methods are used to solve the problem. In this work, all possible sliding behaviors during impact are identified, the conditions leading to each behavior are specified, and an appropriate numerical procedure is suggested. The case of a four-degree-of-freedom spatial robot that collides with its environment is investigated. The phase portrait of the tangential velocity, which presents the flow trajectories for different initial conditions, is calculated. Using the coefficient of friction as a control parameter, several phase portraits are drawn, each for a specific value of this coefficient. In addition, the bifurcation associated with the variation of this coefficient is investigated.
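Such a phase portrait can be assembled by integrating the sliding-velocity field from many initial conditions; the Coulomb-type field below is only an illustrative stand-in for the paper's impact equations:

```python
import math

def trajectory(f, x0, y0, dt=1e-3, steps=5000):
    """Euler-integrate a planar vector field f(x, y) -> (dx, dy) from one
    initial condition; a family of such trajectories, one per initial
    condition, forms the phase portrait of the tangential velocity."""
    xs, ys = [x0], [y0]
    x, y = x0, y0
    for _ in range(steps):
        dx, dy = f(x, y)
        x += dt * dx
        y += dt * dy
        xs.append(x)
        ys.append(y)
    return xs, ys

def coulomb_field(vx, vy, mu=0.5):
    """Illustrative field: isotropic Coulomb friction decelerates the
    sliding velocity with constant magnitude mu, opposite to its direction."""
    s = math.hypot(vx, vy)
    if s < 1e-9:
        return (0.0, 0.0)
    return (-mu * vx / s, -mu * vy / s)
```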

Keywords: friction impact, three-dimensional rigid multibody systems, sliding velocity, nonlinear ordinary differential equations, phase portrait

Procedia PDF Downloads 370
931 Effect of the Workpiece Position on the Manufacturing Tolerances

Authors: Rahou Mohamed, Sebaa Fethi, Cheikh Abdelmadjid

Abstract:

Manufacturing tolerancing is intended to determine the intermediate geometrical and dimensional states of the part during its manufacturing process. These manufacturing dimensions serve to satisfy not only the functional requirements given in the definition drawing but also the manufacturing constraints, for example, geometrical defects of the machine, vibration, and wear of the cutting tool. The choice of workpiece positioning has an important influence on the cost and quality of manufacture. To address this problem, a two-step approach has been developed. The first step is dedicated to determining the optimum position; in the second step, the effect of tightening (clamping) on the tolerance interval is studied.

Keywords: dispersion, tolerance, manufacturing, position

Procedia PDF Downloads 321
930 Fracture and Fatigue Crack Growth Analysis and Modeling

Authors: Volkmar Nolting

Abstract:

Fatigue crack growth prediction has become an important topic in both engineering and non-destructive evaluation. Crack propagation is influenced by the mechanical properties of the material and is conveniently modelled by the Paris-Erdogan equation. The critical crack size and the total number of load cycles are calculated. From a Larson-Miller plot, the maximum operational temperature for a given stress level can be determined so that failure does not occur within a given time interval t. The study is used to determine a reasonable inspection cycle and thus enhances operational safety and reduces costs.
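The two quantities mentioned above can be sketched as follows: the critical crack size from the fracture toughness, and the cycle count by integrating the Paris-Erdogan law da/dN = C (ΔK)^m (a generic mode-I sketch with geometry factor Y; material constants are placeholders):

```python
import math

def critical_crack_size(Kic, delta_sigma, Y=1.0):
    """Critical crack size at which ΔK reaches the fracture toughness:
    a_c = (1/pi) * (Kic / (Y * delta_sigma))^2."""
    return (Kic / (Y * delta_sigma)) ** 2 / math.pi

def paris_cycles(a0, ac, C, m, delta_sigma, Y=1.0, steps=100000):
    """Load cycles to grow a crack from a0 to ac under the Paris-Erdogan
    law da/dN = C * (ΔK)^m, with ΔK = Y * Δσ * sqrt(pi * a).
    Integrates dN = da / (C * ΔK^m) with the midpoint rule."""
    n = 0.0
    da = (ac - a0) / steps
    a = a0
    for _ in range(steps):
        dk = Y * delta_sigma * math.sqrt(math.pi * (a + da / 2.0))
        n += da / (C * dk ** m)
        a += da
    return n
```

For m ≠ 2 the integral also has a closed form, which makes the numerical result easy to check.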

Keywords: fracture mechanics, crack growth prediction, lifetime of a component, structural health monitoring

Procedia PDF Downloads 24
929 Calculation of Instrumental Results of the Tohoku Earthquake, Japan (Mw 9.0) on March 11, 2011 and Other Destructive Earthquakes during Seismic Hazard Assessment

Authors: J. K. Karapetyan

Abstract:

In this paper, a seismological-statistical analysis of actual instrumental data on the main shock of the Great Japan (Tohoku) earthquake of 11 March 2011 is carried out to determine the dependence between the maximum values of peak ground acceleration (PGA) and epicentral distance. A number of peculiarities in the manifestation of maximum accelerations at long epicentral distances are revealed that do not correspond to current seismic intensity scales.
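One simple way to expose such peculiarities is to fit a baseline attenuation trend and inspect the residuals; the log-log least-squares form below is a generic sketch, not the study's specific regression model:

```python
import numpy as np

def fit_attenuation(distances_km, pga_g):
    """Fit a simple attenuation trend ln(PGA) = a + b * ln(R) by least
    squares. Records whose residuals deviate strongly (e.g., unexpectedly
    large PGA at long epicentral distances) stand out from the decay."""
    lnR = np.log(np.asarray(distances_km, dtype=float))
    lnP = np.log(np.asarray(pga_g, dtype=float))
    b, a = np.polyfit(lnR, lnP, 1)
    resid = lnP - (a + b * lnR)
    return a, b, resid
```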

Keywords: earthquakes, instrumental records, seismic hazard, Japan

Procedia PDF Downloads 354
928 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System

Authors: Benjamin Chijioke Agwah, Paulinus Chinaenye Eze

Abstract:

The antilock braking system (ABS) is one of the important contributions of the automobile industry, designed to ensure road safety by keeping vehicles steerable and stable during emergency braking. This paper presents a wheel-slip-based intelligent controller with variable zero lag compensation for ABS. The controller is required to achieve very fast, accurate wheel slip tracking under hard braking conditions and to eliminate chattering, with improved transient and steady-state performance, while shortening the stopping distance using an effective braking torque less than the maximum allowable torque to bring a braking vehicle to a stop. The dynamics of a vehicle braking from a velocity of 30 m s⁻¹ on a straight line were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent controller based on fuzzy logic (FLC) was designed, with a variable zero lag compensator (VZLC) added to enhance the performance of the FLC control variable by eliminating steady-state error and providing improved bandwidth to suppress high-frequency effects such as chattering during braking. The simulation results showed that the FLC-VZLC provided fast tracking of the desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (to 39.92 m), 63.3% (49.59 m), 57.6% (57.35 m), and 50% (69.13 m) on dry, wet, cobblestone, and snow road surfaces, respectively. Overall, the proposed system used an effective braking torque less than the maximum allowable braking torque to achieve efficient wheel slip tracking and robust control performance on different road surfaces.
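The wheel slip tracked by the controller is defined as below; the torque law shown is only an illustrative proportional stand-in (the paper's FLC-VZLC is not reproduced), and the reference slip, gain, and torque limit are placeholder values:

```python
def wheel_slip(v, omega, r):
    """Longitudinal wheel slip during braking: lam = (v - omega * r) / v,
    where v is vehicle speed, omega wheel angular speed, r wheel radius.
    lam = 0 means free rolling; lam = 1 means a locked wheel."""
    if v <= 0.0:
        return 0.0
    return (v - omega * r) / v

def brake_torque_command(slip, slip_ref=0.2, t_max=1200.0, kp=5000.0):
    """Illustrative proportional law: back off the braking torque when
    slip exceeds the reference, keeping the command between zero and the
    maximum allowable torque t_max."""
    t = kp * (slip_ref - slip) + 0.5 * t_max
    return max(0.0, min(t, t_max))
```

Tracking a reference slip near the friction peak (rather than locking the wheel) is what lets ABS shorten the stopping distance while the vehicle stays steerable.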

Keywords: ABS, fuzzy logic controller, variable zero lag compensator, wheel slip tracking

Procedia PDF Downloads 132