Search results for: type-2 fuzzy sets
700 Water-in-Diesel Fuel Nanoemulsions Prepared by Modified Low Energy: Emulsion Drop Size and Stability, Physical Properties, and Emission Characteristics
Authors: M. R. Noor El-Din, Marwa R. Mishrif, R. E. Morsi, E. A. El-Sharaky, M. E. Haseeb, Rania T. M. Ghanem
Abstract:
This paper studies the physical and rheological behaviour of water-in-diesel fuel nanoemulsions prepared by a modified low-energy method. Twenty water-in-diesel fuel nanoemulsions were prepared using mixed nonionic surfactants of sorbitan monooleate and polyoxyethylene sorbitan trioleate (MTS) at a Hydrophilic-Lipophilic Balance (HLB) value of 10 and a working temperature of 20°C. The influence of the prepared nanoemulsions on physical properties such as kinematic viscosity, density, and calorific value was studied. The nanoemulsion systems were also subjected to rheological evaluation. The effect of water loading percentage (5, 6, 7, 8, 9 and 10 wt.%) on rheology was assessed over a temperature range of 20 to 60°C at 10°C intervals, for ageing times of 0, 1, 2 and 3 months. Results show that all of the nanoemulsion sets exhibited Newtonian flow behaviour at low shear rates (in the range of 132 up to 191 1/s), followed by a shear-thinning region with a yield value (non-Newtonian behaviour) at high shear rates, for all water ratios (5 to 10 wt.%) and at all test temperatures (20 to 60°C) for ageing times of up to 3 months. Also, the viscosity-temperature relationship of all nanoemulsions fitted the Arrhenius equation well, with high correlation coefficients that confirm their Newtonian behaviour.
Keywords: alternative fuel, nanoemulsion, surfactant, diesel fuel
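The Arrhenius fit the abstract refers to can be sketched as follows. This is an illustrative example only: the viscosity and temperature values below are hypothetical placeholders, not the paper's measurements, and the fit is a plain least-squares regression of ln(η) against 1/T.

```python
import math

# Hypothetical viscosity (mPa·s) vs temperature (°C) data for one nanoemulsion;
# the actual measurements are in the paper, these values are illustrative only.
temps_c = [20, 30, 40, 50, 60]
viscosity = [5.8, 4.6, 3.8, 3.2, 2.7]

R = 8.314  # gas constant, J/(mol·K)

# Linearize eta = A * exp(Ea / (R*T)) as ln(eta) = ln(A) + (Ea/R) * (1/T),
# then fit ln(eta) against 1/T by ordinary least squares.
x = [1.0 / (t + 273.15) for t in temps_c]
y = [math.log(v) for v in viscosity]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = ybar - slope * xbar

Ea = slope * R  # flow activation energy, J/mol

# Pearson correlation coefficient to check how well the Arrhenius model fits.
syy = sum((yi - ybar) ** 2 for yi in y)
r = sxy / math.sqrt(sxx * syy)

print(f"Ea = {Ea / 1000:.1f} kJ/mol, r = {r:.4f}")
```

A correlation coefficient close to 1 is what the abstract means by the data "fitting the Arrhenius equation well".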
Procedia PDF Downloads 313
699 Information Management Approach in the Prediction of Acute Appendicitis
Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki
Abstract:
This research presents a predictive data mining model for the accurate diagnosis of acute appendicitis, with the aims of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is a common disease that requires timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test or imaging examination accurately confirms the diagnosis in all cases. This contributes to increased morbidity and negative appendectomy rates. In this study, the authors propose to generate an accurate model for predicting patients with acute appendicitis which is based, firstly, on a segmentation technique combined with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques as well as on benchmark classification problems of osteoporosis, diabetes and heart disease obtained from the UCI repository and other data sources.
Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree
Procedia PDF Downloads 352
698 Commissioning of a Flattening Filter Free (FFF) using an Anisotropic Analytical Algorithm (AAA)
Authors: Safiqul Islam, Anamul Haque, Mohammad Amran Hossain
Abstract:
Aim: To compare the dosimetric parameters of flattened and flattening filter free (FFF) beams and to validate the beam data using the anisotropic analytical algorithm (AAA). Materials and Methods: All the dosimetric data (i.e. depth dose profiles, profile curves, output factors, penumbra, etc.) required for beam modeling in AAA were acquired using the Blue Phantom RFA for 6 MV, 6 FFF, 10 MV and 10 FFF. The Progressive Resolution Optimizer and Dose Volume Optimizer algorithms for VMAT and IMRT were also configured in the beam model. The AAA beam model was compared with the measured data sets. Results: Due to the higher low-energy component in the 6 FFF and 10 FFF beams, the surface doses are 10 to 15% higher compared to the flattened 6 MV and 10 MV beams. An FFF beam has a lower mean energy compared to the flattened beam; the beam quality indices were 0.667 (6 MV), 0.629 (6 FFF), 0.74 (10 MV) and 0.695 (10 FFF). Gamma evaluation with 2% dose and 2 mm distance criteria for the open beam, IMRT and VMAT plans was also performed, and good agreement was found between the modeled and measured data. Conclusion: We have successfully modeled the AAA algorithm for the flattened and FFF beams and achieved good agreement between the calculated and measured values.
Keywords: commissioning of a flattening filter free (FFF) beam, anisotropic analytical algorithm (AAA), flattened beam, parameters
Procedia PDF Downloads 301
697 Integrating and Evaluating Computational Thinking in an Undergraduate Marine Science Course
Authors: Dana Christensen
Abstract:
Undergraduate students, particularly in the environmental sciences, have difficulty displaying quantitative skills in their laboratory courses. Students spend time sampling in the field, often using new methods, and are expected to make sense of the data they collect. Computational thinking may be used to navigate these new experiences. We developed a curriculum for the marine science department at a small liberal arts college in the Northeastern United States based on previous computational thinking frameworks. This curriculum incorporates marine science data sets with specific objectives and topics selected by the faculty at the college. The curriculum was distributed to all students enrolled in introductory marine science classes as a mandatory module. Two pre-tests and post-tests will be used to quantitatively assess student progress on both content-based and computational principles. Student artifacts are being collected with each lesson, to be coded for content-specific and computational-specific items in a qualitative assessment. There is an overall gap in marine science education research, especially in curricula that focus on computational thinking and associated quantitative assessment. The curriculum itself, the assessments, and our results may be modified and applied to other environmental science courses due to the nature of the inquiry-based laboratory components that use quantitative skills to understand nature.
Keywords: marine science, computational thinking, curriculum assessment, quantitative skills
Procedia PDF Downloads 59
696 Fuzzy Nail Cream Formula Treatment with Basic Iranian Traditional Medicine
Authors: Elahe Najafizade, Ahmad Mohammad Alkhateeb, Seyed Ali Hossein Zahraei, Iman Dianat
Abstract:
Introduction: Hangnails are short, torn pieces of skin alongside the nails. At times they are very painful. The usual treatment advised is cutting the excess skin with clippers or scissors. To provide instant relief to patients, we describe a simpler and more effective way: using surgical glue to paste them back into their original position. Method: The cream should not be heated directly; it is prepared in a bain-marie. To achieve the desired emulsifier, 1 gram of borax was mixed into 10 grams of distilled water in a bain-marie until it melted; then oserin, beeswax, and oil were stirred in the bain-marie until melted. After that, 32 grams of distilled water was added little by little while stirring, and the borax dissolved in 10 grams of distilled water was gradually added. The bowl of cream was placed in a bowl of cold water and stirred until the cream was smooth. After that, gasoline, alcohol, or methylparaben preservatives were added. It should be noted that this amount of ingredients is enough for a 350-gram can (when the cream is prepared, the extract is also added). Result: The patient was a 40-year-old female with a hangnail problem who had used several different creams and Vaseline without success, but after this cream was applied, the hangnail started to heal within one week, and complete healing was achieved after two weeks. Conclusion: Traditional methods, modified and without the use of chemical substances, can work better and more safely, so research programs on them will be useful for less risky treatment procedures.
Keywords: nail, cream, formula, traditional medicine
Procedia PDF Downloads 114
695 Estimating Knowledge Flow Patterns of Business Method Patents with a Hidden Markov Model
Authors: Yoonjung An, Yongtae Park
Abstract:
Knowledge flows are a critical source of faster technological progress and stronger economic growth. Knowledge flows have been accelerated dramatically by the establishment of the patent system, in which each patent is required by law to disclose sufficient technical information for the invention to be recreated. Patent analysis has thus been widely used to investigate technological knowledge flows. However, the existing research is limited in terms of both subject and approach. In particular, most previous studies did not cover business method (BM) patents, although they are important drivers of knowledge flows like other patents. In addition, these studies usually focus on static analysis of knowledge flows. Some use approaches that incorporate the time dimension, yet they still fail to trace the truly dynamic process of knowledge flows. Therefore, we investigate dynamic patterns of knowledge flows driven by BM patents using a Hidden Markov Model (HMM). An HMM is a popular statistical tool for modeling a wide range of time series data, with no general theoretical limit with regard to statistical pattern classification. Accordingly, it enables characterizing knowledge patterns that may differ by patent, sector, country and so on. We run the model on sets of backward citations and forward citations to compare the patterns of knowledge utilization and knowledge dissemination.
Keywords: business method patents, dynamic pattern, Hidden Markov Model, knowledge flow
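As a rough sketch of how an HMM scores a citation time series, the forward algorithm below computes the likelihood of a discretized observation sequence under a fixed model. All probabilities, the two hidden states, and the low/medium/high citation binning are hypothetical illustrations, not parameters from the study.

```python
# Minimal forward algorithm for a 2-state HMM, sketching how yearly citation
# counts (binned into discrete symbols) could be scored. All numbers are
# hypothetical, chosen only so the rows sum to 1.
states = 2
pi = [0.6, 0.4]                         # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]            # state transition probabilities
B = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]  # emission probs over 3 citation levels

def forward_prob(obs):
    """Likelihood P(obs | model) via the forward recursion."""
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(states)]
    # Induction: alpha_t(j) = b_j(o_t) * sum_i alpha_{t-1}(i) * a_ij
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(states))
                 for j in range(states)]
    # Termination: sum over final states.
    return sum(alpha)

# e.g. five years of citation counts binned into low/medium/high = 0/1/2
seq = [0, 1, 2, 2, 1]
print(forward_prob(seq))
```

In practice the transition and emission matrices would be estimated from the citation data (e.g. with Baum-Welch) rather than fixed by hand as here.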
Procedia PDF Downloads 328
694 The Drama and Dynamics of Economic Shocks and Households Responses in Nigeria
Authors: Doki Naomi Onyeje, Doki Gowon Ama
Abstract:
The past four years have been traumatic for Nigerians, who have had to deal with a number of complex economic issues with dire consequences for the economy. Households have had to respond to some of these problems in peculiar ways, depending, of course, on the nature and character of a particular shock. The type, magnitude, intensity and duration of a particular shock may determine the different household responses. While household responses to the Global Financial Crisis and the COVID-19 pandemic have been documented by researchers, other economic shocks have continued to emerge in Nigeria. The dramatic turn of events since the new government took office on May 29th, 2023 has introduced a new economic twist that households will have to adjust to. This study, therefore, sets out to examine household responses by disaggregating them by their livelihood sources. A survey of 420 households across North Central Nigeria will be conducted to generate information on the respective responses. A multinomial logit regression analysis will be employed to test the hypothesis that livelihood source(s) influence household responses to economic shocks. Consequently, responses from public and private households will be examined. It is expected that household responses will show some similarities, but that some peculiar responses across groups will emerge, and these differences will guide group-specific interventions. The Theatre for Development (TfD) approach will be used to disseminate and propagate the results of this study to and among stakeholders for effective policy frameworks.
Keywords: drama, dynamics, economic shocks, household responses, Nigeria
Procedia PDF Downloads 75
693 Managing the Transition from Voluntary to Mandatory Climate Reporting: The Role of Carbon Accounting
Authors: Qingliang Tang
Abstract:
The transition from voluntary to mandatory carbon reporting (also referred to as climate reporting) poses serious challenges for accounting professionals aiming to support firms in achieving net-zero goals. The accounting literature addresses topics that currently bewilder accounting academics and professional accountants: how to make accounting a useful tool for management to achieve a carbon-neutral business model. This paper explores the evolving role of carbon accounting within corporate financial reporting systems, emphasizing its integration as a crucial component. Key challenges addressed include data availability, climate risk assessment, defining reporting boundaries, selecting appropriate greenhouse gas (GHG) accounting methodologies, and integrating climate-related events into traditional financial statements. A dynamic, integrated carbon accounting framework is proposed to facilitate this transformative process effectively. Furthermore, the paper identifies critical knowledge gaps and sets forth a research agenda aimed at enhancing transparency and relevance in carbon accounting and reporting systems, thereby empowering informed decision-making. The purpose of the paper is to succinctly capture the essence of carbon accounting practice in the transitional period, focusing on the challenges, proposed solutions, and future research directions in the realm of carbon accounting and mandatory climate reporting.
Keywords: mandatory carbon reporting, carbon management, net zero target, sustainability, climate risks
Procedia PDF Downloads 21
692 Multivariate Analytical Insights into Spatial and Temporal Variation in Water Quality of a Major Drinking Water Reservoir
Authors: Azadeh Golshan, Craig Evans, Phillip Geary, Abigail Morrow, Zoe Rogers, Marcel Maeder
Abstract:
Twenty-two physicochemical variables were determined in water samples collected weekly from January to December 2013 from three sampling stations located within a major drinking water reservoir. Classical Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) analysis was used to investigate the environmental factors associated with the physicochemical variability of the water samples at each of the sampling stations. Matrix-augmented MCR-ALS (MA-MCR-ALS) was also applied, and the two sets of results were compared for interpretative clarity. Links between these factors, reservoir inflows and catchment land uses were investigated and interpreted in relation to the chemical composition of the water and their resolved geographical distribution profiles. The results suggested that the major factors affecting reservoir water quality were those associated with agricultural runoff, with evidence of influence on algal photosynthesis within the water column. Water quality variability within the reservoir was also found to be strongly linked to physical parameters such as water temperature and the occurrence of thermal stratification. The two methods applied (MCR-ALS and MA-MCR-ALS) led to similar conclusions; however, MA-MCR-ALS appeared to provide results more amenable to the interpretation of temporal and geographical variation than those obtained through classical MCR-ALS.
Keywords: drinking water reservoir, multivariate analysis, physico-chemical parameters, water quality
Procedia PDF Downloads 291
691 Approach for Demonstrating Reliability Targets for Rail Transport during Low Mileage Accumulation in the Field: Methodology and Case Study
Authors: Nipun Manirajan, Heeralal Gargama, Sushil Guhe, Manoj Prabhakaran
Abstract:
In the railway industry, train sets are designed based on contractual requirements (the mission profile), where reliability targets are measured in terms of mean distance between failures (MDBF). However, at the beginning of revenue service, trains do not achieve the designed mission profile distance (mileage) within the expected timeframe due to infrastructure constraints, scarcity of commuters or other operational challenges, thereby departing from the original design inputs. Since the trains do not run sufficiently and do not achieve the designed mileage within the specified time, the car builder risks not achieving the contractual MDBF target. This paper proposes a constant-failure-rate-based model to deal with situations where mileage accumulation does not match the design mission profile. The model provides an appropriate MDBF target to be demonstrated based on the actual accumulated mileage. A case study of rolling stock running in the field is undertaken to analyze the failure data and the MDBF target demonstration during low mileage accumulation. The results of the case study show that, with the proposed method, reliability targets are achieved under low mileage accumulation.
Keywords: mean distance between failures, mileage-based reliability, reliability target appropriations, rolling stock reliability
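The constant-failure-rate logic behind such a demonstration can be sketched as follows: with failure rate λ = 1/MDBF, the number of failures over an accumulated mileage m is Poisson-distributed with mean m/MDBF, so the largest observed failure count still consistent with the target at a given one-sided confidence can be computed directly. This is a generic demonstration-test sketch, not the authors' exact model, and the kilometre figures and 70% confidence level are hypothetical.

```python
import math

def poisson_cdf(k, mu):
    """P(N <= k) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def max_allowable_failures(target_mdbf_km, accumulated_km, confidence=0.7):
    """Largest observed failure count that still demonstrates the MDBF target
    at the given one-sided confidence, assuming a constant failure rate.
    Returns -1 if even zero failures cannot demonstrate the target yet,
    i.e. the accumulated mileage is too low for any demonstration."""
    mu = accumulated_km / target_mdbf_km  # expected failures at the target rate
    k = -1
    # Accept k failures while observing that few is improbable enough
    # (P(N <= k) <= 1 - confidence) under the target failure rate.
    while poisson_cdf(k + 1, mu) <= 1 - confidence:
        k += 1
    return k

# Hypothetical fleet: 150,000 km accumulated against a 50,000 km MDBF target.
print(max_allowable_failures(50_000, 150_000))
```

The `-1` case makes the paper's point concrete: below a minimum accumulated mileage (about 1.2 times the target MDBF at 70% confidence), the contractual target cannot be demonstrated at all, which is why a mileage-adjusted target is needed.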
Procedia PDF Downloads 267
690 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —
Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno
Abstract:
STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered as a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for further development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
Keywords: rule induction, decision table, missing data, noise
Procedia PDF Downloads 396
689 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning
Authors: Tanvi P. Patel, Warish D. Patel
Abstract:
Association rule mining forms the core of data mining and is regarded as one of its best-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is used to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, finding frequent patterns and itemset mining, an efficient itemset tree structure named the Memory Efficient Itemset Tree (MEIT) can be generated. The memory-efficient IT is efficient for storing itemsets, but takes more time compared to the traditional IT. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree using levelwise pruning. First, pre-pruning of items based on a minimum support count is carried out, followed by itemset tree reconstruction. By keeping only maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced approach to the memory-efficient IT proposed here helps to optimize main memory overhead as well as reduce processing time.
Keywords: association rule mining, itemset mining, itemset tree, MEIT, maximal frequent pattern
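The core idea the abstract describes, a levelwise search that keeps only maximal frequent itemsets, can be sketched as follows. This is a plain Apriori-style implementation over an in-memory transaction list, not the MEIT tree structure itself, and the toy transactions are made up for illustration.

```python
def maximal_frequent_itemsets(transactions, min_support):
    """Levelwise (Apriori-style) search returning maximal frequent itemsets:
    frequent itemsets that have no frequent proper superset."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    # Level 1: pre-prune single items below the minimum support count.
    items = {i for t in transactions for i in t}
    level = [frozenset([i]) for i in items
             if support(frozenset([i])) >= min_support]
    frequent = set(level)
    k = 2
    while level:
        # Generate size-k candidates by joining frequent sets of size k-1;
        # any candidate with an infrequent subset cannot survive the count.
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        level = [c for c in candidates if support(c) >= min_support]
        frequent.update(level)
        k += 1
    # Keep only maximal sets (no frequent proper superset exists).
    return {s for s in frequent if not any(s < t for t in frequent)}

txns = [frozenset("abc"), frozenset("abd"), frozenset("ab"), frozenset("cd")]
print(maximal_frequent_itemsets(txns, 2))
```

On the toy data, {a, b} is frequent and maximal, so the singletons {a} and {b} are pruned from the output, which is exactly how keeping only maximal itemsets shrinks the number of reported patterns.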
Procedia PDF Downloads 372
688 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table
Authors: David A. Swanson, Lucky M. Tedrow
Abstract:
Taylor's Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to this research by showing that Taylor's Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to the mean age at death in a life table) and the variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of the variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of the variance in age at death for six countries, three of which have high e0 values and three of which have lower e0 values. The paper provides a substantive interpretation of Taylor's Law relative to e0 and concludes by arguing that reasonably accurate estimates of the variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population
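A simple linear model of the kind described, predicting variance in age at death from e0, can be sketched with ordinary least squares. The (e0, variance) pairs below are hypothetical placeholders, not the World Bank or Human Mortality Database values used in the paper; they only illustrate the typical negative slope (variance in age at death shrinks as e0 rises, reflecting the compression of mortality).

```python
# Hypothetical (e0, variance-in-age-at-death) pairs spanning roughly the
# e0 range quoted in the abstract (60.9 to 85.59 years).
pairs = [(60.9, 480.0), (68.0, 380.0), (74.5, 290.0),
         (80.2, 210.0), (85.6, 150.0)]

# Ordinary least-squares fit: variance = intercept + slope * e0.
n = len(pairs)
mx = sum(e for e, _ in pairs) / n
my = sum(v for _, v in pairs) / n
slope = (sum((e - mx) * (v - my) for e, v in pairs)
         / sum((e - mx) ** 2 for e, _ in pairs))
intercept = my - slope * mx

def predict_variance(e0):
    """Estimate variance in age at death from life expectancy at birth."""
    return intercept + slope * e0

print(f"slope={slope:.2f}, intercept={intercept:.1f}, "
      f"var(e0=77)={predict_variance(77.0):.1f}")
```

The useful feature the paper notes carries over directly: `predict_variance` needs only an e0 value, so it works even where e0 is estimated rather than derived from a full life table.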
Procedia PDF Downloads 330
687 Unveiling Electrical Treeing Mechanisms in Epoxy Resin Insulation Degradation
Authors: Chien-Kuo Chang, You-Syuan Wu, Min-Chiu Wu, Bharath-Kumar Boyanapalli
Abstract:
The electrical treeing mechanism in epoxy resin insulation is a critical area of study concerning the degradation of high-voltage electrical equipment. In this study, we conducted pressure-induced degradation experiments on epoxy resin specimens using a needle-plane electrode structure to simulate electrical treeing. The specimens featured two different defect spacings, allowing for detailed observation facilitated by time-lapse photography. Our investigation revealed four distinct stages of insulation degradation: initial dark tree growth, filamentary tree growth, reverse tree growth, and eventual insulation breakdown. The initial dark treeing stage, though shortest in duration, exhibited a thicker main branch and shorter branching, ceasing upon the appearance of filamentary treeing. Filamentary treeing manifested in two forms: dark filamentary treeing during the resin's glassy state, characterized by branching structures, and fuzzy filamentary treeing during the rubbery state, resembling white feathers. The channels formed by filamentary treeing were observed to be as narrow as a few micrometers and continued to grow until the end of the experiment. Additionally, the transition to reverse treeing occurred when filamentary treeing reached the ground electrode, with the earliest manifestation being growth from the ground electrode towards the high-voltage end.
Keywords: epoxy resin insulation, high-voltage equipment, electrical treeing mechanism
Procedia PDF Downloads 77
686 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response
Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul
Abstract:
The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels. However, the BNA technique is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach has been proposed for processing the BN signal with fast Fourier transforms, while wavelet transforms have been used to remove noise from the BN signal, with a judicious choice of the 'threshold' value, when the micro-magnetic response of the work material is poor. In the present study, the effect of the down feed induced by conventional plunge surface grinding of hardened bearing steel has been investigated, along with an ultrasonically cleaned, wet-polished sample and a sample ground with the spark-out technique for benchmarking. Moreover, the FFT analysis has been carried out at different sets of applied voltages and applied frequencies, and the pattern of the BN signal in the frequency domain is analyzed. The study also applies the wavelet transform technique with different levels of decomposition and different mother wavelets to reduce the noise in the BN signal of materials with a poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
Keywords: Barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response
Procedia PDF Downloads 668
685 Second Order Optimality Conditions in Nonsmooth Analysis on Riemannian Manifolds
Authors: Seyedehsomayeh Hosseini
Abstract:
Much attention has been paid over the centuries to understanding and solving the problem of minimization of functions. Compared to linear programming and nonlinear unconstrained optimization problems, nonlinear constrained optimization problems are much more difficult. Since the procedure of finding an optimizer is a search based on local information about the constraints and the objective function, it is very important to develop techniques using the geometric properties of the constraints and the objective function. In fact, differential geometry provides a powerful tool to characterize and analyze these geometric properties. Thus, there is clearly a link between the techniques of optimization on manifolds and standard constrained optimization approaches. Furthermore, there are manifolds that are not defined as constrained sets in R^n; an important example is the Grassmann manifold. Hence, to solve optimization problems on these spaces, intrinsic methods are used. In a nondifferentiable problem, the gradient information of the objective function generally cannot be used to determine the direction in which the function is decreasing. Therefore, techniques of nonsmooth analysis are needed to deal with such a problem. As a manifold in general does not have a linear structure, the usual techniques, which are often used in nonsmooth analysis on linear spaces, cannot be applied and new techniques need to be developed. This paper presents necessary and sufficient conditions for a strict local minimum of extended real-valued, nonsmooth functions defined on Riemannian manifolds.
Keywords: Riemannian manifolds, nonsmooth optimization, lower semicontinuous functions, subdifferential
Procedia PDF Downloads 361
684 Analysis of Slope in an Excavated Gneiss Rock Using Geological Strength Index (GSI) in Ilorin, Kwara State, Nigeria
Authors: S. A. Agbalajobi, W. A. Bello
Abstract:
This study analyzed slope stability in an excavated gneiss rock using the geological strength index (GSI) in Ilorin, Kwara State, Nigeria. A kinematic analysis of planar discontinuity sets in a gneiss deposit was carried out to ascertain the degree of slope stability. Discontinuity orientations in the rock mass were mapped using compass clinometers. The average results of physical and mechanical properties such as specific gravity, unit weight, uniaxial compressive strength, point load index, and Schmidt rebound value are 2.64 g/cm3, 25.95 kN/m3, 156 MPa, 6.5 MPa, and 53.12, respectively. Also, a statistical model equation relating the rock strength was developed. The analyses indicate that the rock face is susceptible to wedge failure, as all the geometrical conditions associated with the occurrence of such failures were observed. It can be concluded that analysis of discontinuity orientation in relation to cut face direction in rock excavation is essential for mine planning to forestall mine accidents. Assessment of excavation methods showed that one method (blasting and/or the use of a hydraulic hammer) is applicable for the given rock strength; since the ease of excavation decreases as the rock mass quality increases, blasting is most suitable for such operations.
Keywords: slope stability, wedge failure, geological strength index (GSI), discontinuities and excavated slope
Procedia PDF Downloads 521
683 Rethinking the Concept of Classroom Management during COVID-19 Times: An EFL Perspective
Authors: Hadjer Chellia
Abstract:
In light of the recent global pandemic, different issues in educational research seem to invite careful consideration. Following this perspective, this study sets out to question the concept of classroom management in an EFL higher education context during COVID-19. In order to gain an in-depth understanding of their experiences, six EFL teachers from different Algerian universities took part in semi-structured interviews. The main emerging themes revealed that EFL teachers adopted different pedagogical practices in relation to classroom management during the global crisis than in normal times. In relation to flexible education theory, the teachers' experiences suggest flexible classroom management during COVID-19: flexibility in teaching methods, approach and design; flexibility in time; flexibility in space and pace (speed); flexibility in assessment modes; and flexibility in coping with students' well-being. This awareness of flexibility helps them to develop readiness for the future, mainly in terms of maintaining an appropriate pedagogy to face future crises. In terms of theoretical concepts, working on classroom management under unusual circumstances in relation to flexible education led to the concepts of flexible classroom management (FCM) and virtual classroom management (VCM). It is therefore important for educators and researchers to rethink different pedagogical concepts and to apply them carefully in unusual times.
Keywords: COVID-19, EFL educators, flexible classroom management, flexible education, virtual classroom management
Procedia PDF Downloads 164
682 Investigation of the Effects of Processing Parameters on PLA Based 3D Printed Tensile Samples
Authors: Saifullah Karimullah
Abstract:
Additive manufacturing techniques are becoming more common with the latest technological advancements. They are poised to bring a revolution in the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM) based 3D printing is one of those promising technologies that have revolutionized prototyping processes. The purpose of this design and study project is to design a customized laboratory-scale FDM-based 3D printer from locally available sources. The primary goal is to design and fabricate the FDM-based 3D printer. After fabrication, a tensile test specimen is designed in SolidWorks or Creo computer-aided design (CAD) software. An .stl file of the tensile test specimen is generated through slicing software, and the G-codes are sent via a computer for the test specimen to be printed. Different parameters were studied, such as printing speed, layer thickness and infill density of the printed object. Some parameters were kept constant, such as temperature, extrusion rate, raster orientation, etc. Different tensile test specimens were printed for different sets of parameters of the FDM-based 3D printer. The tensile test specimens were subjected to tensile tests using a universal testing machine (UTM). Design-Expert software was used for the analyses, and different results were obtained from the different tensile test specimens. The best, average and worst specimens were also observed under a compound microscope to investigate the bonding between layers.
Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine
Procedia PDF Downloads 104681 Computational Studies of the Reactivity Descriptors and the Optoelectronic Properties on the Efficiency Free-Base- and Zn-Porphyrin-Sensitized Solar Cells
Authors: Soraya Abtouche, Zeyneb Ghoualem, Syrine Daoudi, Lina Ouldmohamed, Xavier Assfeld
Abstract:
This work reports density functional theory calculations of the optimized geometries, molecular reactivity, energy gaps, and thermodynamic properties of the free base (H2P) and its Zn(II)-metallated counterpart (ZnP), bearing one, two, or three carboxylic acid groups, using the hybrid functionals B3LYP, CAM-B3LYP, and wB97XD with 6-31G(d,p) basis sets. When donating groups are attached to the molecular dye, the bond lengths decrease slightly, which is important for the easy transfer of an electron from the donating to the accepting group. For all dyes, the highest occupied molecular orbital/lowest unoccupied molecular orbital analysis yields positive outcomes for electron injection into the semiconductor and subsequent dye regeneration by the electrolyte. The ionization potential increases with increasing conjugation; therefore, the dye attached to one carboxylic acid group has the highest ionization potential. The results show higher efficiencies for the cells sensitized with ZnP. These results are explained by taking into account, on one hand, the electronic character of the metal ion, which acts as a mediator in the injection step, and, on the other hand, the number of anchoring groups through which the dye binds to the surface of TiO2. Keywords: DSSC, porphyrin, TD-DFT, electronic properties, donor-acceptor groups
Procedia PDF Downloads 79680 Retrospective Reconstruction of Time Series Data for Integrated Waste Management
Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy
Abstract:
The development, operation, and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability of every region, and the features of such systems have great influence on all components of sustainability. In order to optimize processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a certain IWMS is not in the scope of the present research. The complexity of these systems and the large number of variables require a complex approach to model the outcomes and future risks, one able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, with all their interconnections. The other is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop such time series by content analysis. Keywords: content analysis, factors, integrated waste management system, time series
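The FCM iteration underlying such a model can be sketched in a few lines of Python. This is a minimal, illustrative implementation, not the authors' code: the connection matrix, the factor names, and the sigmoid steepness are invented for illustration. Each concept's next activation is a squashed weighted sum of the current activations, iterated until the map settles.

```python
import math

def fcm_step(state, W, lam=1.0):
    """One Fuzzy Cognitive Map update: each concept's next activation is a
    sigmoid-squashed weighted sum of its own state and its neighbours'."""
    n = len(state)
    nxt = []
    for j in range(n):
        s = state[j] + sum(state[i] * W[i][j] for i in range(n) if i != j)
        nxt.append(1.0 / (1.0 + math.exp(-lam * s)))  # sigmoid keeps values in (0, 1)
    return nxt

def fcm_run(state, W, steps=50, tol=1e-6):
    """Iterate the map until it settles into a fixed point (or steps run out)."""
    for _ in range(steps):
        nxt = fcm_step(state, W)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Toy connection matrix for three hypothetical IWMS factors
# (e.g. collection coverage, treatment capacity, public acceptance).
W = [[0.0, 0.6, 0.3],
     [0.0, 0.0, 0.5],
     [-0.4, 0.0, 0.0]]
final = fcm_run([0.5, 0.5, 0.5], W)
```

A time series reconstructed by content analysis would supply the weights in W and the initial activations; the iteration then projects how the factors co-evolve.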
Procedia PDF Downloads 329679 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face the analysis of very large data sets that are growing at a noticeable rate with today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose, and the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open-source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN: four nodes connected through a fast switch and Gigabit Ethernet cards. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. MapReduce divides a task into smaller tasks that are assigned to the network nodes; it then collects the results and forms the final result data set. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability, and cluster scalability. Keywords: big data platforms, Cloudera Manager, Hadoop, MapReduce
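The MapReduce flow described above — map on each node, shuffle intermediate pairs by key, reduce into the final data set — can be sketched with a toy word count in Python. This is an illustration of the algorithm, not the SLBD cluster code; the splits and helper names are invented.

```python
from collections import defaultdict
from itertools import chain

def map_phase(chunk):
    """Map: emit (word, 1) for every word in one input split."""
    return [(word.lower(), 1) for word in chunk.split()]

def shuffle(pairs):
    """Shuffle: group intermediate pairs by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values into the final result."""
    return {key: sum(values) for key, values in groups.items()}

# Each "node" maps one split; the results are shuffled and reduced centrally.
splits = ["big data big cluster", "data cluster data"]
mapped = chain.from_iterable(map_phase(s) for s in splits)
counts = reduce_phase(shuffle(mapped))
# counts == {'big': 2, 'data': 3, 'cluster': 2}
```

In Hadoop the map calls run in parallel on the cluster nodes and the shuffle moves data over the network; the logic per record is the same.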
Procedia PDF Downloads 361678 Student's Perception on the Relationship between Teacher's Supportive Teaching, Thwarting Teaching, Their Needed Satisfaction, Frustration, and Motivational Regulation at Vocational High School
Authors: Chi C. Lin, Chih. H. Hsieh, Chi H. Lin
Abstract:
The present study attempted to develop and test a self-determination theory dual-process model linking teachers' need-supportive teaching, need-thwarting teaching, and students' need satisfaction, need frustration, and motivational regulation among vocational high school learners. This study adopted a survey questionnaire method. Participants were 736 (472 male, 264 female) vocational high school students in Taiwan. The instrument included five scales: the Teachers' Need-Supportive Teaching Scale, the Teachers' Need-Thwarting Teaching Scale, the Need Satisfaction Scale, the Need Frustration Scale, and the Motivational Regulation Scale. Structural equation modeling was used for the data analyses. The results indicated that (1) teachers' need-supportive teaching had direct effects on students' need satisfaction; (2) teachers' need-thwarting teaching had a direct effect on students' need frustration; (3) teachers' need-supportive teaching had a negative direct effect on students' need frustration; (4) students' need satisfaction had direct effects on their autonomous motivation and controlled motivation, respectively; (5) students' need frustration also had direct effects on their controlled motivation and amotivation, respectively; (6) the model proposed in this study largely fit the empirical data. Keywords: motivational regulation, need satisfaction, need frustration, supportive teaching, thwarting teaching, vocational high school students
Procedia PDF Downloads 136677 Culture as an Intervening Variable While Assessing Japanese Influence on Vietnam: 1991-2018
Authors: Teresa Mili
Abstract:
The significance of political and economic factors is rarely neglected when assessing bilateral relations, but the significance of culture as a soft power in Japan-Vietnam relations has largely been understated. While the close ties date back to the 14th century, this paper adopts an inductive lens to analyze the role of culture as a variable in the bilateral relations. Vietnam, which then had a history of war devastation, had taken refuge in Japan and later sought inspiration from Japan's economy, with a simultaneous influence of culture, since Japan was a developed nation and Vietnam a third-world country. Evidencing facts with illustrations, the paper shows how the twenty-first century has brought a growing bond and the onset of stronger ties between the two states, based primarily on an emerging convergence of interests and culture. The cultural influence of Japan can be seen throughout Vietnamese cities, through evidence such as the growing number of Japanese items on sale. The variety of this cultural influence may be seen in the acceptance of Japanese fashion trends, manga comics, pop music, cuisine, tourism, Japanese studies and language, and the translations of Japanese literature, which are very popular in Vietnam. Using secondary sources as well as travel accounts and official websites, this research work tries to find out how much Japanese culture has influenced Vietnam and whether such influences are strong enough to qualify culture as an intervening variable in the bilateral relations. Keywords: influence, culture, language, cold war
Procedia PDF Downloads 164676 Brain Tumor Segmentation Based on Minimum Spanning Tree
Authors: Simeon Mayala, Ida Herdlevær, Jonas Bull Haugsøen, Shamundeeswari Anandan, Sonia Gavasso, Morten Brun
Abstract:
In this paper, we propose a minimum spanning tree-based method for segmenting brain tumors. The proposed method performs interactive segmentation based on the minimum spanning tree without tuning parameters. The steps involve preprocessing, building a graph, constructing a minimum spanning tree, and a newly implemented way of interactively segmenting the region of interest. In the preprocessing step, a Gaussian filter is applied to the 2D images to remove noise. Then, the pixel neighbor graph is weighted by intensity differences and the corresponding minimum spanning tree is constructed. The image is loaded in an interactive window for segmenting the tumor. The region of interest and the background are selected by clicking, which splits the minimum spanning tree into two trees: one representing the region of interest and the other the background. Finally, the segmentation given by the two trees is visualized. The proposed method was tested by segmenting two different 2D brain T1-weighted magnetic resonance image data sets. The comparison between our results and the gold standard segmentation confirmed the validity of the minimum spanning tree approach. The proposed method is simple to implement, and the results indicate that it is accurate and efficient. Keywords: brain tumor, brain tumor segmentation, minimum spanning tree, segmentation, image processing
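The pipeline above — intensity-weighted pixel graph, minimum spanning tree, split into two trees from two seed clicks — can be sketched in pure Python. This is an illustrative reconstruction on a toy 1D row of pixels, not the authors' implementation; splitting at the heaviest edge on the seed-to-seed path is one natural way to realize the two-tree split the abstract describes.

```python
def minimum_spanning_tree(n_nodes, edges):
    """Kruskal's algorithm with union-find; edges are (weight, u, v) tuples."""
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

def segment(n_nodes, mst, seed_a, seed_b):
    """Split the MST into two trees by deleting the heaviest edge on the
    unique path between the two seeds, then label nodes by their tree."""
    adj = {i: [] for i in range(n_nodes)}
    for w, u, v in mst:
        adj[u].append((w, v)); adj[v].append((w, u))
    # walk the tree from seed_a, remembering each node's parent edge
    stack, prev = [seed_a], {seed_a: None}
    while stack:
        node = stack.pop()
        for w, nxt in adj[node]:
            if nxt not in prev:
                prev[nxt] = (node, w)
                stack.append(nxt)
    # collect the path seed_b -> seed_a and cut its heaviest edge
    path, node = [], seed_b
    while prev[node] is not None:
        par, w = prev[node]
        path.append((w, par, node))
        node = par
    cut = max(path)
    adj[cut[1]] = [(w, v) for w, v in adj[cut[1]] if v != cut[2]]
    adj[cut[2]] = [(w, v) for w, v in adj[cut[2]] if v != cut[1]]
    # flood-fill from seed_a: True = region of interest, False = background
    label, stack = [False] * n_nodes, [seed_a]
    while stack:
        node = stack.pop()
        if not label[node]:
            label[node] = True
            stack.extend(v for _, v in adj[node])
    return label

# Toy 1D "image": a bright region next to a dark one.
pixels = [10, 12, 11, 90, 92, 91]
edges = [(abs(pixels[i] - pixels[i + 1]), i, i + 1) for i in range(len(pixels) - 1)]
mst = minimum_spanning_tree(len(pixels), edges)
labels = segment(len(pixels), mst, seed_a=0, seed_b=5)
# labels -> [True, True, True, False, False, False]
```

For a 2D image the only change is that each pixel is connected to its 4- or 8-neighbours; the tree construction and the split are identical.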
Procedia PDF Downloads 122675 Pricing, Production and Inventory Policies Manufacturing under Stochastic Demand and Continuous Prices
Authors: Masoud Rabbani, Majede Smizadeh, Hamed Farrokhi-Asl
Abstract:
We study jointly determining prices and production over a multiple-period horizon under general non-stationary stochastic demand with continuous prices. In some periods, production capacity must be increased to satisfy demand. This paper presents a model to aid multi-period production capacity planning by quantifying the trade-off between product quality and production cost. Product quality is estimated as the statistical variation from the target performance obtained from the output tolerances of the production machines that manufacture the components; we consider different tolerances for the different machines used to increase capacity. Production cost is estimated as the total cost of owning and operating a production facility during the planning horizon, so capacity planning carries a cost that affects price. Pricing products is often difficult because customers have a reservation price they are willing to pay, which affects both price and demand. We therefore determine prices and production for the periods after the capacity expansion, taking the reservation price into account. First, we use an algorithm based on fuzzy sets of the optimal objective function values to plan capacity, maximizing the interval from the upper bound of the minimized objectives and defining weights for the objectives. Then, we determine the inventory and pricing policies; a lemma allows the problem to be solved in MATLAB and an exact answer to be found. Keywords: price policy, inventory policy, capacity planning, product quality, epsilon-constraint
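The epsilon-constraint idea named in the keywords — optimize one objective while the other is held to a threshold — can be illustrated on a toy capacity-planning choice. The plans, costs, and quality scores below are invented for illustration and are not from the paper's model.

```python
def epsilon_constraint(candidates, eps):
    """Bi-objective epsilon-constraint: minimise cost among the plans whose
    quality objective meets the epsilon threshold; None if none qualifies."""
    feasible = [c for c in candidates if c["quality"] >= eps]
    return min(feasible, key=lambda c: c["cost"]) if feasible else None

# Hypothetical capacity plans: cost of the expansion vs. resulting quality
# (quality falling with looser machine tolerances, cost rising with capacity).
plans = [
    {"name": "keep capacity",  "cost": 100, "quality": 0.70},
    {"name": "add machine M1", "cost": 140, "quality": 0.85},
    {"name": "add machine M2", "cost": 180, "quality": 0.92},
]
best = epsilon_constraint(plans, eps=0.80)   # -> "add machine M1"
```

Sweeping eps over a grid of quality levels traces out the cost/quality trade-off frontier that the capacity planner weighs against its effect on price.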
Procedia PDF Downloads 569674 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
In multi-echelon supply chain networks, optimal supplier selection depends significantly on the accuracy of suppliers' performance prediction. Different multi-criteria decision-making methods such as ANN, GA, fuzzy logic, and AHP have previously been used to predict supplier performance, but the 'black-box' characteristic of these methods remains a major concern to be resolved. Therefore, the primary objective of this paper is to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of an ANN. A full factorial design with a 95% confidence interval is initially applied to determine the appropriate set of criteria for supplier performance evaluation. A train-test approach is then utilized for the ANN and GEP separately. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method based on the root mean square error (RMSE) and the coefficient of determination (R²). The results of a case study conducted at Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that, in comparison with the ANN, gene expression programming has a significant advantage in predicting supplier performance, judging by the respective RMSE and R-squared values. Moreover, using GEP, a mathematical function was also derived, resolving the issue of the ANN's black-box structure in modeling the performance prediction. Keywords: supplier performance prediction, ANN, GEP, automotive, SAPCO
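The two accuracy measures used for the comparison are standard and easy to state explicitly. A minimal sketch follows; the supplier scores and predictions are hypothetical numbers, not SAPCO data.

```python
import math

def rmse(actual, predicted):
    """Root mean square error: average magnitude of the prediction errors."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def r_squared(actual, predicted):
    """Coefficient of determination: share of variance explained by the model."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

# Hypothetical supplier performance scores vs. one model's test-set predictions.
actual = [7.0, 5.5, 8.2, 6.1, 9.0]
predicted = [6.8, 5.9, 8.0, 6.5, 8.7]
# rmse ~ 0.313, r_squared ~ 0.941 for these numbers
```

Comparing the two models amounts to computing both measures on the same held-out test set: lower RMSE and higher R² indicate the better predictor.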
Procedia PDF Downloads 421673 Land Cover Remote Sensing Classification Advanced Neural Networks Supervised Learning
Authors: Eiman Kattan
Abstract:
This study aims to evaluate the impact of classifying labelled remote sensing images with a convolutional neural network (CNN) architecture, namely AlexNet, on different land cover scenarios based on two remotely sensed datasets, from the points of view of both computational time and performance. A set of experiments was conducted to assess the effectiveness of the selected convolutional neural network using two implementation approaches, named fully trained and fine-tuned. For validation purposes, two publicly available remote sensing datasets with different land cover features, AID and RSSCN7, were used in the experiments. These datasets have a wide diversity of input data, number of classes, amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (Nvidia DIGITS) was employed in the experiments and proved efficient in training, validation, and testing. As a result, the fully trained approach achieved modest results on the two data sets, AID and RSSCN7: 73.346% and 71.857% within 24 min 1 sec and 8 min 3 sec, respectively. However, a dramatic improvement of the classification performance using the fine-tuning approach was recorded: 92.5% and 91%, within 24 min 44 sec and 8 min 41 sec, respectively. This conclusion opens opportunities for better classification performance in various applications such as agriculture and crop remote sensing. Keywords: convolutional neural network, remote sensing, land cover, land use
Procedia PDF Downloads 372672 Development of Fault Diagnosis Technology for Power System Based on Smart Meter
Authors: Chih-Chieh Yang, Chung-Neng Huang
Abstract:
In power systems, improving the fault diagnosis technology for transmission lines has always been a primary goal of grid operators. In recent years, with the rise of green energy, the addition of various kinds of distributed generation has also affected the stability of the power system. Smart meters provide data recording and bidirectional transmission, while the adaptive neuro-fuzzy inference system (ANFIS) offers the learning and estimation capabilities of artificial intelligence. For the transmission network, in order to avoid misjudging the fault type and location due to the input of these unstable power sources, this study combines the above advantages of smart meters and ANFIS and proposes a method for identifying fault types and locations. In ANFIS training, the bus voltage and current information collected by smart meters is used to train the ANFIS tool in MATLAB to generate fault codes that identify different types of faults and their locations. In addition, given the uncertainty of distributed generation, a wind power system is added to the transmission network to verify the diagnostic correctness of the study. Simulation results show that the proposed method can correctly and efficiently identify the fault type and location, and can handle the interference caused by the addition of unstable power sources. Keywords: ANFIS, fault diagnosis, power system, smart meter
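To make the notion of a fault code concrete, the sketch below maps per-phase measurements to a fault label with simple thresholds. This is a heavily simplified, hypothetical stand-in for what the trained ANFIS learns from meter data — the thresholds, the per-unit values, and the rule itself are invented for illustration, and a real classifier would be learned, not hand-coded.

```python
def fault_code(ia, ib, ic, i_ground, overcurrent=2.0):
    """Toy classifier: map per-phase current magnitudes (per-unit) and
    ground current to a fault label via fixed thresholds."""
    faulted = [name for name, mag in (("A", ia), ("B", ib), ("C", ic))
               if mag > overcurrent]
    grounded = i_ground > 0.1
    if not faulted:
        return "no fault"
    if len(faulted) == 1:
        return f"single line-to-ground ({faulted[0]})" if grounded else "anomaly"
    if len(faulted) == 2:
        base = "-".join(faulted)
        return f"double line-to-ground ({base})" if grounded else f"line-to-line ({base})"
    return "three-phase fault"

# Phase A current spikes and ground current flows: an A-to-ground fault.
label = fault_code(3.5, 1.0, 1.0, 0.8)   # "single line-to-ground (A)"
```

In the proposed method, ANFIS replaces these crisp thresholds with learned fuzzy membership functions over the smart-meter voltages and currents, which is what lets it tolerate the fluctuating inputs from distributed generation.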
Procedia PDF Downloads 140671 The Impact of Motivation, Trust, and National Cultural Differences on Knowledge Sharing within the Context of Electronic Mail
Authors: Said Abdullah Al Saifi
Abstract:
The goal of this research is to examine the impact of trust, motivation, and national culture on knowledge sharing within the context of electronic mail. The study is quantitative and survey-based. To conduct the research, 200 students from a leading university in New Zealand were chosen randomly to participate in a questionnaire survey. Motivation and trust were found to be significantly and positively related to knowledge sharing. The findings illustrated that face saving, face gaining, and individualism positively moderate the relationship between motivation and knowledge sharing, while a collectivist culture negatively moderates that relationship. Moreover, the findings reveal that face saving, individualism, and collectivism positively moderate the relationship between trust and knowledge sharing, whereas a face-gaining culture negatively moderates it. This study sets out several implications for researchers and practitioners, producing an integrative model that shows how attributes of national culture impact knowledge sharing through the use of emails. A better understanding of the relationship between knowledge sharing and trust, motivation, and national culture differences will increase individuals' ability to make wise choices when sharing knowledge with those from different cultures. Keywords: knowledge sharing, motivation, national culture, trust
Procedia PDF Downloads 348