Search results for: fuzzy set methods
15218 Teaching Academic Vocabulary: A Recent and Old Approach
Authors: Sara Fine-Meltzer
Abstract:
An obvious but ill-addressed hindrance to reading comprehension in academic English is poor vocabulary. Unfortunately, dealing with the problem is usually delayed until university entrance. It is the contention of this paper that the task should be confronted much earlier, and by using a very old-fashioned method. The presentation is accompanied by vocabulary lists for advanced-level university students, with explanations of their content and a justification for the 500-word lists: how they change over time in accordance with evolving styles of academic writing. There are also sample quizzes and methods to ensure that the words are “absorbed” over time, a discussion of other vocabulary acquisition methods, and conclusions drawn from the drawbacks of those methods. The paper concludes with the rationale for beginning the study of “academic” vocabulary earlier than is generally accepted.
Keywords: academic vocabulary, old-fashioned methods, quizzes, vocabulary lists
Procedia PDF Downloads 122
15217 Reading Literacy and Methods of Improving Reading
Authors: Iva Košek Bartošová, Andrea Jokešová, Eva Kozlová, Helena Matějová
Abstract:
The paper presents the results of a research team from the Faculty of Education, University of Hradec Králové in the Czech Republic. It introduces the reading methods most widely used in the first classes of primary school and presents the results of a pilot study on the mastery of reading technique and the quality of reading comprehension of pupils in the first half of a school year, during reading instruction by an analytic-synthetic method and by a genetic method. These two methods of practising reading skills are the most widely used in the Czech Republic. During the school year 2015/16, two groups of first-year pupils were measured, and quantitative and qualitative parameters of the pupils’ reading output were monitored by several methods. The two methods rest on different theoretical bases, and each has a specific educational and methodical procedure. This contribution presents results from the pilot phase and draws preliminary conclusions, which will be verified in subsequent, broader research at the end of the first school year.
Keywords: analytic-synthetic method of reading, genetic method of reading, reading comprehension, reading literacy, reading methods, reading speed
Procedia PDF Downloads 259
15216 Parallel Multisplitting Methods for Differential Systems
Authors: Malika El Kyal, Ahmed Machmoum
Abstract:
We prove the superlinear convergence of asynchronous multisplitting methods applied to differential equations. The study is based on the technique of nested sets, which allows us to specify the kind of convergence obtained in the asynchronous mode. The main characteristic of an asynchronous mode is that the local algorithm does not have to wait at predetermined points for messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms, in the computer science sense, are particular cases of our formulation of asynchronous ones.
Keywords: parallel methods, asynchronous mode, multisplitting, ODE
Procedia PDF Downloads 526
15215 Evaluation of Wheat Sowing and Fertilizer Application Methods in Wheat Weeds Management
Authors: Ebrahim Izadi-Darbandi
Abstract:
In order to investigate the effects of sowing methods and of nitrogen and phosphorus application methods on wheat weed management, an experiment was performed as a split plot based on a randomized complete block design with three replications at the Research Farm, Faculty of Agriculture, Ferdowsi University of Mashhad, in 2010. Treatments included wheat sowing methods (single-row with 30 cm spacing, and twin-row on 50 cm wide ridges) as main plots, and nitrogen and phosphorus application methods (broadcast and band) as subplots. The phosphorus and nitrogen sources were triple superphosphate (150 kg ha-1), applied before wheat sowing and incorporated into the soil, and urea (200 kg ha-1), applied in two phases (50% pre-plant and 50% near wheat shooting). Results showed that the effects of fertilizer application methods and wheat sowing methods on increasing wheat yield and reducing weed-wheat competition were significant (p≤0.01). Twin-row sowing reduced weed biomass by 25% compared with single-row sowing and increased wheat seed yield and biomass by 60% and 30%, respectively. Band application of phosphorus and nitrogen reduced weed biomass by 46% and 53%, respectively, and increased wheat seed yield by 22% and 33% compared with broadcast application. The interaction of sowing method with phosphorus and nitrogen application methods showed that band fertilizer application combined with twin-row sowing was the best combination for improving wheat yield and reducing wheat-weed competition. These results show that modifying fertilization and sowing methods can play an important role in fertilizer use efficiency and improved weed management.
Keywords: competition, wheat yield, fertilizer management, biomass
Procedia PDF Downloads 368
15214 Structuring Highly Iterative Product Development Projects by Using Agile-Indicators
Authors: Guenther Schuh, Michael Riesener, Frederic Diels
Abstract:
Nowadays, manufacturing companies face the challenge of meeting heterogeneous customer requirements in short product life cycles and with a variety of product functions. Some of the functional requirements remain unknown until late stages of product development. A way to handle these uncertainties is the highly iterative product development (HIP) approach. By structuring the development project as a highly iterative process, this method yields customer-oriented and marketable products. First approaches exist for combined, hybrid models comprising deterministic-normative methods, such as the Stage-Gate process, and empirical-adaptive development methods, such as Scrum, on the project management level. Almost unconsidered, however, is the question of which development scopes are preferably realized with empirical-adaptive rather than deterministic-normative approaches. In this context, a development scope constitutes a self-contained section of the overall development objective. This paper therefore focuses on a methodology that deals with the uncertainty of requirements within the early development stages and the corresponding selection of the most appropriate development approach. For this purpose, internal influencing factors, such as a company’s technological ability, prototype manufacturability and the potential solution space, as well as external factors, such as market accuracy, relevance and volatility, are analyzed and combined into an Agile-Indicator. The Agile-Indicator is derived in three steps. First, each internal and external factor is rated in terms of its importance for the overall development task. Second, each requirement is evaluated, for every internal and external factor, for its suitability for empirical-adaptive development. Finally, the totals of the internal and external sides are combined into the Agile-Indicator.
Thus, the Agile-Indicator constitutes a company-specific and application-related criterion on which the allocation of empirical-adaptive and deterministic-normative development scopes can be based. In a last step, this indicator is used for a specific clustering of development scopes by applying the fuzzy c-means (FCM) clustering algorithm. The FCM method determines sub-clusters within functional clusters based on the empirical-adaptive environmental impact expressed by the Agile-Indicator. By means of the methodology presented in this paper, it is possible to classify requirements that are subject to market uncertainty into empirical-adaptive or deterministic-normative development scopes.
Keywords: agile, highly iterative development, agile-indicator, product development
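The clustering step named in the abstract can be illustrated with a minimal one-dimensional fuzzy c-means. The Agile-Indicator scores below are hypothetical, and the implementation is a generic textbook FCM, not the authors' tooling:

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means: returns (centers, membership matrix)."""
    # initialise centers evenly spaced across the data range
    lo, hi = min(data), max(data)
    centers = [lo + (hi - lo) * j / (c - 1) for j in range(c)]
    u = [[0.0] * c for _ in data]
    for _ in range(iters):
        # update memberships from inverse distances to each center
        for i, x in enumerate(data):
            d = [abs(x - ck) or 1e-12 for ck in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / dk) ** (2.0 / (m - 1.0)) for dk in d)
        # update centers as membership-weighted means
        for j in range(c):
            num = sum((u[i][j] ** m) * x for i, x in enumerate(data))
            den = sum(u[i][j] ** m for i in range(len(data)))
            centers[j] = num / den
    return centers, u

# hypothetical Agile-Indicator scores for eight development scopes
scores = [0.1, 0.15, 0.2, 0.25, 0.7, 0.75, 0.8, 0.9]
centers, u = fuzzy_c_means(scores)
```

Unlike hard k-means, each scope retains a graded membership in both clusters, which matches the idea of scopes being more or less suited to empirical-adaptive development.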
Procedia PDF Downloads 246
15213 The 'Ineffectiveness' of Teaching Research Methods in Moroccan Higher Education: A Qualitative Study
Authors: Ahmed Chouari
Abstract:
Although research methods has been an integral part of the curriculum in Moroccan higher education for decades, the pedagogy used to teach the course suffers from a serious absence of a supporting body of literature. Also, the various challenges that both teachers and students of research methods face have received little attention from researchers in comparison to other fields such as applied linguistics. The main aim of this study is therefore to remedy this situation by exploring one of the major issues in teaching research methods: the phenomenon of students’ dissatisfaction with the research methods course in higher education in Morocco. The aim is also to understand students’ attitudes and perceptions on how to make the course more effective in the future. Three qualitative research questions were used: (1) To what extent are graduate students satisfied with the pedagogies used by teachers of the research methods course in Moroccan higher education? (2) To what extent are graduate students satisfied with the approach used in assessing research methods in Moroccan higher education? (3) What are students’ perceptions on how to make the research methods course more effective in Moroccan higher education? Qualitative content analysis was adopted to analyze students’ views and perspectives on the major factors behind their dissatisfaction with the course at the School of Arts and Humanities, University of Moulay Ismail. A semi-structured interview was used to collect data from 14 respondents from two different Master programs. The results show a general consensus among the respondents about the major factors behind the ineffectiveness of the course: the theory-practice gap, heavy reliance on theoretical knowledge at the expense of procedural knowledge, and the ineffectiveness of some teachers.
The findings also reveal that teaching research methods in Morocco requires more time, better equipment, and more competent teachers. Above all, the findings indicate an urgent need in Morocco to shift from teacher-centered to learner-centered approaches in teaching the research methods course. These findings thus contribute to the existing literature by unraveling the factors that impede the learning process and by suggesting a set of strategies that can make the course more effective.
Keywords: competencies, learner-centered teaching, research methods, student autonomy, pedagogy
Procedia PDF Downloads 265
15212 The Clustering of Multiple Sclerosis Subgroups through L2 Norm Multifractal Denoising Technique
Authors: Yeliz Karaca, Rana Karabudak
Abstract:
Multifractal denoising techniques identify significant attributes by removing noise from a dataset. Magnetic resonance (MR) imaging is the most sensitive method for identifying chronic disorders of the nervous system such as Multiple Sclerosis (MS). MRI and Expanded Disability Status Scale (EDSS) data belonging to 120 individuals with one of the MS subgroups (Relapsing Remitting MS (RRMS), Secondary Progressive MS (SPMS), Primary Progressive MS (PPMS)), as well as 19 healthy individuals in a control group, were used in this study. The study comprises the following stages: (i) the L2 norm multifractal denoising technique, one of the multifractal techniques, was applied to the MS data (MRI and EDSS), yielding a new dataset; (ii) this new dataset was clustered with the K-Means and Fuzzy C-Means (FCM) algorithms, which are unsupervised methods, and the clustering performances were compared; (iii) in identifying significant attributes in the MS subgroups and the healthy control group, the combination of L2 norm multifractal denoising with K-Means and FCM yielded excellent performance. Clustering was more successful on the denoised dataset (L2_Norm MS dataset), in which significant attributes had been extracted by the L2 norm denoising technique, than on the raw data.
Keywords: clinical decision support, clustering algorithms, multiple sclerosis, multifractal techniques
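The denoise-then-cluster pipeline can be sketched with a one-level Haar wavelet transform and soft thresholding, a much simpler stand-in for the paper's L2-norm multifractal denoising; the feature values below are invented for illustration:

```python
import math

def haar_denoise(x, thresh):
    """One-level Haar transform, soft-threshold the detail coefficients,
    then invert. x must have even length."""
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / s2 for a, b in zip(x[::2], x[1::2])]
    # soft thresholding: shrink small details to zero
    soft = lambda d: (abs(d) - thresh) * (1 if d > 0 else -1) if abs(d) > thresh else 0.0
    detail = [soft(d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out

# hypothetical noisy 1-D feature values (two underlying levels plus jitter)
noisy = [1.0, 1.2, 0.9, 1.1, 4.0, 4.2, 3.9, 4.1]
clean = haar_denoise(noisy, thresh=0.3)
```

After denoising, the within-group jitter is suppressed while the two levels (around 1 and around 4) survive, which is exactly what makes a subsequent K-Means or FCM pass more reliable.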
Procedia PDF Downloads 168
15211 EcoLife and Greed Index Measurement: An Alternative Tool to Promote Sustainable Communities and Eco-Justice
Authors: Louk Aourelien Andrianos, Edward Dommen, Athena Peralta
Abstract:
Greed, as epitomized by the overconsumption of natural resources, is at the root of the ecological destruction and unsustainability of modern societies. Present economies rely on unrestricted structural greed, which fuels unlimited economic growth, overconsumption, and individualistic competitive behavior. Structural greed undermines the life support system on earth and threatens ecological integrity, social justice and peace. The World Council of Churches (WCC) has developed a program on ecological and economic justice (EEJ) with the aim of promoting an economy of life, where the economy is embedded in society and society in ecology. This paper analyzes and assesses the economy of life (EcoLife) by offering an empirical tool to measure and monitor the root causes and effects of unsustainability resulting from human greed at global, national, institutional and individual levels. This holistic approach is based on the integrity of ecology and economy in a society founded on justice. The paper discusses critical questions such as 'what is an economy of life' and 'how can it be measured and protected from the effects of greed'. A model called GLIMS, which stands for Greed Lines and Indices Measurement System, is used to clarify the concept of greed and to help measure the economy of life index by fuzzy logic reasoning. The inputs of the model are statistical indicators of natural resource consumption, financial realities, economic performance, social welfare, and ethical and political facts. The outputs are concrete measures of three primary greed indices, ecological, economic and socio-political (ECOL-GI, ECON-GI, SOCI-GI), and one overall multidimensional economy of life index (EcoLife-I). EcoLife measurement aims to build awareness of an economy of life and to address the effects of greed in their systemic and structural aspects.
It is a tool for ethical diagnosis and policy making.
Keywords: greed line, sustainability indicators, fuzzy logic, eco-justice, World Council of Churches (WCC)
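The abstract does not specify GLIMS's fuzzy reasoning in detail; the toy sketch below shows one way three greed indices could be aggregated into an overall index with zero-order (Sugeno-style) fuzzy rules. All membership ranges, rule outputs and the equal weighting of the three indices are assumptions for illustration only:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ecolife_index(ecol, econ, soci):
    """Aggregate three greed indices in [0, 1] into an overall
    economy-of-life index: each rule pairs a firing strength with a
    constant output, and the result is their weighted average."""
    rules = []
    for g in (ecol, econ, soci):
        low = tri(g, -0.5, 0.0, 0.5)    # low greed supports an economy of life
        high = tri(g, 0.5, 1.0, 1.5)    # high greed degrades it
        rules.append((low, 1.0))        # output 1.0 = full economy of life
        rules.append((high, 0.0))       # output 0.0 = none
    den = sum(w for w, _ in rules) or 1.0
    return sum(w * z for w, z in rules) / den

healthy = ecolife_index(0.2, 0.3, 0.1)
greedy = ecolife_index(0.9, 0.9, 0.9)
```

The same pattern scales to the real model by adding rules per statistical indicator and calibrated weights per dimension.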
Procedia PDF Downloads 320
15210 Lean Product Development and Sustainability: A Systematic Literature Review
Authors: João P. E. De Souza, Rob Dekkers
Abstract:
Whereas lean product development aims at maximising customer value whilst optimising product and process design, the question arises whether this approach includes sustainability. A systematic literature review reveals that methods associated with this conceptualisation of product development are suitable for including sustainability, but that the criteria of the triple bottom line need to be incorporated when using these methods; this is particularly the case for social aspects. Thus, the main finding is that new methods need not be developed; rather, existing methods should be made more inclusive of all aspects of sustainability and of product life-cycle thinking.
Keywords: lean product development, product life-cycle, sustainability, systematic literature review, triple bottom-line
Procedia PDF Downloads 166
15209 Current Methods for Drug Property Prediction in the Real World
Authors: Jacob Green, Cecilia Cabrera, Maximilian Jakobs, Andrea Dimitracopoulos, Mark van der Wilk, Ryan Greenhalgh
Abstract:
Predicting drug properties is key in drug discovery, both to enable the de-risking of assets before expensive clinical trials and to find highly active compounds faster. Interest from the machine learning community has led to the release of a variety of benchmark datasets and proposed methods. However, it remains unclear to practitioners which method or approach is most suitable, as different papers benchmark on different datasets and methods, leading to varying conclusions that are not easily compared. Our large-scale empirical study links together numerous earlier works on different datasets and methods, thus offering a comprehensive overview of the existing property classes, datasets, and their interactions with different methods. We emphasise the importance of uncertainty quantification and of the time, and therefore cost, of applying these methods in the drug development decision-making cycle. We observe that the optimal approach varies depending on the dataset and that engineered features with classical machine learning methods often outperform deep learning. Specifically, QSAR datasets are typically best analysed with classical methods such as Gaussian processes, while ADMET datasets are sometimes better described by trees or by deep learning methods such as graph neural networks or language models. Our work highlights that practitioners do not yet have a straightforward, black-box procedure to rely on, and it sets a precedent for creating practitioner-relevant benchmarks. Deep learning approaches must prove themselves on such benchmarks to become the practical method of choice in drug property prediction.
Keywords: activity (QSAR), ADMET, classical methods, drug property prediction, empirical study, machine learning
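One reason "engineered features with classical methods" remain competitive is how little machinery they need. A minimal QSAR-style baseline, similarity-weighted k-nearest neighbours over binary fingerprints with Tanimoto similarity, can be sketched as follows; the fingerprints and activity values are invented, and this is a generic baseline, not one of the study's benchmarked models:

```python
def tanimoto(a, b):
    """Tanimoto similarity between two binary fingerprints (sets of on-bits)."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def knn_predict(query, train, k=3):
    """Predict a property as the similarity-weighted mean of the k most
    similar training fingerprints: a classical QSAR-style baseline."""
    ranked = sorted(train, key=lambda t: tanimoto(query, t[0]), reverse=True)[:k]
    weights = [tanimoto(query, fp) or 1e-9 for fp, _ in ranked]
    return sum(w * y for w, (_, y) in zip(weights, ranked)) / sum(weights)

# hypothetical fingerprints (on-bit sets) with activity labels
train = [
    ({1, 2, 3, 4}, 0.9), ({1, 2, 3, 5}, 0.8),
    ({7, 8, 9}, 0.1), ({7, 8, 10}, 0.2),
]
pred = knn_predict({1, 2, 3, 6}, train, k=2)
```

Any deep model proposed for such data has to beat baselines of this kind, with uncertainty estimates, before it earns a place in the decision-making cycle.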
Procedia PDF Downloads 81
15208 Project and Module Based Teaching and Learning
Authors: Jingyu Hou
Abstract:
This paper proposes a new teaching and learning approach: Project and Module Based Teaching and Learning (PMBTL). The PMBTL approach incorporates the merits of project/problem-based and module-based learning methods and overcomes their limitations. The correlation between teaching, learning, practice, and assessment is emphasized in this approach, and new methods have been proposed accordingly. The distinct features of these new methods differentiate the PMBTL approach from conventional teaching approaches. Evaluation of this approach in practical teaching and learning activities demonstrates its effectiveness and stability in improving the performance and quality of teaching and learning. The approach proposed in this paper can also inform the design of other teaching units.
Keywords: computer science education, project and module based, software engineering, module based teaching and learning
Procedia PDF Downloads 493
15207 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm
Authors: J. S. Dhillon, K. K. Dhaliwal
Abstract:
In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost and smaller memory requirements for similar magnitude specifications. However, digital IIR filter design is generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm. In some cases, however, it searches the solution space insufficiently, resulting in a weak exchange of information, and is then unable to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize the satisfaction of the minimum-magnitude-error and stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results.
Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization
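The opposition-based learning idea can be shown in isolation: for a candidate x in a search interval [lo, hi], its opposite is lo + hi - x, and keeping the fitter of each pair improves the starting population. This is a generic OBL step on a toy objective, not the paper's full OABC with fuzzy constraint handling:

```python
import random

def opposition_init(n, lo, hi, fitness, seed=0):
    """Opposition-based initialisation: generate a random population,
    add the opposite of each candidate (lo + hi - x), and keep the
    fitter half (lower fitness = better)."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(n)]
    opposites = [lo + hi - x for x in pop]
    combined = pop + opposites
    combined.sort(key=fitness)
    return combined[:n]

# toy objective standing in for the filter design error: distance to 2.0
best = opposition_init(10, -5.0, 5.0, fitness=lambda x: abs(x - 2.0))
```

In the full OABC, the same opposite-and-select step would operate on coefficient vectors, with fitness given by the fuzzy combination of magnitude error and stability constraints.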
Procedia PDF Downloads 477
15206 Impacts on Marine Ecosystems Using a Multilayer Network Approach
Authors: Nelson F. F. Ebecken, Gilberto C. Pereira, Lucio P. de Andrade
Abstract:
Bays, estuaries and coastal ecosystems are among the most used and threatened natural systems globally. Their deterioration is due to intense and increasing human activities. This paper aims to monitor a socio-ecological system in Brazil and to model and simulate it through a multilayer network representing a DPSIR structure (Drivers, Pressures, States, Impacts, Responses), considering the concept of ecosystem-based management to support decision-making under the National/State/Municipal Coastal Management policy. This approach accounts for several kinds of interference and can represent a significant advance in several scientific respects. The main objective of this paper is the coupling of three different complex networks, the first an ecological network, the second a social network, and the third a network of economic activities, in order to model the marine ecosystem. Multilayer networks comprise two or more 'layers', which may represent different types of interactions, different communities, different points in time, and so on. Dependency between layers results from processes that affect several layers; for example, the dispersion of individuals between two patches affects the network structure of both. A multilayer network consists of (i) a set of physical nodes representing entities (e.g., species, people, companies); (ii) a set of layers, which may include multiple layering aspects (e.g., time dependency and multiple types of relationships); (iii) a set of state nodes, each of which corresponds to the manifestation of a given physical node in a specific layer; and (iv) a set of edges (weighted or not) connecting the state nodes. The edge set includes the familiar intralayer edges and the interlayer edges, which connect state nodes across layers.
The methodology, applied in an existing case, uses flow cytometry and the modeling of ecological relationships (trophic and non-trophic) following fuzzy-theory concepts and graph visualization. Subnetworks in the fuzzy graphs are identified using a specific computational method. This methodology makes it possible to consider the influence of different factors and their contributions to the decision-making process.
Keywords: marine ecosystems, complex systems, multilayer network, ecosystems management
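The four ingredients listed in the abstract (physical nodes, layers, state nodes, intra-/interlayer edges) map directly onto a small data structure. A minimal container with illustrative layer and node names, not the authors' actual model:

```python
class MultilayerNetwork:
    def __init__(self):
        self.state_nodes = set()   # (physical_node, layer) pairs
        self.edges = {}            # (state_node, state_node) -> weight

    def add_state_node(self, node, layer):
        self.state_nodes.add((node, layer))

    def add_edge(self, u, v, weight=1.0):
        self.edges[(u, v)] = weight

    def interlayer_edges(self):
        """Edges whose endpoints sit in different layers."""
        return [(u, v) for (u, v) in self.edges if u[1] != v[1]]

net = MultilayerNetwork()
# hypothetical state nodes in the three coupled layers
net.add_state_node("plankton", "ecology")
net.add_state_node("fishers", "social")
net.add_state_node("fishery", "economy")
# interlayer edges couple the ecological, social and economic layers
net.add_edge(("plankton", "ecology"), ("fishers", "social"), 0.4)
net.add_edge(("fishers", "social"), ("fishery", "economy"), 0.8)
```

Edge weights here are crisp numbers; replacing them with fuzzy membership degrees gives the fuzzy graphs on which the subnetwork identification operates.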
Procedia PDF Downloads 113
15205 Cloud-Based Dynamic Routing with Feedback in Formal Methods
Authors: Jawid Ahmad Baktash, Mursal Dawodi, Tomokazu Nagata
Abstract:
With the rapid growth of cloud computing, formal methods have become a good choice for the refinement of message specifications and the verification of dynamic routing in the cloud. Cloud-based dynamic routing is becoming increasingly popular. We propose feedback in formal methods for dynamic routing in cloud computing; the model and topologies show how messages are sent, formally, from index zero to all other nodes. With dynamic routing in the cloud, the responsibility for proper verification becomes crucial. Formal methods can play an essential role in the routing and development of networks and in the testing of distributed systems. Event-B is a formal technique that consists of describing the problem rigorously and then introducing solutions or details in refinement steps. It is a variant of B designed for developing distributed systems and the message passing of dynamic routing. In Event-B and formal methods generally, events consist of guarded actions that occur spontaneously rather than being invoked.
Keywords: cloud, dynamic routing, formal method, Pro-B, event-B
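Event-B's guarded events can be mimicked with a toy scheduler: each event is a guard/action pair, and any enabled event may fire. The sketch below simulates sending a message from index zero to all other nodes on a ring topology; the topology and protocol are illustrative stand-ins, not the paper's Event-B machine:

```python
def broadcast(n):
    """Fire enabled 'send' events until every node holds the message.
    Each event is guarded: node i may forward to neighbour j only if
    i already holds the message (guard) and j does not."""
    has_msg = {0}                                  # node 0 starts with the message
    links = {i: [(i + 1) % n] for i in range(n)}   # ring topology (illustrative)
    fired = True
    while fired:                                   # run until no event is enabled
        fired = False
        for i in list(has_msg):
            for j in links[i]:
                if j not in has_msg:               # guard
                    has_msg.add(j)                 # action
                    fired = True
    return has_msg

reached = broadcast(5)
```

In Event-B proper, the invariant "every reached node is link-connected to node 0" would be stated once and discharged as proof obligations across the refinement steps, rather than checked by running the system.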
Procedia PDF Downloads 423
15204 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making
Authors: Serhat Tuzun, Tufan Demirel
Abstract:
Multi-Criteria Decision Making (MCDM) is the modelling of real-life problems that we encounter. It is a discipline that aids decision makers who are faced with conflicting alternatives in making an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and findings are generally derived from subjective data. New and modified techniques have been developed through approaches such as fuzzy logic; yet comprehensive techniques, even though better at modelling real life, have not found a place in real-world applications because their complex structure makes them hard to apply. These constraints restrict the development of MADM. This study conducts a comprehensive analysis of preference methods in MADM and proposes an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches have been analyzed with their advantages and disadvantages. Then the approach is introduced: performance values of the criteria are calculated in two steps, first by determining the distribution of each attribute and standardizing the values, then by calculating the information of each attribute as its informational energy.
Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy
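Informational energy (in Onicescu's sense) of a discrete distribution is the sum of squared probabilities. A sketch of the second step, computing it for an attribute's standardized values; the normalisation of raw values into a distribution is an assumption, since the abstract does not specify it:

```python
def informational_energy(values):
    """Normalise an attribute's (non-negative) performance values into a
    probability distribution, then return the sum of squared
    probabilities. Energy is 1/n for a uniform distribution and
    approaches 1 as the values concentrate on one alternative."""
    total = sum(values)
    probs = [v / total for v in values]
    return sum(p * p for p in probs)

# hypothetical performance values of one attribute over four alternatives
uniform = informational_energy([5, 5, 5, 5])    # evenly spread
skewed = informational_energy([17, 1, 1, 1])    # concentrated
```

An attribute whose values barely discriminate between alternatives (near-uniform, low energy deviation) would carry little preference information, which is the intuition the approach builds on.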
Procedia PDF Downloads 224
15203 Component Comparison of Polyaluminum Chloride Produced from Various Methods
Authors: Wen Po Cheng, Chia Yun Chung, Ruey Fang Yu, Chao Feng Chen
Abstract:
The main objective of this research was to study the differences in aluminum hydrolytic products between two PACl preparation methods. These were the acidification of freshly formed amorphous Al(OH)3 and the conventional alkalization of an aluminum chloride solution. According to the Ferron test and 27Al NMR analysis of the two preparation procedures, the reaction rate constant (k) values and the Al13 percentage of the acid-addition process at high basicity were both lower than those of the alkaline-addition process. The results suggest that the molecular structure and size distribution of the aluminum species produced by the two methods differ significantly at high basicity.
Keywords: polyaluminum chloride, Al13, amorphous aluminum hydroxide, Ferron test
Procedia PDF Downloads 376
15202 Ischemic Stroke Detection in Computed Tomography Examinations
Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina
Abstract:
Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used, owing to its wide availability and rapid diagnosis. Detection depends on the size and severity of the lesions and on the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with window widths between 80 and 100 Hounsfield units. We used image processing techniques such as morphological filters, the discrete wavelet transform and Fuzzy C-Means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region, and these were compared with the objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters do improve the delineation of ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area delineated by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tend to be smaller than those obtained by the algorithm.
These results show the importance of computer-aided diagnosis software in supporting neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means
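The simplest of the morphological filters mentioned, binary erosion with a 3x3 structuring element, can be sketched on a toy mask; the mask is invented, and the pipeline in the paper combines several such filters with wavelets and FCM:

```python
def erode(img):
    """Binary erosion with a 3x3 square structuring element: a pixel
    stays 1 only if its whole 3x3 neighbourhood is 1. Removes isolated
    noise pixels and thin protrusions from a segmentation mask."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if all(img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)):
                out[y][x] = 1
    return out

# toy binary mask: a 3x3 blob plus one stray pixel on its right edge
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 1],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
cleaned = erode(mask)
```

Erosion followed by dilation (an opening) restores the blob's size while leaving the stray pixel removed, which is the usual cleanup applied before measuring lesion areas.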
Procedia PDF Downloads 366
15201 Comfort Sensor Using Fuzzy Logic and Arduino
Authors: Samuel John, S. Sharanya
Abstract:
Automation has become an important part of our lives. It has been used to control home entertainment systems, change the ambience of rooms for different events, and so on. One of the main parameters to control in a smart home is atmospheric comfort, which mainly comprises temperature and relative humidity. In homes, the desired temperature of different rooms varies from 20 °C to 25 °C, and the desired relative humidity is around 50%; however, both vary widely. Hence, automated measurement of these parameters to ensure comfort assumes significance. To achieve this, a fuzzy logic controller on Arduino was developed using MATLAB. Arduino is open-source hardware built around a 28-pin ATmega328 chip, with 14 digital input/output pins and an inbuilt ADC. It runs on 5 V and 3.3 V, supported by an on-board voltage regulator. Some of the digital pins on the Arduino provide PWM (pulse-width modulation) signals, which can be used in different applications. The Arduino platform provides an integrated development environment supporting the C and C++ programming languages. In the present work, a soft sensor was introduced into this system; it can indirectly measure temperature and humidity and can process several measurements to ensure comfort. The Sugeno method (in which the output variables are functions or singletons/constants, making it more suitable for implementation on microcontrollers) was used for the soft sensor in MATLAB, which was then interfaced to the Arduino, in turn interfaced to the temperature and humidity sensor DHT11. The DHT11 acts as the sensing element in this system and provides a digital signal on its data pin. Further, a capacitive humidity sensor and a thermistor were also used to support the measurement of the temperature and relative humidity of the surroundings. The comfort sensor developed was able to measure temperature and relative humidity correctly.
The comfort percentage was calculated and accordingly the temperature in the room was controlled. This system was placed in different rooms of the house to ensure that it modifies the comfort values depending on temperature and relative humidity of the environment. Compared to the existing comfort control sensors, this system was found to provide an accurate comfort percentage. Depending on the comfort percentage, the air conditioners and the coolers in the room were controlled. The main highlight of the project is its cost efficiency.Keywords: arduino, DHT11, soft sensor, sugeno
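The abstract does not reproduce the controller's rule base. As a minimal sketch of the zero-order Sugeno inference such a comfort sensor would perform, the following Python code uses hypothetical membership breakpoints and rule consequents (all values are assumptions, not the paper's controller):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def comfort_percentage(temp_c, rh):
    """Zero-order Sugeno inference: comfort percentage from temperature (degC)
    and relative humidity (%).  Membership breakpoints and rule consequents
    are illustrative assumptions, not taken from the paper."""
    t_comfortable = tri(temp_c, 18.0, 22.5, 27.0)  # peak inside the 20-25 degC band
    t_off = 1.0 - t_comfortable
    h_ok = tri(rh, 30.0, 50.0, 70.0)               # peak at 50% RH
    h_off = 1.0 - h_ok

    # Each rule: (firing strength via the min t-norm, constant consequent in %)
    rules = [
        (min(t_comfortable, h_ok), 100.0),
        (min(t_comfortable, h_off), 60.0),
        (min(t_off, h_ok), 50.0),
        (min(t_off, h_off), 10.0),
    ]
    # Sugeno defuzzification: weighted average of the constant consequents.
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(comfort_percentage(22.5, 50.0))  # -> 100.0 (ideal conditions)
print(comfort_percentage(35.0, 90.0))  # -> 10.0 (hot and humid)
```

The weighted-average defuzzification is what makes the Sugeno form cheap enough for a microcontroller: no output membership functions need to be integrated.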
Procedia PDF Downloads 312
15200 Improving Short-Term Forecast of Solar Irradiance
Authors: Kwa-Sur Tam, Byung O. Kang
Abstract:
By using different ranges of the daily sky clearness index defined in this paper, any day can be classified as a clear-sky day, a partly cloudy day or a cloudy day. This paper demonstrates how short-term forecasting of solar irradiance can be improved by taking the type of day so defined into consideration. The source of the day-type dependency has been identified. Forecasting methods that take day type into consideration have been developed, and their efficacy has been established. While all methods that implement some form of adjustment to the cloud cover forecast provided by the U.S. National Weather Service improve accuracy, methods that incorporate day-type dependency improve forecast accuracy even further.
Keywords: day types, forecast methods, National Weather Service, sky cover, solar energy
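The paper's actual clearness-index ranges are not given in the abstract; a minimal sketch of the day-type classification, with illustrative thresholds, could look like this in Python:

```python
def classify_day(daily_irradiation, extraterrestrial_irradiation,
                 clear_min=0.6, cloudy_max=0.3):
    """Classify a day by its daily clearness index K_t.

    K_t = measured daily global irradiation / extraterrestrial irradiation on a
    horizontal surface.  The thresholds 0.6 and 0.3 are illustrative; the paper
    defines its own ranges.
    """
    kt = daily_irradiation / extraterrestrial_irradiation
    if kt >= clear_min:
        return "clear", kt
    if kt <= cloudy_max:
        return "cloudy", kt
    return "partly cloudy", kt

# Hypothetical daily totals in kWh/m^2
label, kt = classify_day(5.2, 8.0)
print(label, round(kt, 2))  # -> clear 0.65
```

A forecast method can then branch on the label, applying a day-type-specific adjustment to the cloud cover forecast.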
Procedia PDF Downloads 466
15199 Schedule a New Production Plan by Heuristic Methods
Authors: Hanife Merve Öztürk, Sıdıka Dalgan
Abstract:
In this project, a capacity analysis study was carried out at the TAT A. Ş. Maret plant. The production capacities of the products that generate 80% of the sales volume were determined. The data obtained were entered into the LEKIN scheduling program, and production schedules were generated using heuristic methods. Besides the heuristic methods, as a mathematical model, a disjunctive programming formulation was adapted to the flexible job shop problem by adding a new constraint so as to find the optimal schedule.
Keywords: scheduling, flexible job shop problem, shifting bottleneck heuristic, mathematical modelling
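LEKIN applies dispatching rules and the shifting bottleneck heuristic to job shop instances. As a minimal illustration of the dispatching-rule idea (not the plant's actual multi-machine model), the shortest-processing-time rule on a single machine, sketched below in Python, provably minimizes the sum of completion times:

```python
def spt_schedule(jobs):
    """Sequence jobs on one machine by the Shortest Processing Time rule.

    jobs: dict mapping job name -> processing time.  Returns the sequence and
    the total (sum of) completion times, which SPT provably minimizes on a
    single machine.
    """
    order = sorted(jobs, key=jobs.get)
    t = total = 0
    for j in order:
        t += jobs[j]       # completion time of job j
        total += t
    return order, total

# Hypothetical three-job instance
seq, tct = spt_schedule({"A": 4, "B": 1, "C": 3})
print(seq, tct)  # -> ['B', 'C', 'A'] 13  (completions 1, 4, 8)
```

The flexible job shop generalizes this to multiple machines with routing choices, which is where the shifting bottleneck heuristic and the disjunctive formulation come in.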
Procedia PDF Downloads 401
15198 Comparison of Soil Test Extractants for Determination of Available Soil Phosphorus
Authors: Violina Angelova, Stefan Krustev
Abstract:
The aim of this work was to evaluate the effectiveness of different soil test extractants for the determination of available soil phosphorus in five internationally certified standard soils, sludge and clay (NCS DC 85104, NCS DC 85106, ISE 859, ISE 952, ISE 998). The certified samples were extracted with the following methods/extractants: CaCl₂, CaCl₂ and DTPA (CAT), double lactate (DL), ammonium lactate (AL), calcium acetate lactate (CAL), Olsen, Mehlich 3, Bray and Kurtz I, and Morgan, which are commonly used in soil testing laboratories. The phosphorus in the soil extracts was measured colorimetrically using a Spectroquant Pharo 100 spectrometer. The methods used in the study were evaluated according to the recovery of available phosphorus, ease of application and speed of performance. The relationships between the methods were examined statistically. Good agreement between the results of the different soil tests was established for all certified samples. In general, the P values extracted by the nine extraction methods correlated significantly with each other. When grouping the soils according to pH, organic carbon content and clay content, the weaker extraction methods showed analogous trends; common tendencies were also found among the stronger extraction methods. Other factors influencing the extraction force of the different methods include the soil:solution ratio, as well as the duration and intensity of shaking the samples. The mean extractable P in the certified samples was found to increase in the order CaCl₂ < CAT < Morgan < Bray and Kurtz I < Olsen < CAL < DL < Mehlich 3 < AL. Although the nine methods extracted different amounts of P from the certified samples, the values of P extracted by the different methods were strongly correlated among themselves. Acknowledgment: The financial support by the Bulgarian National Science Fund Projects DFNI Н04/9 and DFNI Н06/21 is greatly appreciated.
Keywords: available soil phosphorus, certified samples, determination, soil test extractants
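The inter-method correlations reported above come down to a plain Pearson coefficient between two series of extractable-P measurements. A self-contained Python sketch, with hypothetical values (not the study's data):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two measurement series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical extractable P (mg/kg) for five certified samples
p_al = [120.0, 85.0, 60.0, 150.0, 95.0]        # ammonium lactate (AL)
p_mehlich3 = [110.0, 80.0, 55.0, 140.0, 90.0]  # Mehlich 3
print(round(pearson_r(p_al, p_mehlich3), 3))  # -> 0.999
```

A high r between two extractants means either can be calibrated against the other, even when the absolute amounts of P they recover differ.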
Procedia PDF Downloads 151
15197 Strategic Management Methods in Non-Profit Making Organization
Authors: P. Řehoř, D. Holátová, V. Doležalová
Abstract:
The paper deals with an analysis of strategic management methods in non-profit making organizations in the Czech Republic. Strategic management represents an aggregate of methods and approaches that can be applied to managing organizations; in this article, the organizations are those which associate owners and keepers of non-state forest properties. The authors use the following strategic management methods: stakeholder analysis, SWOT analysis and questionnaire inquiries. The questionnaire was distributed electronically via e-mail. In October 2013, data were obtained from a total of 84 questionnaires. Based on the results, the authors recommend using a confrontation strategy, which improves the competitiveness of non-profit making organizations.
Keywords: strategic management, non-profit making organization, strategy analysis, SWOT analysis, strategy, competitiveness
Procedia PDF Downloads 483
15196 Methods Used to Perform Requirements Elicitation for Healthcare Software Development
Authors: Tang Jiacheng, Fang Tianyu, Liu Yicen, Xiang Xingzhou
Abstract:
The proportion of healthcare services is increasing throughout the globe. The convergence of mobile technology is driving new business opportunities, innovations in healthcare service delivery and the promise of a better life tomorrow for different populations with various healthcare needs. One of the most important phases in combining health care and mobile applications is to elicit requirements correctly. In this paper, four articles from different research directions, covering four topics in healthcare, were analyzed in detail and summarized. We identified the underlying problems in guidance for developing mobile applications that provide healthcare services for older adults, women in menopause, and patients with COVID-19. These case studies cover several elicitation methods: survey, prototyping, focus group interview and questionnaire. The effectiveness of these methods was analyzed, along with their advantages and limitations, which is helpful for adapting the elicitation methods in future software development processes.
Keywords: healthcare, software requirement elicitation, mobile applications, prototyping, focus group interview
Procedia PDF Downloads 148
15195 Development of a Decision Model to Optimize Total Cost in Food Supply Chain
Authors: Henry Lau, Dilupa Nakandala, Li Zhao
Abstract:
All along the length of the supply chain, fresh food firms face the challenge of managing both product quality, due to the perishable nature of the products, and product cost. This paper develops a method to assist logistics managers upstream in the fresh food supply chain in making cost-optimized decisions regarding transportation, with the objective of minimizing the total cost while maintaining the quality of the food products above acceptable levels. Considering the case of multiple fresh food products collected from multiple farms and transported to a warehouse or a retailer, this study develops a total cost model that includes the various costs incurred during transportation. The practical application of the model is illustrated using several computational intelligence approaches, including Genetic Algorithms (GA), Fuzzy Genetic Algorithms (FGA), and an improved Simulated Annealing (SA) procedure applied with a repair mechanism, for efficiency benchmarking. We demonstrate the practical viability of these approaches in a simulation study based on pertinent data and evaluate the simulation outcomes. All three approaches are adoptable; however, the performance evaluation made it evident that the FGA is more likely to produce a better result than GA or SA. This study provides a pragmatic approach to support logistics and supply chain practitioners in the fresh food industry in making important decisions on the arrangements and procedures related to transporting multiple fresh food products from multiple farms to a warehouse in a cost-effective way without compromising product quality.
This study extends the literature on cold supply chain management by investigating cost and quality optimization in a multi-product scenario from farms to a retailer, minimizing cost while keeping quality above the expected levels at delivery. The scalability of the proposed generic function enables application to alternative situations in practice, such as different storage environments and transportation conditions.
Keywords: cost optimization, food supply chain, fuzzy sets, genetic algorithms, product quality, transportation
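A minimal sketch of the simulated-annealing-with-repair idea follows, assuming a simple product-to-vehicle assignment with hypothetical costs and capacities (the paper's actual model covers multiple farms and quality constraints):

```python
import math
import random

def total_cost(assign, costs):
    """Transport cost of assigning each product i to vehicle assign[i]."""
    return sum(costs[i][v] for i, v in enumerate(assign))

def repair(assign, caps, weights, n_veh):
    """Single-pass repair: move products off overloaded vehicles onto the
    currently least-loaded one (a sketch; one pass may not fix every case)."""
    loads = [0.0] * n_veh
    for i, v in enumerate(assign):
        loads[v] += weights[i]
    for i, v in enumerate(assign):
        if loads[v] > caps[v]:
            target = min(range(n_veh), key=loads.__getitem__)
            loads[v] -= weights[i]
            loads[target] += weights[i]
            assign[i] = target
    return assign

def anneal(costs, weights, caps, iters=2000, t0=10.0, seed=1):
    """Simulated annealing over product-to-vehicle assignments with repair."""
    random.seed(seed)
    n, m = len(costs), len(caps)
    cur = repair([random.randrange(m) for _ in range(n)], caps, weights, m)
    best, best_c = cur[:], total_cost(cur, costs)
    t = t0
    for _ in range(iters):
        cand = cur[:]
        cand[random.randrange(n)] = random.randrange(m)  # random reassignment
        cand = repair(cand, caps, weights, m)
        d = total_cost(cand, costs) - total_cost(cur, costs)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if d < 0 or random.random() < math.exp(-d / max(t, 1e-9)):
            cur = cand
            c = total_cost(cur, costs)
            if c < best_c:
                best, best_c = cur[:], c
        t *= 0.995  # geometric cooling
    return best, best_c

# Hypothetical instance: 4 products, 2 vehicles, costs[i][v] per assignment.
costs = [[4.0, 7.0], [3.0, 2.0], [6.0, 5.0], [8.0, 4.0]]
best, best_c = anneal(costs, weights=[1.0] * 4, caps=[100.0, 100.0])
print(best_c)
```

The repair step keeps every candidate feasible so the annealer never wastes iterations inside the infeasible region, which is the purpose of the repair mechanism the paper benchmarks.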
Procedia PDF Downloads 223
15194 Convergence of Sinc Methods Applied to Kuramoto-Sivashinsky Equation
Authors: Kamel Al-Khaled
Abstract:
A comparative study of the Sinc-Galerkin and Sinc-collocation methods for solving the Kuramoto-Sivashinsky equation is given. Both approaches depend on Sinc basis functions. Firstly, a numerical scheme using the Sinc-Galerkin method is developed to approximate the solution of the Kuramoto-Sivashinsky equation. Sinc approximations to both derivatives and indefinite integrals reduce the problem to an explicit system of algebraic equations. The approximate solution is shown to converge to the exact solution at an exponential rate. The convergence proof for the solution of the discrete system is given using fixed-point iteration. Secondly, a combination of a Crank-Nicolson formula in the time direction with Sinc-collocation in the space direction is presented, where the derivatives in the space variable are replaced by the necessary matrices to produce a system of algebraic equations. The methods are tested on two examples. The results show that the two presented methods have roughly the same accuracy.
Keywords: Sinc-collocation, nonlinear PDEs, numerical methods, fixed-point
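A full Sinc-Galerkin scheme is beyond the scope of an abstract, but the truncated cardinal (Whittaker) sinc expansion underlying both methods is easy to sketch; the test function and step size below are illustrative, not taken from the paper:

```python
import math

def sinc(x):
    """Normalized sinc, sin(pi x)/(pi x), with the removable singularity at 0."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sinc_interp(f, h, n_terms, x):
    """Truncated cardinal (Whittaker) expansion sum_k f(kh) * S(k, h)(x),
    the basis expansion underlying Sinc-Galerkin and Sinc-collocation."""
    return sum(f(k * h) * sinc((x - k * h) / h)
               for k in range(-n_terms, n_terms + 1))

# Illustrative smooth, rapidly decaying test function
f = lambda x: math.exp(-x * x)
err = abs(sinc_interp(f, 0.5, 20, 0.3) - f(0.3))
print(err)  # small: the expansion converges rapidly for smooth decaying f
```

The exponential convergence claimed in the abstract refers to this behaviour: for analytic functions with suitable decay, the interpolation error decreases exponentially as the number of sinc nodes grows.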
Procedia PDF Downloads 471
15193 Assessment of Slope Stability by Continuum and Discontinuum Methods
Authors: Taleb Hosni Abderrahmane, Berga Abdelmadjid
Abstract:
The development of numerical analysis and its application to geomechanics problems have provided geotechnical engineers with extremely powerful tools. One of the most important problems in geotechnical engineering is slope stability assessment. It is a very difficult task due to several aspects, such as the nature of the problem, experimental considerations, monitoring, controlling, and assessment. The main objective of this paper is to perform a comparative numerical study of the following methods: the Limit Equilibrium Method (LEM), the Finite Element Method (FEM), the Limit Analysis Method (LAM) and the Distinct Element Method (DEM). The comparison is conducted in terms of the safety factors and the critical slip surfaces. The results confirm the feasibility of analysing slope stability by several methods.
Keywords: comparison, factor of safety, geomechanics, numerical methods, slope analysis, slip surfaces
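As a reference point for the factor of safety the four methods compute, the closed-form infinite-slope solution (a limit-equilibrium special case for a dry slope with no seepage) can be sketched in Python; the soil parameters below are illustrative assumptions:

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg):
    """Factor of safety of an infinite slope (dry, no seepage).

    FS = [c + gamma*z*cos^2(beta)*tan(phi)] / [gamma*z*sin(beta)*cos(beta)]
    c: cohesion (kPa), phi: friction angle (deg), gamma: unit weight (kN/m^3),
    z: depth of the slip plane (m), beta: slope angle (deg).
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

fs = infinite_slope_fs(c=10.0, phi_deg=30.0, gamma=18.0, z=3.0, beta_deg=25.0)
print(round(fs, 2))  # FS > 1 indicates stability for these illustrative values
```

LEM generalizes this ratio of resisting to driving forces over trial slip surfaces, while FEM, LAM and DEM obtain the safety factor by strength reduction, bound theorems and explicit block interaction, respectively.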
Procedia PDF Downloads 533
15192 Reliability Analysis of Steel Columns under Buckling Load in Second-Order Theory
Authors: Hamed Abshari, M. Reza Emami Azadi, Madjid Sadegh Azar
Abstract:
For studying the overall instability of members of steel structures, there are several methods in which overall buckling and geometrical imperfection effects are considered in the analysis. In the first section, these methods are compared and the ability of software to apply them is studied. Buckling loads determined by the theoretical methods and by software are compared for 2D one-bay, one- and two-storey steel frames. To represent actual conditions, the buckling loads of three steel frames with various dimensions are calculated and compared. The uncertainties that exist in the loading and modeling of structures, such as geometrical imperfection, yield stress, and modulus of elasticity, and their effect on the buckling load of 2D framed steel structures have also been studied. By propagating these uncertainties through each reliability analysis procedure (first-order, second-order, and simulation methods of reliability), one reliability index is determined from each procedure. These values are studied and compared.
Keywords: buckling, second-order theory, reliability index, steel columns
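As a sketch of the simulation branch of the reliability analysis, the following Python Monte Carlo estimates a reliability index for a single pin-ended Euler column; the section properties and coefficients of variation are illustrative assumptions, not the paper's frame data:

```python
import math
import random
from statistics import NormalDist

def mc_buckling_beta(n=200_000, seed=0):
    """Monte Carlo reliability of a pin-ended column against Euler buckling.

    Limit state: g = pi^2 * E * I / L^2 - P.  E (modulus) and P (axial load)
    are treated as normal random variables; the means and coefficients of
    variation below are illustrative assumptions, not the paper's data.
    """
    random.seed(seed)
    I, L = 2.0e-5, 3.0                 # second moment of area (m^4), length (m)
    mu_E, cov_E = 2.1e11, 0.05         # modulus of elasticity (Pa)
    mu_P, cov_P = 3.0e6, 0.15          # applied axial load (N)
    fails = 0
    for _ in range(n):
        E = random.gauss(mu_E, cov_E * mu_E)
        P = random.gauss(mu_P, cov_P * mu_P)
        if math.pi ** 2 * E * I / L ** 2 - P <= 0.0:  # g <= 0: buckling failure
            fails += 1
    pf = fails / n
    beta = -NormalDist().inv_cdf(pf) if 0 < pf < 1 else float("inf")
    return pf, beta

pf, beta = mc_buckling_beta()
print(f"pf ~ {pf:.2e}, reliability index beta ~ {beta:.2f}")
```

First- and second-order methods (FORM/SORM) approximate the same index analytically at the design point instead of by sampling, which is why the paper can compare one index per procedure.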
Procedia PDF Downloads 492
15191 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score
Authors: Jianfeng Hu
Abstract:
Personal authentication based on electroencephalography (EEG) signals is one of the important fields of biometric technology. More and more researchers have used EEG signals as a data source for biometrics; however, biometrics based on EEG signals also has some disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures, sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE) and spectral entropy (PE), were deployed as the feature set. In a silhouette calculation, the distance from each data point in a cluster to all other points within the same cluster and to all data points in the closest cluster is determined. Thus silhouettes provide a measure of how well a data point was classified when it was assigned to a cluster, and of the separation between clusters. This renders silhouettes potentially well suited for assessing cluster quality in personal authentication methods. In this study, the silhouette score was used to assess the cluster quality of the k-means clustering algorithm and to compare the performance on each EEG dataset. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. The results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between the electrodes for personal authentication (p<0.01); (3) there is no significant difference in authentication performance among the feature sets (except for the feature PE).
Conclusion: The combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.
Keywords: personal authentication, k-means clustering, electroencephalogram, EEG, silhouettes
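The silhouette computation itself is compact. A stdlib-only Python sketch with a naive 1-D k-means and hypothetical one-dimensional "entropy feature" values (not the study's EEG data):

```python
import random
from statistics import fmean

def kmeans(points, k, iters=50, seed=0):
    """Naive 1-D k-means; returns a cluster label for every point."""
    random.seed(seed)
    centers = random.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(p - centers[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centers[j] = fmean(members)
    return labels

def silhouette(points, labels):
    """Mean silhouette score s(i) = (b - a) / max(a, b): a is the mean distance
    to the point's own cluster, b to the closest other cluster."""
    scores = []
    for i, p in enumerate(points):
        own = [abs(p - q) for q, l in zip(points, labels) if l == labels[i]]
        a = sum(own) / (len(own) - 1) if len(own) > 1 else 0.0
        b = min(fmean([abs(p - q) for q, l in zip(points, labels) if l == c])
                for c in set(labels) if c != labels[i])
        scores.append((b - a) / max(a, b) if max(a, b) > 0 else 0.0)
    return fmean(scores)

# Two well-separated 1-D "entropy feature" clusters (hypothetical values)
data = [0.1, 0.15, 0.2, 0.9, 0.95, 1.0]
labels = kmeans(data, 2)
print(round(silhouette(data, labels), 2))  # near 1: well-separated clusters
```

Scores near 1 indicate compact, well-separated clusters; scores near 0 or negative flag points sitting between clusters, which is exactly the cluster-quality signal the study exploits for comparing feature sets.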
Procedia PDF Downloads 285
15190 Research Methodology and Mixed Methods (Qualitative and Quantitative) for Ph.D. Construction Management – Post-Disaster Reconstruction
Authors: Samuel Quashie
Abstract:
The Ph.D. construction management methodology and mixed methods are organized to guide the researcher in assembling and assessing data in the research activities. Construction management research is close to business management and social science research. It also contributes to researching the phenomenon and answering the research question, generating an integrated management system for post-disaster reconstruction in construction and related industries. Research methodology and methods drive the research to achieve its goal or goals and to contribute to, or increase, knowledge. That is, the research methodology, mixed methods, aim, objectives, and processes address the research question, facilitate its achievement, and form the foundation for conducting the study. The mixed methods use project-based case studies, interviews, observations, literature and archival document reviews, research questionnaires and surveys, and an evaluation of the integrated systems used in the construction industry and related industries to address the research work. The mixed methods (qualitative and quantitative) define the research topic and establish a more in-depth study. The research methodology is action research, which involves the collaboration of participants and service users to collect and evaluate data, studying the phenomenon and the research question(s) in order to improve the situation in post-disaster reconstruction phase management.
Keywords: methodology, Ph.D. research, post-disaster reconstruction, mixed methods qualitative and quantitative
Procedia PDF Downloads 231
15189 Recent Advances of Isolated Microspore Culture Response in Durum Wheat
Authors: Zelikha Labbani
Abstract:
Many biotechnology methods have been used in plant breeding programs; in vitro isolated microspore culture is one of them. For durum wheat, the use of this technology has long been limited by the low number of embryos produced and by the fact that most regenerated plants are albino. The objective of this paper is to show that isolated microspore culture is feasible in durum wheat thanks to the development of new methods that apply new pretreatments to the microspores before their isolation and cultivation.
Keywords: isolated microspore culture, pretreatments, in vitro embryogenesis, plant breeding program
Procedia PDF Downloads 532