Search results for: mining methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15804

15144 Parameter Estimation of Induction Motors by PSO Algorithm

Authors: A. Mohammadi, S. Asghari, M. Aien, M. Rashidinejad

Abstract:

After the emergence of alternating current networks and their growing popularity, asynchronous motors became more widespread than other kinds of industrial motors. In order to control and run these motors efficiently, an accurate estimation of the motor parameters is needed. There are different methods to obtain these parameters, such as the locked-rotor test, the no-load test, the DC test, analytical methods, and so on. The most common drawback of these methods is their inaccuracy in estimating some motor parameters. To remove this concern, a novel method for parameter estimation of induction motors using the particle swarm optimization (PSO) algorithm is proposed. In the proposed method, the transient state of the motor is used for parameter estimation. Comparison of the simulation results pertaining to the PSO algorithm with other available methods justifies the effectiveness of the proposed method.
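
As an illustration of the optimization loop described above, here is a minimal PSO sketch (not the paper's implementation): particles are candidate parameter vectors, and fitness is the mismatch between a measured transient response and the simulated one. The two-parameter toy model is an assumption for demonstration only.

```python
# Minimal PSO for parameter estimation: minimize the error between a
# measured transient response and a simulated one. Toy model, not the
# paper's induction-motor model.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
true_params = np.array([2.0, 0.5])          # illustrative "unknown" parameters

def transient(params):
    a, b = params                           # toy damped-oscillation response
    return 1 - np.exp(-a * t) * np.cos(2 * np.pi * b * t)

measured = transient(true_params) + rng.normal(0, 0.01, t.size)

def fitness(params):
    return np.sum((transient(params) - measured) ** 2)

# Standard PSO with inertia w and cognitive/social factors c1, c2
n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5
x = rng.uniform(0.1, 5.0, (n, dim))
v = np.zeros((n, dim))
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmin()]

for _ in range(100):
    r1, r2 = rng.random((2, n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()]

print("estimated:", gbest.round(3), "true:", true_params)
```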

Keywords: induction motor, motor parameter estimation, PSO algorithm, analytical method

Procedia PDF Downloads 625
15143 Comparison between the Conventional Methods and PSO Based MPPT Algorithm for Photovoltaic Systems

Authors: Ramdan B. A. Koad, Ahmed F. Zobaa

Abstract:

Since the output characteristics of a photovoltaic (PV) system depend on the ambient temperature, solar radiation, and load impedance, its maximum power point (MPP) is not constant. Under each condition, a PV module has a single point at which it produces its MPP. Therefore, a maximum power point tracking (MPPT) method is needed to keep the PV panel operating at its MPP. This paper presents a comparative study of the conventional MPPT methods used in PV systems, Perturb and Observe (P&O) and Incremental Conductance (IncCond), against a Particle Swarm Optimization (PSO) algorithm for MPPT of PV systems. To evaluate the study, the proposed PSO MPPT is implemented on a DC-DC converter and compared with the P&O and IncCond methods in terms of tracking speed, accuracy, and performance using MATLAB/Simulink. The simulation results show that the proposed algorithm is simple and superior to the P&O and IncCond methods.
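
For readers unfamiliar with the conventional baseline, here is a minimal Perturb-and-Observe loop (a sketch, not the paper's Simulink model); the PV curve below is an illustrative stand-in.

```python
# Perturb-and-Observe MPPT in miniature: perturb the operating voltage,
# keep the direction if power increased, reverse it otherwise.
import numpy as np

def pv_power(v):
    """Toy PV P-V curve with a single maximum (illustrative only)."""
    i = 5.0 * (1 - np.exp((v - 21.0) / 1.8))   # simplified diode-style I(V)
    return v * max(i, 0.0)

v, step = 12.0, 0.2        # initial operating voltage and perturbation size
p_prev = pv_power(v)

for _ in range(100):
    v += step
    p = pv_power(v)
    if p < p_prev:         # power dropped: reverse the perturbation
        step = -step
    p_prev = p

print(f"operating point ~ {v:.2f} V, {p_prev:.2f} W")
```

In steady state the algorithm oscillates around the MPP, which is why the paper compares it against PSO on tracking speed and accuracy.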

Keywords: photovoltaic systems, maximum power point tracking, perturb and observe method, incremental conductance, methods and practical swarm optimization algorithm

Procedia PDF Downloads 356
15142 Sixth-Order Two-Point Efficient Family of Super-Halley Type Methods

Authors: Ramandeep Behl, S. S. Motsa

Abstract:

The main focus of this manuscript is to provide a highly efficient two-point sixth-order family of super-Halley type methods that do not require any second-order derivative evaluation for numerically obtaining simple roots of nonlinear equations. Each member of the proposed family requires two evaluations of the given function and two evaluations of the first-order derivative per iteration. Using Mathematica 9 with its high-precision compatibility, a variety of concrete numerical experiments and relevant results are extensively treated to confirm the theoretical development. From their basins of attraction, it has been observed that the proposed methods have better stability and robustness compared to the other sixth-order methods available in the literature.
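
For context, the efficiency claim can be quantified with Ostrowski's efficiency index, a standard measure not stated in the abstract: with convergence order p = 6 and m = 4 evaluations per iteration (two of f and two of f'),

```latex
E = p^{1/m} = 6^{1/4} \approx 1.565,
\qquad \text{versus } E = 2^{1/2} \approx 1.414 \text{ for Newton's method.}
```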

Keywords: basins of attraction, nonlinear equations, simple roots, super-Halley

Procedia PDF Downloads 513
15141 A Study of Agile Based Approaches to Improve Software Quality

Authors: Gurmeet Kaur

Abstract:

Agile software development methods are recognized as a popular and efficient approach to developing software systems with a short delivery period and high quality that meet customer requirements with minimal defects. In agile software development, quality chiefly means code quality, which is maintained through practices such as refactoring, test-driven development, behavior-driven development, acceptance-test-driven development, and demand-driven development. Software quality is measured in terms of metrics such as the number of defects found during development. Using these practices reduces the likelihood of defects in the delivered software and hence improves quality. This paper studies agile quality practices for software development that ensure improved software quality as well as reduced cost and higher customer satisfaction.
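
As a minimal illustration of the test-first cycle behind TDD (hypothetical function and test names, not from the paper):

```python
# Test-driven development in miniature: write the failing test first,
# then the smallest implementation that makes it pass.
import unittest

def discount_price(price: float, percent: float) -> float:
    """Hypothetical production code, written only after the tests below."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be in [0, 100]")
    return round(price * (1 - percent / 100), 2)

class TestDiscountPrice(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(discount_price(200.0, 10), 180.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```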

Keywords: ATDD, BDD, DDD, TDD

Procedia PDF Downloads 161
15140 Consideration of Uncertainty in Engineering

Authors: A. Mohammadi, M. Moghimi, S. Mohammadi

Abstract:

Engineers need computational methods that provide solutions less sensitive to environmental effects, so techniques that take uncertainty into account should be used to control and minimize the risk associated with design and operation. To consider uncertainty in an engineering problem, the optimization problem should be solved for a suitable range of each uncertain input variable instead of just one estimated point. With a deterministic optimization formulation, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, the different methods are presented and analyzed.
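
A minimal sketch of the Monte Carlo approach mentioned in the keywords, assuming an illustrative two-variable design objective: sample the uncertain inputs and inspect the distribution of the objective rather than a single point estimate.

```python
# Monte Carlo treatment of uncertain inputs: instead of evaluating a design
# at one point estimate, sample each uncertain variable and inspect the
# distribution of the objective. Objective and distributions are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def cost(load, temperature):
    """Hypothetical design objective depending on two uncertain inputs."""
    return 0.5 * load**2 + 0.1 * (temperature - 25.0)**2

# Uncertain inputs modelled as distributions rather than fixed values
load = rng.normal(loc=10.0, scale=1.5, size=10_000)          # kN
temperature = rng.uniform(low=15.0, high=40.0, size=10_000)  # deg C

samples = cost(load, temperature)
print(f"mean cost      : {samples.mean():.2f}")
print(f"95th percentile: {np.percentile(samples, 95):.2f}")  # risk measure
```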

Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method

Procedia PDF Downloads 408
15139 Forensic Methods Used for the Verification of the Authenticity of Prints

Authors: Olivia Rybak-Karkosz

Abstract:

This paper presents the results of scientific research on methods of forging art prints and their elements, such as signatures or provenance, and on the forensic science methods that might be used to verify their authenticity. In recent decades, the art market has seen significant interest in purchasing prints. They are considered an economical alternative to paintings and a considerable investment. However, the authenticity of an art print is difficult to establish, as similar visual effects can be achieved with drawings or photocopies, the latter easy to produce with a home printer. Such items are then offered at flea markets or internet auctions as genuine prints. This relative ease of forgery and, at the same time, the difficulty of distinguishing printmaking techniques were the main reasons this research was undertaken. The lack of scientific methods dedicated to disclosing such forgeries encouraged the author to examine the applicability of forensic methods known and used in other fields of expertise. The research methodology consisted of compiling representative forgery samples collected in selected museums in Poland and a few in Germany and Austria. This allowed the author to present a typology of methods used to forge art prints. Given that banknotes and securities are among the best-known examples of printed graphics, it seems appropriate to apply methods of counterfeit-currency detection to print verification. These methods include the examination of ink, paper, and watermarks. On prints, signatures and stamp imprints, among other elements, are forged as well, so the examination should be complemented with handwriting analysis and forensic sphragistics. The paper concludes with a recommendation to conduct a comprehensive authenticity analysis with the participation of an art restorer, an art historian, and a forensic expert as head of the team.

Keywords: art forgery, examination of an artwork, handwriting analysis, prints

Procedia PDF Downloads 121
15138 A Deep Learning Approach to Subsection Identification in Electronic Health Records

Authors: Nitin Shravan, Sudarsun Santhiappan, B. Sivaselvan

Abstract:

Subsection identification, in the context of Electronic Health Records (EHRs), is the task of identifying the sections that matter for downstream tasks like auto-coding. In this work, we classify the text present in EHRs according to its content, using machine learning and deep learning techniques. We first describe the problem briefly and formulate it as a text classification problem, then discuss methods from the literature. We try two approaches: traditional feature-extraction-based machine learning methods and deep learning methods. Through experiments on a private dataset, we establish that the deep learning methods perform better than the feature-extraction-based machine learning models.
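
A minimal sketch of the feature-extraction baseline that such studies compare against (TF-IDF plus logistic regression); the tiny dataset and section labels are illustrative, not the private EHR data.

```python
# A minimal feature-extraction baseline for section classification:
# TF-IDF features + logistic regression. Illustrative data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Patient reports chest pain radiating to the left arm.",
    "Prescribed metformin 500 mg twice daily.",
    "Family history of type 2 diabetes.",
    "Continue lisinopril, follow up in two weeks.",
]
sections = ["complaint", "medication", "history", "medication"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, sections)

print(clf.predict(["Start atorvastatin 20 mg at bedtime."]))  # -> medication
```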

Keywords: deep learning, machine learning, semantic clinical classification, subsection identification, text classification

Procedia PDF Downloads 206
15137 Formulation of Corrector Methods from 3-Step Hybrid Adams Type Methods for the Solution of First Order Ordinary Differential Equations

Authors: Y. A. Yahaya, Ahmad Tijjani Asabe

Abstract:

This paper focuses on the formulation of a 3-step hybrid Adams-type method for the solution of first-order ordinary differential equations (ODEs). The methods were derived on both grid and off-grid points using multistep collocation schemes and evaluated at selected points to produce a block Adams-type method and an Adams-Moulton method, respectively. The method with the highest order was selected to serve as the corrector. The methods were shown to be convergent and efficient. Numerical experiments were carried out and reveal that the hybrid Adams-type methods perform better than the conventional Adams-Moulton method.
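
For reference, a classical (non-hybrid) Adams predictor-corrector pair in PECE mode is sketched below; the paper's hybrid schemes extend this idea with off-grid collocation points. The formulas are the standard 3-step Adams-Bashforth and Adams-Moulton ones.

```python
# Classical Adams predictor-corrector (PECE) for y' = f(t, y).
import numpy as np

def f(t, y):                      # test problem y' = -y, y(0) = 1
    return -y

h, n_steps = 0.1, 50
t = np.linspace(0.0, h * n_steps, n_steps + 1)
y = np.zeros(n_steps + 1)
y[0] = 1.0

# Start-up values from RK4 (a multistep method needs back-values)
for n in range(2):
    k1 = f(t[n], y[n]); k2 = f(t[n] + h/2, y[n] + h*k1/2)
    k3 = f(t[n] + h/2, y[n] + h*k2/2); k4 = f(t[n] + h, y[n] + h*k3)
    y[n+1] = y[n] + h*(k1 + 2*k2 + 2*k3 + k4)/6

for n in range(2, n_steps):
    fn, fn1, fn2 = f(t[n], y[n]), f(t[n-1], y[n-1]), f(t[n-2], y[n-2])
    # Predict: 3-step Adams-Bashforth
    yp = y[n] + h*(23*fn - 16*fn1 + 5*fn2)/12
    # Correct: 3-step Adams-Moulton (the higher-order member corrects)
    y[n+1] = y[n] + h*(9*f(t[n+1], yp) + 19*fn - 5*fn1 + fn2)/24

print(abs(y[-1] - np.exp(-t[-1])))  # global error vs exact solution
```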

Keywords: adams-moulton type (AMT), corrector method, off-grid, block method, convergence analysis

Procedia PDF Downloads 619
15136 Stating Best Commercialization Method: An Unanswered Question from Scholars and Practitioners

Authors: Saheed A. Gbadegeshin

Abstract:

A commercialization method is a means of making inventions available on the market for final consumption. It is an important tool for keeping business enterprises sustainable and for improving national economic growth. Thus, there are several scholarly publications on it, either presenting or testing different methods for commercialization. However, young entrepreneurs, technologists, and scientists would like to know the best method to commercialize their innovations, so this question arises: what is the best commercialization method? To answer the question, a systematic literature review was conducted, and practitioners were interviewed. The literature results revealed that there are many methods, but new methods are needed to improve commercialization, especially in these times of economic crisis and political uncertainty. Similarly, the empirical results showed that there are several methods, but the best method is the one that reduces costs, reduces the risks associated with uncertainty, and improves customer participation and acceptability. Therefore, it was concluded that a new commercialization method is essential for today's high technologies, and such a method is presented.

Keywords: commercialization method, technology, knowledge, intellectual property, innovation, invention

Procedia PDF Downloads 335
15135 Assessment of the Implementation of Recommended Teaching and Evaluation Methods of NCE Arabic Language Curriculum in Colleges of Education in North Western Nigeria

Authors: Hamzat Shittu Atunnise

Abstract:

This study on the assessment of the implementation of the recommended teaching and evaluation methods of the Nigeria Certificate in Education (NCE) Arabic language curriculum in Colleges of Education in North Western Nigeria was conducted with four objectives, four research questions, and four null hypotheses. A descriptive survey design was used, and a multistage sampling procedure was adopted. Frequency counts and percentages were used to answer the research questions, and chi-square was used to test the null hypotheses at an alpha level of 0.05. Two hundred and ninety-one subjects were drawn as the sample, and questionnaires were used for data collection. The Context, Input, Process and Product (CIPP) model of evaluation was employed. The findings indicated no significant difference between the perceptions of lecturers and students from federal and state Colleges of Education on the extent to which lecturers employ appropriate methods in teaching the language and the extent to which the recommended evaluation methods are utilized in implementing the Arabic curriculum. Based on these findings, it was recommended, among other things, that lecturers adopt teaching methodologies that promote interactive learning; that governments ensure information and communication technology facilities are made available and usable in all Colleges of Education; that lecturers vary their evaluation methods, since other methods of evaluation can meet and surpass the level of learning and understanding that essay-type questions are believed to create; and that language labs be used in teaching Arabic in Colleges of Education, because comprehensive language learning is possible through both classroom and language lab teaching.

Keywords: assessment, Arabic language, curriculum, methods of teaching, evaluation methods, NCE

Procedia PDF Downloads 51
15134 Comparing the Experimental Thermal Conductivity Results Using Transient Methods

Authors: Sofia Mylona, Dale Hume

Abstract:

The main scope of this work is to compare experimental thermal conductivity results for fluids obtained by devices using transient techniques. A range of liquids spanning a range of viscosities was measured with two or more devices, and the results were compared between the different methods and against reference equations wherever available. The liquids selected, covering a variety of thermal conductivities, viscosities, and densities, are those most commonly used in academic or industrial laboratories to calibrate thermal conductivity instruments. Three transient methods (transient hot wire, transient plane source, and transient line source) were compared on the thermal conductivity measurements taken with them. These methods were chosen as the most accurate and because they all follow the same idea: the thermal conductivity is calculated from the slope of the sensor temperature rise plotted as a function of the logarithm of time. All measurements were taken at atmospheric pressure over the temperature range 10 to 40 °C. Our results agree with the objections several scientists have raised over the reliability of a few popular devices. Surprisingly, a device used in many laboratories for fast measurements of liquid thermal conductivity displayed deviations of up to 500 percent that were very poorly reproducible.
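
A minimal sketch of the shared idea behind the three methods, assuming the ideal line-source model dT = (q / 4·pi·lambda)·ln(t) + C, so that the conductivity follows from the slope of temperature rise versus ln(t); the data below are synthetic.

```python
# Transient hot-wire idea in miniature: fit the temperature rise against
# ln(t) and recover lambda from the slope. q and lambda are illustrative.
import numpy as np

q = 1.2                     # heat input per unit wire length, W/m
lam_true = 0.60             # "true" conductivity (about water), W/(m K)

t = np.linspace(0.05, 1.0, 200)                          # s
dT = q / (4 * np.pi * lam_true) * np.log(t) + 0.8        # ideal response
dT += np.random.default_rng(0).normal(0, 0.002, t.size)  # sensor noise

slope, _ = np.polyfit(np.log(t), dT, 1)   # fit dT against ln(t)
lam_est = q / (4 * np.pi * slope)
print(f"estimated lambda = {lam_est:.3f} W/(m K)")  # ~0.600
```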

Keywords: accurate data, liquids, thermal conductivity, transient methods

Procedia PDF Downloads 149
15133 Selection of the Most Suitable Method for DNA Extraction from Muscle of Iran's Canned Tuna by Comparison of Different DNA Extraction Methods

Authors: Marjan Heidarzadeh

Abstract:

High quality and purity of DNA isolated from canned tuna is essential for species identification. In this study, the efficiency of five different DNA extraction methods was compared: the Iranian national standard method, the CTAB precipitation method, the Wizard DNA Clean-Up system, NucleoSpin, and GenomicPrep. DNA was extracted from two different canned products, in brine and in oil, of the same tuna species. Three samples of each type of product were analyzed with the different methods. The quantity and quality of the extracted DNA were evaluated using the absorbance at 260 nm and the A260/A280 ratio measured with a Picodrop spectrophotometer. Results showed that DNA extraction from canned tuna preserved in different liquid media can be optimized by employing a specific extraction method in each case. The best results were obtained with the CTAB method for canned tuna in oil and with the Wizard method for canned tuna in brine.

Keywords: canned tuna, PCR, DNA, DNA extraction methods, species identification

Procedia PDF Downloads 651
15132 Media Literacy: Information and Communication Technology Impact on Teaching and Learning Methods in Albanian Education System

Authors: Loreta Axhami

Abstract:

Media literacy in the digital age emerges not only as a set of skills for generating true knowledge and information but also as a pedagogical methodology, a kind of educational philosophy. Alongside innovations such as the integration of information and communication technologies, media infrastructures, and web usage in the educational system, media literacy enables change in learning methods, pedagogy, teaching programs, and the school curriculum itself. In this framework, this study focuses on the impact of ICT on teaching and learning methods and the degree to which it is reflected in the Albanian education system. The study is based on a combination of quantitative and qualitative methods of scientific research. The findings show that students' limited access to the internet at school, the focus on hard-copy textbooks, and the role of the teacher as the only or main source of knowledge and information are among the main factors contributing to authoritarian pedagogical methods in the Albanian education system. In these circumstances, the implementation of media literacy is recommended as an apt educational process for the 21st century, one which requires a reconceptualization of textbooks as well as the application of modern teaching and learning methods by integrating information and communication technologies.

Keywords: authoritarian pedagogic model, education system, ICT, media literacy

Procedia PDF Downloads 132
15131 Manganese Contamination Exacerbates Reproductive Stress in a Suicidally-Breeding Marsupial

Authors: Ami Fadhillah Amir Abdul Nasir, Amanda C. Niehaus, Skye F. Cameron, Frank A. Von Hippel, John Postlethwait, Robbie S. Wilson

Abstract:

For suicidal breeders, the physiological stresses and energetic costs of breeding are fatal. Environmental stressors such as pollution should compound these costs, yet suicidal breeding is so rare among mammals that this is unknown. Here, we explored the consequences of metal contamination to the health, aging and performance of endangered, suicidally-breeding northern quolls (Dasyurus hallucatus) living near an active manganese mine on Groote Eylandt, Northern Territory, Australia. We found respirable manganese dust at levels exceeding international recommendations even 20 km from mining sites and substantial accumulation of manganese within quolls' hair, testes, and in two brain regions—the neocortex and cerebellum, responsible for sensory perception and motor function, respectively. Though quolls did not differ in sprint speeds, motor skill, or manoeuvrability, those with higher accumulation of manganese crashed at lower speeds during manoeuvrability tests, indicating a potential effect on sight or cognition. Immune function and telomere length declined over the breeding season, as expected with aging, but manganese contamination exacerbated immune declines and suppressed cortisol. Unexpectedly, male quolls with higher levels of manganese had longer telomeres, supporting evidence of unusual telomere dynamics among dasyurids—though whether this affects their lifespan is unknown. We posit that sublethal contamination via pollution, mining, or urbanisation imposes physiological costs on wildlife that may diminish reproductive success or survival.

Keywords: ecotoxicology, heavy metal, manganese, telomere length, cortisol, locomotor

Procedia PDF Downloads 305
15130 Implementation of the Association Rule Method in Determining the Layout of Qita Supermarket as a Strategy in the Competitive Retail Industry in Indonesia

Authors: Dwipa Rizki Utama, Hanief Ibrahim

Abstract:

The retail industry in Indonesia is developing very fast, and various strategies have been undertaken to boost customer satisfaction and purchase productivity, and thereby profit; one of these is the layout strategy. The purpose of this study is to determine the layout of Qita supermarket, a retail store in Indonesia, in order to improve customer satisfaction and to maximize overall product sales, so that even infrequently purchased products get bought. This research uses a literature study and one of the data mining methods, association rules, as applied in market basket analysis. After pre-processing, 100 of 160 records were retained for testing, from which the department distribution and the 26 departments corresponding to the previous layout were obtained. From these data, the association rule method reveals which items customers purchase at the same time, so that a supermarket layout based on customer behavior can be determined. Using the RapidMiner software with a minimum support of 25% and a minimum confidence of 30% showed that department 14 is purchased at the same time as department 10, department 21 with department 13, department 15 with department 12, department 14 with department 12, and department 10 with department 14. From these results, a better supermarket layout than the previous one can be arranged.
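
A minimal sketch of the support/confidence computation behind such rules (illustrative transactions, not the paper's dataset):

```python
# Market-basket association rules in miniature: compute support and
# confidence for department pairs from transaction data.
from itertools import combinations
from collections import Counter

transactions = [
    {10, 14}, {10, 14, 21}, {13, 21}, {12, 15}, {12, 14},
    {10, 14}, {12, 15, 14}, {13, 21}, {10, 14, 12}, {12, 15},
]
n = len(transactions)

item_count = Counter(i for t in transactions for i in t)
pair_count = Counter(p for t in transactions for p in combinations(sorted(t), 2))

MIN_SUPPORT, MIN_CONFIDENCE = 0.25, 0.30   # thresholds used in the paper
for (a, b), c in pair_count.items():
    support = c / n
    if support < MIN_SUPPORT:
        continue
    for ante, cons in ((a, b), (b, a)):
        confidence = c / item_count[ante]
        if confidence >= MIN_CONFIDENCE:
            print(f"dept {ante} -> dept {cons}: "
                  f"support={support:.2f}, confidence={confidence:.2f}")
```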

Keywords: industry retail, strategy, association rule, supermarket

Procedia PDF Downloads 180
15129 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) is used for the classification of a diabetes disease dataset. The aims of this study are to reduce the variance within the attributes of the diabetes dataset and to improve the classification accuracy of classifier algorithms by transforming non-linearly separable datasets into linearly separable ones. The Pima Indians Diabetes dataset has two classes: normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and one of the most widely used clustering methods in data mining and machine learning applications. In this study, as the first stage, the fuzzy C-means clustering process is used to find the centers of the attributes in the Pima Indians diabetes dataset, and the dataset is then weighted according to the ratios of the attribute means to their centers. Secondly, after the weighting process, classifier algorithms including the support vector machine (SVM) and k-nearest neighbor (k-NN) are used to classify the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians diabetes dataset.
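
A sketch of one plausible reading of the FCMAW weighting stage (not the authors' exact procedure): run a small fuzzy C-means per attribute and scale the attribute by the ratio of its mean to the nearest cluster center.

```python
# FCM-based attribute weighting, one plausible reading (a sketch).
import numpy as np

def fcm_centers(x, c=2, m=2.0, iters=100, seed=0):
    """Minimal 1-D fuzzy C-means; returns the c cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships, columns sum to 1
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)   # membership-weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))      # standard FCM membership update
        u /= u.sum(axis=0)
    return centers

rng = np.random.default_rng(1)
X = rng.normal(loc=[4.0, 120.0], scale=[1.0, 30.0], size=(200, 2))  # toy data

Xw = X.copy()
for j in range(X.shape[1]):
    centers = fcm_centers(X[:, j])
    nearest = centers[np.argmin(np.abs(centers - X[:, j].mean()))]
    Xw[:, j] *= X[:, j].mean() / nearest    # ratio of attribute mean to center

print(Xw.mean(axis=0))   # weighted data would then go to SVM / k-NN
```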

Keywords: fuzzy C-means clustering, fuzzy C-means clustering based attribute weighting, Pima Indians diabetes, SVM

Procedia PDF Downloads 405
15128 Mining Scientific Literature to Discover Potential Research Data Sources: An Exploratory Study in the Field of Haemato-Oncology

Authors: A. Anastasiou, K. S. Tingay

Abstract:

Background: Discovering suitable datasets is an important part of health research, particularly for projects working with clinical data from patients organized in cohorts (cohort data), but with the proliferation of so many national and international initiatives, it is becoming increasingly difficult for research teams to locate real world datasets that are most relevant to their project objectives. We present a method for identifying healthcare institutes in the European Union (EU) which may hold haemato-oncology (HO) data. A key enabler of this research was the bibInsight platform, a scientometric data management and analysis system developed by the authors at Swansea University. Method: A PubMed search was conducted using HO clinical terms taken from previous work. The resulting XML file was processed using the bibInsight platform, linking affiliations to the Global Research Identifier Database (GRID). GRID is an international, standardized list of institutions, including the city and country in which the institution exists, as well as a category of the main business type, e.g., Academic, Healthcare, Government, Company. Countries were limited to the 28 current EU members, and institute type to 'Healthcare'. An article was considered valid if at least one author was affiliated with an EU-based healthcare institute. Results: The PubMed search produced 21,310 articles, consisting of 9,885 distinct affiliations with correspondence in GRID. Of these articles, 760 were from EU countries, and 390 of these were healthcare institutes. One affiliation was excluded as being a veterinary hospital. Two EU countries did not have any publications in our analysis dataset. The results were analysed by country and by individual healthcare institute. Networks both within the EU and internationally show institutional collaborations, which may suggest a willingness to share data for research purposes. Geographical mapping can ensure that data has broad population coverage. Collaborations with industry or government may exclude healthcare institutes that may have embargos or additional costs associated with data access. Conclusions: Data reuse is becoming increasingly important both for ensuring the validity of results, and economy of available resources. The ability to identify potential, specific data sources from over twenty thousand articles in less than an hour could assist in improving knowledge of, and access to, data sources. As our method has not yet specified if these healthcare institutes are holding data, or merely publishing on that topic, future work will involve text mining of data-specific concordant terms to identify numbers of participants, demographics, study methodologies, and sub-topics of interest.

Keywords: data reuse, data discovery, data linkage, journal articles, text mining

Procedia PDF Downloads 111
15127 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques

Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo

Abstract:

Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America due to the fast economic growth of the last ten years. Bogotá has been affected by high pollution events that led to high concentrations of PM10 and NO2, exceeding the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5 (which are associated with respiratory and cardiovascular problems), and it is known that their concentrations in the atmosphere depend on local meteorological factors. Therefore, it is necessary to establish a relationship between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2, and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network within the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain primary relations between all the parameters, and afterwards, the K-means clustering technique was implemented to corroborate those relations and to find patterns in the data. PCA was also applied on a per-shift basis (morning, afternoon, night, and early morning) to check for variation of the previous trends and on a per-year basis to verify that the identified trends remained throughout the study period. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the factors most influencing PM10 concentrations. Furthermore, it was confirmed that high-humidity episodes increased PM2.5 levels. It was also found that there are directly proportional relationships between O3 levels and wind speed and radiation, while there is an inverse relationship between O3 levels and humidity. Concentrations of SO2 increase with the presence of PM10 and decrease with wind speed and wind direction. The results also showed a decreasing trend in pollutant concentrations over the last five years. In rainy periods (March-June and September-December), some precipitation-related trends were stronger. Results obtained with K-means demonstrated that it was possible to find patterns in the data, showing similar conditions and data distributions among the Carvajal, Tunal, and Puente Aranda stations, and also between Parque Simon Bolivar and Las Ferias. Applying the same technique per year verified that these trends prevailed during the study period. It was concluded that the PCA algorithm is useful for establishing preliminary relationships among variables, and K-means clustering for finding patterns in the data and understanding its distribution. The discovery of patterns in the data allows using these clusters as input to an Artificial Neural Network prediction model.
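
A minimal sketch of the PCA-then-K-means pipeline used in the study, with synthetic data standing in for the monitoring-network measurements:

```python
# PCA for preliminary variable relationships, K-means for pattern grouping.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
# columns: wind speed, temperature, humidity, NO2, PM10  (illustrative)
X = rng.normal(size=(500, 5))
X[:, 4] = 0.6 * X[:, 3] - 0.5 * X[:, 0] + 0.2 * rng.normal(size=500)  # PM10

Xs = StandardScaler().fit_transform(X)

pca = PCA(n_components=2).fit(Xs)
print("explained variance:", pca.explained_variance_ratio_.round(2))
print("PC1 loadings:", pca.components_[0].round(2))   # which variables co-vary

scores = pca.transform(Xs)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))
```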

Keywords: air pollution, air quality modelling, data mining, particulate matter

Procedia PDF Downloads 252
15126 Subjective Evaluation of Mathematical Morphology Edge Detection on Computed Tomography (CT) Images

Authors: Emhimed Saffor

Abstract:

In this paper, the problem of edge detection in digital images is considered. Three edge detection methods based on mathematical morphology were applied to two sets of CT images (brain and chest): a 3x3 structuring element for the first method, 5x5 for the second, and 7x7 for the third, all under the MATLAB programming environment. The results of these methods are subjectively evaluated. The results show that the methods are efficient and suitable for medical images and can be used for various other applications.
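
A minimal sketch of morphological edge detection with the three kernel sizes mentioned above, written here with OpenCV rather than MATLAB; the input filename is hypothetical, and the morphological gradient (dilation minus erosion) is a standard morphology-based edge detector, though the paper does not specify which operator it uses.

```python
# Morphological edge detection sketch: the morphological gradient
# highlights edges; larger structuring elements give thicker edges.
import cv2

img = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

for k in (3, 5, 7):                          # the 3x3, 5x5, 7x7 methods
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (k, k))
    # gradient = dilate(img) - erode(img)
    edges = cv2.morphologyEx(img, cv2.MORPH_GRADIENT, kernel)
    cv2.imwrite(f"edges_{k}x{k}.png", edges)
```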

Keywords: CT images, Matlab, medical images, edge detection

Procedia PDF Downloads 324
15125 The Challenges for Engineers to Change the Construction Method in Brazil

Authors: Yuri B. Cesarino, Vinícius R. Domingues, Darym J. F. Campos

Abstract:

Developing countries face certain restraints in adopting new technologies and construction methods. Some nations, such as Brazil, still use conventional construction methodologies despite their lower cost-effectiveness. This research was conducted to demonstrate how industrialized construction methods could be implemented in Brazil, especially in times of need. Drawing on the consensus among authors with different perspectives, it is clear that the industrialized method is more suitable for construction development because of its great advantages. However, this process is unlikely to be adopted in the country as a result of several socio-economic restraints. Nonetheless, Brazilian engineers have a major challenge ahead of them, and it will take more than creativity to solve such an issue.

Keywords: Brazilian engineers, construction methods, industrialized construction, infrastructure

Procedia PDF Downloads 274
15124 Coping Orientation of Academic Community in the Time of COVID-19 Pandemic: A Pilot Survey Study

Authors: Fereshteh Ahmadi, Önver Cetrez, Said Zandi, Sharareh Akhavan

Abstract:

In this paper, we have mapped the coping methods used by members of the academic community to address the coronavirus pandemic. We conducted an anonymous survey of a convenience sample of 674 faculty/staff members and students from September to December 2020. A modified version of the RCOPE scale was used for data collection. The results indicate that both religious and existential coping methods were used by respondents. The study also indicates that even though 71% of informants believed in God or another religious figure, 61% reported that they had tried to gain control of the situation directly, without the help of God or another religious figure. The ranking of the coping strategies indicates that the first five methods used by informants were all non-religious (i.e., secular existential) coping methods: regarding life as part of a greater whole, regarding nature as an important resource, listening to the sound of the surrounding nature, being alone and contemplating, and walking or engaging in outdoor activities that give a spiritual feeling. Our results contribute to the new area of research on the academic community's coping with pandemic-related stress and challenges.

Keywords: academic staff, academics, coping strategies, coronavirus epidemic, higher education

Procedia PDF Downloads 78
15123 Multivariate Analysis on Water Quality Attributes Using Master-Slave Neural Network Model

Authors: A. Clementking, C. Jothi Venkateswaran

Abstract:

Mathematical and computational functionalities such as descriptive mining, optimization, and prediction are employed in natural resource planning. Optimization techniques are adopted for water quality prediction and for determining the influence of its attributes. Water properties are tainted when one water resource is merged with another. This work aimed to predict water resource distribution connectivity according to water quality and sediment using a proposed master-slave back-propagation neural network model. The experimental results were arrived at by collecting water quality attributes, computing a water quality index, designing and developing a neural network model to determine water quality and sediment, applying the master-slave back-propagation neural network model to determine variations in water quality and sediment attributes between the water resources, and recommending connectivity. Homogeneous and parallel biochemical reactions influence water quality and sediment while water is distributed from one location to another. Therefore, a master-slave neural network model [M(9:9:2)::S(9:9:2)] was designed and developed to predict the attribute variations. The result of the training dataset is given as input to the master model, and its maximum weights are assigned as input to the slave model to predict the water quality. The developed master-slave model predicted physicochemical attribute weight variations for 85% to 90% of the water quality target values. Sediment level variations were also predicted at 0.01 to 0.05% of each water quality percentage. The model produced significant variations in physicochemical attribute weights. According to the predicted weight variations on the training dataset, effective recommendations are made for connecting different resources.

Keywords: master-slave back-propagation neural network model (MSBPNNM), water quality analysis, multivariate analysis, environmental mining

Procedia PDF Downloads 469
15122 A Qualitative Study into the Success and Challenges in Embedding Evidence-Based Research Methods in Operational Policing Interventions

Authors: Ahmed Kadry, Gwyn Dodd

Abstract:

There has been a growing call globally for police forces to embed evidence-based policing research methods into police interventions in order to better understand and evaluate their impact. This research study highlights the successes and challenges that police forces may encounter when trying to embed evidence-based research methods within their organisation. Ten in-depth qualitative interviews were conducted with police officers and staff at Greater Manchester Police (GMP) who were tasked with integrating evidence-based research methods into their operational interventions. The findings of the study indicate that, with adequate resources and individual expertise, evidence-based research methods can be applied to operational work, including the testing of initiatives with strict controls in order to fully evaluate the impact of an intervention. However, the findings also indicate that this may only be possible where an operational intervention is heavily resourced with police officers and staff who have a strong understanding of evidence-based policing research methods, attained for example through their own graduate studies. In addition, the findings reveal that ample planning time was needed to trial operational interventions, which require strict parameters for what is tested and how it is evaluated. In contrast, interviewees underscored that operational interventions needing speedy implementation were less likely to have evidence-based research methods applied. The study contributes to the wider literature on evidence-based policing by providing considerations for police forces globally wishing to apply evidence-based research methods to more of their operational work in order to understand its impact. The study also provides considerations for academics who work closely with police forces in assisting them to embed evidence-based policing. This includes how academics can lend their expertise to police decision-makers wanting to underpin their work with evidence-based research methods, for example by providing guidance on how to evaluate the impact of their work with research methods they may otherwise be unaware of.

Keywords: evidence based policing, evidence-based practice, operational policing, organisational change

Procedia PDF Downloads 129
15121 Evaluation Methods for Question Decomposition Formalism

Authors: Aviv Yaniv, Ron Ben Arosh, Nadav Gasner, Michael Konviser, Arbel Yaniv

Abstract:

This paper introduces two methods for the evaluation of Question Decomposition Meaning Representation (QDMR) as predicted by a sequence-to-sequence model and the COPYNET parser for natural language question processing, motivated by the fact that previous evaluation metrics for this task do not take into account some characteristics of the representation, such as its partial ordering structure. To this end, several heuristics to extract such partial dependencies are formulated, followed by the proposed evaluation methods, denoted Proportional Graph Matcher (PGM) and Conversion to Normal String Representation (Nor-Str), designed to better capture the accuracy of QDMR predictions. Experiments demonstrate the efficacy of the proposed evaluation methods and show the added value of one of them, Nor-Str, for better distinguishing between high- and low-quality QDMR predicted by models such as COPYNET. This work represents an important step forward in the development of better evaluation methods for QDMR predictions, which will be critical for improving the accuracy and reliability of natural language question-answering systems.

Keywords: NLP, question answering, question decomposition meaning representation, QDMR evaluation metrics

Procedia PDF Downloads 70
15120 Presenting a Knowledge Mapping Model According to a Comparative Study on Applied Models and Approaches to Map Organizational Knowledge

Authors: Ahmad Aslizadeh, Farid Ghaderi

Abstract:

Mapping organizational knowledge is an innovative concept and a useful instrument for representing, capturing, and visualizing implicit and explicit knowledge. A diversity of methods, instruments, and techniques has been presented by different researchers for mapping organizational knowledge toward specific goals. To apply these methods, it is necessary to know their requirements and the conditions in which they can be used. Integrating the identified methods of knowledge mapping and comparing them would help knowledge managers select the appropriate method. This research was conducted to present a model and framework for mapping organizational knowledge. First, knowledge maps, their applications, and their necessity are introduced in order to extract a comparative framework and detect their structure. Next, the techniques of researchers such as Eppler, Kim, Egbu, Tandukar, and Ebner are presented and surveyed as knowledge mapping models. Finally, they are compared and a superior model is introduced.

Keywords: knowledge mapping, knowledge management, comparative study, business and management

Procedia PDF Downloads 396
15119 Optimal Relaxation Parameters for Obtaining Efficient Iterative Methods for the Solution of Electromagnetic Scattering Problems

Authors: Nadaniela Egidi, Pierluigi Maponi

Abstract:

The approximate solution of a time-harmonic electromagnetic scattering problem for inhomogeneous media is required in several application contexts, and its two-dimensional formulation is a Fredholm integral equation of the second kind. This integral equation provides a formulation for the direct scattering problem, but it has to be solved several times also in the numerical solution of the corresponding inverse scattering problem. The discretization of this Fredholm equation produces large and dense linear systems that are usually solved by iterative methods. In order to improve the efficiency of these iterative methods, we use the Symmetric SOR preconditioning, and we propose an algorithm for the evaluation of the associated relaxation parameter. We show the efficiency of the proposed algorithm by several numerical experiments, where we use two Krylov subspace methods, i.e., Bi-CGSTAB and GMRES.
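
A minimal sketch of an SSOR-preconditioned Krylov solve with SciPy, using GMRES here (Bi-CGSTAB could be swapped in the same way); a sparse Poisson-type matrix stands in for the paper's dense discretized Fredholm operator, and the relaxation parameter is fixed rather than computed by the proposed algorithm.

```python
# SSOR-preconditioned GMRES: apply M^{-1} via two triangular solves, with
# M = (D + wL) D^{-1} (D + wU) / (w(2 - w)) for A = D + L + U.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres, spsolve_triangular

n = 400
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")  # test matrix
b = np.ones(n)

omega = 1.2                                  # fixed relaxation parameter
D = sp.diags(A.diagonal())
Lw = (sp.tril(A, k=-1) * omega + D).tocsr()  # D + omega*L
Uw = (sp.triu(A, k=1) * omega + D).tocsr()   # D + omega*U

def ssor_solve(v):
    y = spsolve_triangular(Lw, v, lower=True)           # forward sweep
    z = spsolve_triangular(Uw, D @ y, lower=False)      # backward sweep
    return omega * (2 - omega) * z

M = LinearOperator((n, n), matvec=ssor_solve)
x, info = gmres(A, b, M=M)
print("info:", info, "residual:", np.linalg.norm(A @ x - b))
```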

Keywords: Fredholm integral equation, iterative method, preconditioning, scattering problem

Procedia PDF Downloads 97
15118 Orbit Determination Modeling with Graphical Demonstration

Authors: Assem M. F. Sallam, Ah. El-S. Makled

Abstract:

This paper presents the implementation, verification, and graphical demonstration of a software application that can be used swiftly across different preliminary orbit determination methods. A passive orbit determination method is used in this study to determine the location of a satellite or a flying body. It is called passive orbit determination because it depends on observation without the use of any aids (radio or laser) installed on the satellite. The built models help in understanding how these methods work, how accurate their output is against available verification data, and which inputs each method uses. Output from the different orbit determination methods (Gibbs, Lambert, and Gauss) is compared and verified against data obtained from the Satellite Tool Kit (STK) application. A modified model including all of the orbit determination methods with the same input is introduced to investigate the different models' outputs (orbital parameters) for the same input (azimuth, elevation, and time). The simulation software is implemented in MATLAB. A Graphical User Interface (GUI) application named OrDet is produced using the GUI of MATLAB. It accepts all the available inputs and outputs the current Classical Orbital Elements (COE) of the satellite under observation. The produced COE are then propagated for a complete revolution and plotted in a 3-D view. The modified model, which uses an adapter to allow the same input parameters, passes these parameters to the preliminary orbit determination methods under study. Results from all orbit determination methods yield exactly the same COE output, which shows the equivalence of the concepts for determining a satellite's location despite the different numerical methods.
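
A minimal sketch of one of the three methods, Gibbs preliminary orbit determination, in its standard formulation (e.g., Vallado); the circular-orbit test vectors are illustrative, not the paper's test cases.

```python
# Gibbs method: given three co-planar geocentric position vectors,
# recover the velocity at the middle observation.
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def gibbs(r1, r2, r3):
    r1n, r2n, r3n = (np.linalg.norm(r) for r in (r1, r2, r3))
    N = r1n * np.cross(r2, r3) + r2n * np.cross(r3, r1) + r3n * np.cross(r1, r2)
    D = np.cross(r1, r2) + np.cross(r2, r3) + np.cross(r3, r1)
    S = r1 * (r2n - r3n) + r2 * (r3n - r1n) + r3 * (r1n - r2n)
    # velocity at the middle observation
    return np.sqrt(MU / (np.linalg.norm(N) * np.linalg.norm(D))) * (
        np.cross(D, r2) / r2n + S)

# Circular-orbit test case: three points on a 7000 km equatorial orbit
radius = 7000.0  # km
theta = np.deg2rad([0.0, 30.0, 60.0])
r1, r2, r3 = (radius * np.array([np.cos(a), np.sin(a), 0.0]) for a in theta)

v2 = gibbs(r1, r2, r3)
print("|v2| =", np.linalg.norm(v2).round(3), "km/s")  # ~7.546 = sqrt(MU/radius)
```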

Keywords: orbit determination, STK, Matlab-GUI, satellite tracking

Procedia PDF Downloads 269
15117 PM Electrical Machines Diagnostic: Methods Selected

Authors: M. Barański

Abstract:

This paper presents several diagnostic methods designed for electrical machines, especially permanent magnet (PM) machines. Such machines are commonly used in small wind and water power systems and in vehicle drives. These methods are preferred by the author for the periodic diagnostics of electrical machines. Special attention is paid to the diagnostic methods for turn-to-turn insulation and vibrations. Both methods were created at the Komel Institute of Electrical Drives and Machines. The vibration diagnostic method is the main thesis of the author's doctoral dissertation. This method of determining the technical condition of a PM electrical machine based on its own signals is the subject of patent application No. P.405669. The method exploits a specific structural property of machines excited by permanent magnets: the electromotive force (EMF) generated due to vibrations. A number of publications describing vibration diagnostic methods and tests of PM electrical machines were analysed, and no method was found that determines the technical condition of such machines based on their own signals.

Keywords: electric vehicle, generator, main insulation, permanent magnet, thermography, traction drive, turn-to-turn insulation, vibrations

Procedia PDF Downloads 392
15116 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements

Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori

Abstract:

The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, it is used in the United States (U.S.) to distribute House seats proportionally based on the population of each electoral district. Famous apportionment methods include the divisor methods called the Adams method, Dean method, Hill method, Jefferson method, and Webster method. Sometimes the results produced by these divisor methods are unfair or contain errors, so it is important to examine them with a bias measurement to obtain precise and fair results. In this research, we investigate the bias of divisor methods toward large and small states in the U.S. House of Representatives by applying the Stolarsky mean method. We compare the bias of the apportionment methods using two famous bias measurements: the Balinski and Young measurement and the Ernst measurement. Both measurements have formulas for large and small states. The third measurement, created by the researchers, does not factor the element of large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the other two famous measurements.
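
A minimal sketch of highest-averages divisor apportionment: the five classical methods differ only in their rounding signpost d(n), which the Stolarsky mean generalizes. The populations below are illustrative.

```python
# Highest-averages divisor apportionment: a state's priority for its next
# seat is population / d(current seats), where d(n) is the method's signpost.
import heapq

SIGNPOSTS = {
    "Adams":     lambda n: float(n),            # ceiling rounding
    "Dean":      lambda n: 2*n*(n+1)/(2*n+1),   # harmonic mean of n, n+1
    "Hill":      lambda n: (n*(n+1)) ** 0.5,    # geometric mean of n, n+1
    "Webster":   lambda n: n + 0.5,             # arithmetic mean of n, n+1
    "Jefferson": lambda n: float(n + 1),        # floor rounding
}

def apportion(populations, seats, method):
    d = SIGNPOSTS[method]
    alloc = {s: 0 for s in populations}
    # d(0) = 0 for Adams/Dean/Hill: those methods give every state a seat
    # first, which the tiny epsilon reproduces (priority becomes huge).
    heap = [(-pop / max(d(0), 1e-9), s) for s, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats):
        _, s = heapq.heappop(heap)
        alloc[s] += 1
        heapq.heappush(heap, (-populations[s] / d(alloc[s]), s))
    return alloc

pops = {"A": 5_310_000, "B": 2_040_000, "C": 988_000, "D": 662_000}
for m in SIGNPOSTS:
    print(f"{m:9s} {apportion(pops, 10, m)}")
```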

Keywords: apportionment, bias, divisor, fair, measurement

Procedia PDF Downloads 361
15115 Quantum Cryptography: Classical Cryptography Algorithms’ Vulnerability State as Quantum Computing Advances

Authors: Tydra Preyear, Victor Clincy

Abstract:

Quantum computing presents many computational advantages over classical computing methods due to its use of quantum mechanics. The capability of this computing infrastructure poses threats to standard cryptographic systems such as RSA and AES, which are designed for classical computing environments. This paper discusses the impact of quantum computing on cryptography, focusing on the evolution from classical cryptographic concepts to quantum and post-quantum cryptographic concepts. Standard cryptography is essential for securing data through encryption and decryption methods, and these methods face vulnerability problems due to the advancement of quantum computing. To counter these vulnerabilities, quantum cryptography and post-quantum cryptography are proposed. Quantum cryptography uses principles such as the uncertainty principle and photon polarization to provide secure data transmission, and the concept of quantum key distribution is introduced to ensure more secure communication channels by distributing cryptographic keys. Post-quantum cryptography has emerged to strengthen cryptographic algorithms against attacks by both classical and quantum computers. Throughout this exploration, the paper underlines the critical role of advancing cryptographic methods in keeping data integrity and privacy safe as quantum computing matures. Future research directions include more effective cryptographic methods enabled by advancing technology.
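
A minimal classical simulation of BB84 quantum key distribution, the photon-polarization protocol the abstract alludes to (idealized: no eavesdropper, no channel noise):

```python
# BB84 in miniature: Alice and Bob keep only the bits where their random
# basis choices coincide ("sifting"); mismatched bases yield random,
# discarded results.
import secrets

n = 32
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0: rectilinear, 1: diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# Without an eavesdropper, Bob reads Alice's bit correctly when bases match.
bob_bits = [b if ab == bb else secrets.randbelow(2)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Public basis reconciliation: keep matching-basis positions only.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]

assert key_alice == key_bob          # identical shared key in the ideal case
print("sifted key:", "".join(map(str, key_alice)))
```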

Keywords: quantum computing, quantum cryptography, cryptography, data integrity and privacy

Procedia PDF Downloads 11